Meta has unveiled a significant update to its Ray-Ban smart glasses, introducing Live AI and live translation capabilities that hint at the potential trajectory of augmented reality (AR) devices. As we close out 2024, these enhancements mark an intriguing evolution in smart glasses technology. Andrew Bosworth, Meta's CTO, shares his perspective on what lies ahead, shedding light on where the device could go next.
Testing the Live AI Experience

A Morning Walk Through Manhattan

Activating the Live AI feature on Meta's Ray-Ban glasses transforms them into a quasi-AI assistant, constantly observing and responding to your surroundings. During a morning walk in Manhattan, the glasses captured a live video feed and processed it through a paired phone. A small white LED indicator signaled that the feature was active, though most people on the street seemed to overlook it.
Engaging with the AI felt experimental. Questions about pigeons, construction, and vehicles produced mixed results, ranging from accurate answers to moments of complete silence due to connectivity issues. The feature, while innovative, occasionally stumbled—incorrectly identifying streets and objects or providing outdated recommendations. The experience oscillated between futuristic and impractical, highlighting the early-stage nature of this technology.
Turning off Live AI was also a test of patience. Commands like “Stop Live AI” were sometimes misunderstood, requiring repeated attempts. While the feature provides glimpses of a more immersive AI-driven future, its current execution feels more like a beta test than a polished product.
Translation Capabilities: Limited Yet Promising

Meta's live translation feature is another noteworthy addition, enabling real-time spoken translation. However, its utility is limited to a handful of languages: Spanish, French, Italian, and English. During a test conversation at Astor Place, the AI successfully facilitated a bilingual exchange, though it struggled with idiomatic expressions and some nuanced phrases. Despite these limitations, the feature offered enough clarity to maintain a meaningful conversation.
The Vision for Future AR Glasses

The current iteration of the Ray-Bans provides a glimpse into Meta's ambitious roadmap. Future versions could include integrated displays and gesture-recognition capabilities, as hinted by recent comments from Bosworth and Meta's ongoing development of its Orion project. These advanced glasses might incorporate heads-up displays (HUDs) for real-time information and neural input bands for intuitive controls.
Meta is also exploring gesture-based controls that could involve downward-facing cameras and illumination. While this technology could potentially be adapted to existing Ray-Ban models, Bosworth emphasizes that a seamless experience would require integrated displays. An electromyography (EMG) wristband, currently in development, could complement these features, offering a new level of interaction but raising concerns about cost, weight, and battery life.
Challenges Ahead: Battery Life and Usability

One of the major hurdles for always-on smart glasses remains battery life. Live AI currently reduces the glasses' runtime from hours to mere minutes, posing a significant challenge for future iterations. Additionally, the lack of precise input mechanisms, like the ability to point at objects, often limits the AI's effectiveness and usability.
Integration with Meta's Broader Ecosystem

Looking beyond the glasses, Meta envisions seamless integration between its AI systems in AR and VR. Bosworth suggests that the same AI tracking your steps in the real world could also assist with VR workouts, creating a unified ecosystem of smart assistance.