
Meta’s Ray-Ban Smart Glasses Get AI-Powered Real-Time Visual Recognition
In a significant step forward for wearable technology, Meta has rolled out a software update for its Ray-Ban smart glasses, enabling real-time visual recognition through the integration of artificial intelligence (AI). As part of its Early Access Program, users can now access this groundbreaking feature, which promises to revolutionize the way we interact with our surroundings.
The new “live AI” capability allows Meta’s AI assistant to integrate seamlessly into your visual field. The system continuously streams the view from the glasses’ camera into the assistant’s context, so that what the assistant tells you during an interaction is informed by what you are currently seeing. According to Meta, this enables a more natural and intuitive conversational experience than before. The company claims its AI will offer useful suggestions at the right moment, essentially anticipating and responding to your needs.
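The flow described above — a rolling camera feed grounding the assistant’s replies — can be sketched in a few lines. This is a purely illustrative model, not Meta’s actual API; the class, method names, and the frame-description strings are all assumptions.

```python
from dataclasses import dataclass, field

@dataclass
class AssistantContext:
    """Hypothetical rolling context: keeps only the most recent camera frames."""
    max_frames: int = 5
    frames: list = field(default_factory=list)

    def add_frame(self, frame_description: str) -> None:
        # Append the newest frame and drop anything older than the window.
        self.frames.append(frame_description)
        self.frames = self.frames[-self.max_frames:]

    def answer(self, question: str) -> str:
        # A real assistant would run a multimodal model over the frames;
        # here we simply ground the reply in the latest frame.
        latest = self.frames[-1] if self.frames else "nothing yet"
        return f"Based on what I can see ({latest}): answering '{question}'"

ctx = AssistantContext()
for frame in ["street corner", "cafe menu", "bookshelf"]:
    ctx.add_frame(frame)

print(ctx.answer("What am I looking at?"))
```

The key design point is the bounded window: the assistant always reasons over recent frames rather than the full session, which keeps latency and context size manageable.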
But how exactly does it work? In simple terms, the 12MP camera in the latest Meta Ray-Ban generation provides a clear view of your surroundings, allowing for high-quality image analysis. Concerns remain about how well visual recognition holds up in low-light environments, such as dimly lit apartments; it will be up to beta testers in the Early Access Program to find out.
Another notable addition is the incorporation of live translation capabilities. This innovative feature was showcased during Meta Connect 2024, with CEO Mark Zuckerberg engaging in a conversation with UFC fighter Brandon Moreno in two different languages – English and Spanish. The AI-driven translation process takes place via Meta servers, rendering translated audio through the glasses’ speakers while also being posted to the phone app.
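The translation flow described above — speech captured by the glasses, translated on Meta’s servers, then played through the speakers and mirrored to the phone app — can be sketched roughly as follows. All names and the toy phrasebook are assumptions for illustration, not Meta’s implementation.

```python
def translate_on_server(text: str, src: str, dst: str) -> str:
    """Stand-in for the round trip to Meta's servers (hypothetical)."""
    phrasebook = {("es", "en"): {"hola": "hello", "gracias": "thank you"}}
    # Fall back to tagging the original text when no translation is known.
    return phrasebook.get((src, dst), {}).get(text.lower(), f"[{dst}] {text}")

class Glasses:
    """Toy model of the glasses' role: speak the translation, log it to the app."""
    def __init__(self) -> None:
        self.phone_app_log: list[str] = []   # transcript mirrored to the phone app

    def handle_speech(self, text: str, src: str, dst: str) -> str:
        translated = translate_on_server(text, src, dst)
        self.phone_app_log.append(translated)  # posted to the phone app
        return translated                      # rendered via the speakers

g = Glasses()
print(g.handle_speech("hola", "es", "en"))
```

The notable structural choice is that translation happens server-side rather than on the glasses, which keeps the wearable lightweight at the cost of a network round trip per utterance.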
While the technology itself is not new, its implementation in smart glasses makes live translation more tangible and relatable for everyday use. It will be fascinating to see how it performs in real-world scenarios, particularly how natural and immediate conversations conducted through the feature actually feel.
Additionally, the software update brings Shazam integration to the Meta Ray-Ban smart glasses, enabling users to identify the current song playing by simply asking “Hey Meta, what is this song?” This latest development marks another significant milestone for wearable tech as it continues to blur the lines between humans and machines.
The new features are currently available exclusively in the US and Canada. To access them, users must sign up for the Early Access Program and install the v11 software update, which began rolling out on December 16th.
Source: http://www.forbes.com