
Meta’s Ray-Ban Smart Glasses Unveil Revolutionary “AI to See” Capability
Meta has rolled out a major software update for its Ray-Ban smart glasses, introducing a feature that lets the AI-powered device see what you see. The capability, dubbed "live AI," lets Meta's assistant draw on the wearer's surroundings in real time, folding visual context into its responses.
The v11 software update began rolling out on December 16 to users in the US and Canada. To access the new features, however, you must join Meta's Early Access Program, which grants members the latest updates before they become widely available.
The headline feature is continuous interaction through live AI, which makes for a more natural, conversational experience. According to Meta, "Meta AI can see what you see continuously and converse with you more naturally than ever before." The company also suggests that, in the future, the AI will be able to offer suggestions before you even ask.
The camera on the latest generation of Meta Ray-Ban glasses captures 12MP stills, enough for a clear view of your surroundings. For image analysis, however, the device likely relies on its lower-resolution 1080p video feed.
Critics may question whether that resolution is enough to accurately identify objects in low-light environments, such as a dimly lit apartment; beta testers in Meta's Early Access Program should provide the feedback needed to refine the feature.
The update also brings live translation. The feature was demonstrated at Meta Connect 2024, where Mark Zuckerberg held a conversation with UFC fighter Brandon Moreno while the glasses translated between English and Spanish. Translation is handled on Meta's servers; the translated audio plays through the glasses' speakers, and the translation also appears in the companion phone app.
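The round trip described above (speech captured by the glasses, translated server-side, then delivered to both the speakers and the phone app) can be sketched roughly as follows. This is purely illustrative: Meta has not published its API, so every function and name here is a hypothetical stand-in, with a tiny phrasebook in place of the real server-side model.

```python
# Hypothetical sketch of a live-translation round trip. None of these names
# correspond to real Meta APIs; the "server" is faked with a tiny phrasebook.

from dataclasses import dataclass


@dataclass
class TranslationResult:
    source_text: str
    translated_text: str


# Stand-in for the server-side translation model (the real one is not public).
PHRASEBOOK = {"hola": "hello", "gracias": "thank you"}


def translate_on_server(text: str, src: str, dst: str) -> TranslationResult:
    """Pretend server call: translate word-by-word via the phrasebook."""
    translated = " ".join(PHRASEBOOK.get(word, word) for word in text.lower().split())
    return TranslationResult(source_text=text, translated_text=translated)


def live_translate(utterance: str) -> dict:
    """One utterance through the pipeline: server translates, then the result
    is routed to both output channels the article describes."""
    result = translate_on_server(utterance, src="es", dst="en")
    return {
        "speaker_audio": result.translated_text,   # played through the glasses' speakers
        "app_transcript": result.translated_text,  # shown in the companion phone app
    }
```

The point of the sketch is the fan-out at the end: a single server response feeds two outputs, audio on the glasses and a transcript on the phone, which matches the behavior shown in the Connect demo.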
Live translation itself is not new, but building it into a pair of smart glasses offers real convenience for language learners and travelers, and points to new ways AI-powered wearables could change how we interact with the world around us.
Finally, the update adds Shazam integration. Ask "Hey Meta, what is this song?" and the glasses respond with the title and artist of the track currently playing.
Source: http://www.forbes.com