
Meta has announced the rollout of three new features for its Ray-Ban smart glasses: live AI, live translations, and integration with the music recognition app Shazam. However, the first two features are limited to members of Meta's Early Access Program.
The live AI feature lets users converse naturally with Meta's AI assistant in real time while it views their surroundings. For instance, a user could ask the AI for recipe suggestions based on ingredients they are looking at in a grocery store, as Meta showcased during its Meta Connect 2024 event earlier this year. According to the company, the live AI feature will run for roughly 30 minutes of continuous use on a single charge.
In addition to live AI, Meta's smart glasses gain real-time language translation between English and Spanish, French, or Italian. Users can either hear translated speech through the glasses themselves or view transcripts on their mobile device. Before using the translation feature, users must download the relevant language pairs and specify which language they speak versus that of their conversation partner.
The Shazam integration, by contrast, is a more straightforward addition. Users can simply prompt Meta's AI when they hear a song, and it should identify the track that's playing. Meta CEO Mark Zuckerberg demonstrated the feature in an Instagram reel.
To access these new features, users must ensure their smart glasses are running v11 of the software and that they have v196 of the Meta View app installed. Users who are not yet members of the Early Access Program can apply through a designated website.
The announcement comes as major tech companies like Google push AI assistants as the core selling point for smart glasses. Just last week, Google unveiled Android XR, a new OS for smart glasses, with its Gemini AI assistant positioned as the key feature. Meta's CTO Andrew Bosworth also recently published a blog post declaring that "2024 was the year AI glasses hit their stride."
Source: http://www.theverge.com