Meta just announced three new features are rolling out to its Ray-Ban smart glasses: live AI, live translations, and Shazam. Both live AI and live translation are limited to members of Meta’s Early Access Program, while Shazam support is available for all users in the US and Canada.
The highly anticipated live AI feature allows users to naturally converse with Meta’s AI assistant while it continuously views their surroundings. For example, if you’re perusing the produce section at a grocery store, you’ll theoretically be able to ask Meta’s AI to suggest some recipes based on the ingredients you’re looking at. According to Meta, users will be able to use this feature for roughly 30 minutes at a time on a full charge.
On the other hand, live translation enables the glasses to translate speech in real time between English and Spanish, French, or Italian. Users can choose to either hear translations through the glasses themselves or view transcripts on their phone. It's worth noting that users will need to download language pairs beforehand and specify which language they speak versus which their conversation partner speaks.
Shazam support is a bit more straightforward. All you have to do is prompt the Meta AI when you hear a song, and it should be able to tell you what you’re listening to. You can watch CEO Mark Zuckerberg demo it in this Instagram reel.
If users don't see these features yet, they should check that their glasses are running v11 software and that the Meta View app is updated to v196. Those who aren't already in the Early Access Program can apply via this website.
The updates come just as Big Tech is pushing AI assistants as the raison d’être for smart glasses. Just last week, Google announced Android XR, a new OS for smart glasses, and specifically positioned its Gemini AI assistant as the killer app. Meanwhile, Meta CTO Andrew Bosworth has asserted that “2024 was the year AI glasses hit their stride.” In his blog post, Bosworth also stated that smart glasses may be the best possible form factor for a “truly AI-native device” and that they could be the first hardware category to be “completely defined by AI from the beginning.”
Source: www.theverge.com