
Meta has just announced the rollout of three new features for its Ray-Ban Meta Smart Glasses: live AI, live translations, and Shazam. This update is part of the company’s efforts to further enhance the capabilities of its wearable technology.
The live AI feature allows users to naturally converse with Meta’s AI assistant while it continuously views their surroundings. This means that consumers can ask the AI to suggest recipes based on ingredients they’re looking at in a grocery store, for example. The company claims that users will be able to use this feature for roughly 30 minutes on a single charge.
In addition to live AI, Meta has also rolled out live translation capabilities. Translation happens in real time and supports English, Spanish, French, and Italian. Users can choose to hear translations through their glasses or view transcripts on their phone. However, it’s worth noting that language pairs must be downloaded beforehand, and users must specify which language they speak and which their conversation partner speaks.
Shazam support is also now available for all US and Canadian users. With this feature, consumers can simply prompt the Meta AI to identify a song they’re listening to. This capability was demoed by Meta CEO Mark Zuckerberg in an Instagram reel.
To access these new features, users will need to ensure their glasses are running the v11 software and that they have installed the most recent version of the Meta View app (v196). Those who are not already part of the Early Access Program can apply on Meta’s website.
The rollout comes at a time when big tech companies like Google are pushing AI assistants as the core value proposition for smart glasses. Just last week, Google announced Android XR, a new OS for smart glasses, and positioned its Gemini AI assistant as the killer app. Meanwhile, Meta CTO Andrew Bosworth published a blog post stating that 2024 was the year AI glasses hit their stride.
In his post, Bosworth also suggests that smart glasses may be the best possible form factor for a “truly AI-native device” and the first hardware category to be “completely defined by AI from the beginning.”
Source: http://www.theverge.com