Meta’s Ray-Ban Smart Glasses Receive Live AI and Translation Enhancements
Meta has launched a substantial software update for its Ray-Ban Meta smart glasses. Rolled out in mid-December 2024, the update adds AI-powered video capabilities, real-time language translation, and Shazam integration.
The v11 software update lets the glasses process visual data continuously, enabling real-time responses through a new “Live AI” feature. Users can now hold natural conversations with Meta AI, which can see what the wearer sees and respond without requiring the “Hey Meta” wake word for each interaction.
The live translation feature enables real-time conversations across English, French, Italian, and Spanish. When speaking with someone in one of these languages, users hear translated audio through the glasses’ open-ear speakers, while their conversation partner can view a translated transcript on their phone. The feature works even in airplane mode, provided the language packs are downloaded in advance.
Meta has also broadened its music integration partnerships with Spotify, Amazon Music, Audible, and iHeart, allowing voice-controlled content discovery and playback directly through the glasses.
Source: Digitopia
