Meta’s Ray-Ban Smart Glasses Get a Boost: Live AI and Real-Time Translation

Meta has rolled out a significant v11 software update for its Ray-Ban Meta smart glasses, adding three headline features: Live AI, real-time translation, and Shazam integration.

The most notable addition, Live AI, lets users hold an ongoing conversation with Meta’s AI assistant while the glasses process video of their surroundings in real time. The feature removes the need for a wake word, so users can ask questions about their environment seamlessly, and the assistant can refer back to earlier parts of the conversation for context.

Complementing this is live translation, which translates speech in real time between English and Spanish, French, or Italian, with the translated audio delivered directly through the glasses’ speakers.

Music lovers also have something to cheer about. With a simple command, “Hey Meta, Shazam this song,” users can now identify whatever track is playing. Shazam support is available to all users in the US and Canada.

The Live AI and live translation features, however, are currently exclusive to members of Meta’s Early Access program, and Meta cautions that they are still under development and may not always deliver accurate results.

Source: TechCrunch
