Meta enhances its smart glasses with live AI, translation, Shazam
Meta has added three new capabilities to its Ray-Ban smart glasses: Live AI, live translation, and Shazam integration. The updates expand the glasses' utility beyond their existing role as a head-mounted camera and pair of open-ear headphones. All three features were first announced at Meta Connect 2024 earlier this year. To access them, users need to make sure their glasses are running the v11 software and that v196 of the Meta View app is installed.
Live AI and live translation: How do they work?
The Live AI feature lets you converse with Meta's AI assistant without repeating a wake word, while giving the assistant ongoing access to your surroundings. That makes hands-free interaction practical: you can ask questions or seek help while cooking or fixing something. The live translation feature translates speech in real time between English and Spanish, French, or Italian. Language pairs must be downloaded in advance, and you need to specify which language you speak and which your conversation partner speaks.
Shazam integration and availability
The Shazam integration lets the smart glasses identify songs playing nearby. All you have to do is prompt the assistant by saying, "Hey Meta, what is this song?", and the glasses' microphones will pick up the music. While Shazam support is available to all users in the US and Canada, Live AI and live translation are limited to members of Meta's Early Access Program. Interested users can apply for the program via Meta's website.