Meta to introduce AI capabilities in Ray-Ban smart glasses
Meta is preparing to introduce artificial intelligence (AI) capabilities to its Ray-Ban smart glasses, as reported by The New York Times. The features, in testing since December, include translation and object recognition. Users activate the smart assistant by saying "Hey Meta," followed by a command or question, and the assistant responds through speakers built into the glasses' frames.
AI performance in real-world scenarios
The New York Times tested Meta's AI during activities such as grocery shopping, driving, museum visits, and a trip to the zoo. The AI proved adept at identifying pets and artwork, but it struggled to recognize zoo animals that were far away or behind cages. The glasses also failed to correctly identify an exotic fruit, the cherimoya, despite multiple attempts.
Multilingual AI features in Meta's smart glasses
The AI in Meta's smart glasses currently supports translation in five languages: English, Spanish, Italian, French, and German. Meta is expected to continue improving these features over time. For now, the AI functionality is available only to users in the United States through an early-access waitlist.