Ray-Ban Meta smart glasses now come with multimodal AI
Ray-Ban Meta smart glasses, known for their photo-taking, video-making, livestreaming, and headphone-replacing features, have now integrated multimodal AI. The rollout follows an early-access test that began in January. Meta's head of generative AI, Ahmad Al-Dahle, announced on X: "Multimodal Meta AI is rolling out widely on Ray-Ban Meta starting today! It's a huge advancement for wearables & makes using AI more interactive & intuitive."
What is multimodal AI?
Multimodal AI is a technology that enables an AI assistant to process several types of input, such as images, video, text, and sound, rather than text alone. This allows the assistant to understand the user's environment in real time. The glasses' camera and five microphones act as the sensory organs for the AI, so users can ask it to interpret anything they see, enhancing their interaction with their surroundings.
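As a rough illustration of the idea (not Meta's actual stack), the sketch below pairs a camera frame with a question using an off-the-shelf visual-question-answering model from Hugging Face's transformers library; the model choice and image path are placeholders chosen for the example.

```python
from transformers import pipeline

# Illustrative only: an off-the-shelf VQA model stands in for the
# glasses' assistant. Model name and image path are placeholders.
vqa = pipeline("visual-question-answering",
               model="dandelin/vilt-b32-finetuned-vqa")

# One multimodal query: a camera frame (image) plus a question (text).
result = vqa(image="kitchen_counter.jpg",
             question="What ingredients are on the counter?")

# The pipeline returns candidate answers ranked by confidence.
print(result[0]["answer"], result[0]["score"])
```

The point is the shape of the interaction: a single query combines visual and textual input, and the model grounds its answer in what the camera sees.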
Potential applications of the smart glasses
The smart glasses can recognize a dog's breed or translate signs written in foreign languages, among other things; real-time translation in particular could be useful for travelers. Other potential applications include identifying ingredients scattered on a kitchen countertop and suggesting an appropriate recipe. It will take weeks of real-world testing, however, to determine how capable the technology actually is.
Hands-free video call compatibility with WhatsApp
Alongside multimodal AI, Meta has introduced hands-free video call compatibility with WhatsApp and Messenger for the smart glasses. New frame designs have also been unveiled to cater to fashion-conscious users. These stylish frames can accommodate prescription lenses and are currently available for pre-order. The Ray-Ban Meta smart glasses, now improved with multimodal AI, are priced from $300 (around ₹25,000).
Multimodal AI beta testing has shown potential
The introduction of multimodal AI into the Ray-Ban Meta smart glasses coincides with the recent launch of Humane's AI Pin, which drew negative reviews for a poor user experience. Early access testing of the multimodal AI beta on the Ray-Ban Meta smart glasses, however, has shown potential. Still, testers have noted inconsistencies in the AI's accuracy, such as misidentifying certain car models and plant species.