Meta's smart glasses can now find your parked car
Meta has introduced a range of new features for its Ray-Ban Meta smart glasses, including the ability to locate your parked car. The update, currently available to users in the US and Canada, was announced by CTO Andrew Bosworth on Threads. It includes improved natural language recognition that eliminates the need for rigid command phrases like "Hey Meta, look and tell me"; users can now interact with the AI assistant in more conversational language.
Zuckerberg demos car finding feature in Instagram reel
In a practical demonstration of the new features, CEO Mark Zuckerberg used the reminder function to locate his car in a parking garage, showcasing the update's real-world utility in an Instagram reel. The changes aim to make Meta's smart glasses more useful in everyday situations like navigating large parking areas.
Additional AI tools added to Meta's smart glasses
The latest update also includes several other AI tools that were previewed at the recent Connect event, including voice messages, timers, and reminders. The smart glasses can now be used to instruct Meta AI to dial a phone number or scan a QR code. However, the live translation feature announced earlier is not part of this update, and Bosworth gave no specific timeline for its release.