Now, Google's AI glasses will help the blind 'see'
A few years ago, Google drew widespread flak for integrating a camera into Glass, its smart eyewear for consumers. The project was eventually repositioned as an enterprise tool. Now, the very camera integration that was heavily criticized is being leveraged to help blind people perceive the world around them. Here's all you need to know about it.
AI-powered version of Glass
Envision, a Dutch startup that builds accessibility apps for people with visual impairments, has integrated its AI technology with Google Glass. The integration enables Glass to recognize images, text, people, scenes, and objects, and describe them aloud to the wearer. This way, the wearer can perceive most, if not all, of their surroundings.
How does the AI glass work?
The AI tech integrated with Glass analyzes images captured by the wearable to extract visual information. That information is then spoken aloud, giving the wearer a better understanding of their surroundings. This could enable a blind individual to read from a book, navigate to a store by interpreting road signs, and pick out items once there.
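The capture-analyze-speak loop described above can be sketched in broad strokes. The function names below are hypothetical, and the recognition and speech steps are stubbed out; Envision's actual models and APIs are not public.

```python
# Hypothetical sketch of the capture -> analyze -> speak loop described above.
# The recognition and speech backends are stand-ins; Envision's real models are proprietary.

def recognize(image_bytes: bytes) -> dict:
    """Stand-in for the AI analysis step: returns detected text and scene labels."""
    # A real implementation would run OCR and scene-recognition models here.
    return {"text": "EXIT", "objects": ["door", "sign"]}

def compose_description(result: dict) -> str:
    """Turn structured recognition output into a sentence suitable for text-to-speech."""
    parts = []
    if result.get("objects"):
        parts.append("I can see " + ", ".join(result["objects"]) + ".")
    if result.get("text"):
        parts.append('The text reads: "' + result["text"] + '".')
    return " ".join(parts)

def speak(sentence: str) -> None:
    """Stand-in for the speaker on the Glass; here we simply print the sentence."""
    print(sentence)

speak(compose_description(recognize(b"...camera frame...")))
```

The key design point is the middle step: raw model output is converted into a short natural-language sentence before being handed to speech synthesis, so the wearer hears a description rather than a list of labels.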
Envision claims its tech can read a number of languages
Envision claims that its tech is the fastest and most accurate optical character recognition (OCR) tool, able to recognize and transcribe text and scripts in more than 60 languages. Plus, it works on a range of surfaces, from food packaging, posters, and computer screens to QR codes, barcodes, and handwritten notes on regular paper.
Available for purchase at $1,699
Even though Google Glass lives on as an enterprise product, this tweaked AI-powered variant will be available to consumers. It is up for pre-order at $1,699 and will retail at $2,099 once shipping begins in August 2020. So, if you want assistive tech to help someone see, it's best to place an order as soon as possible.