AI-enabled Smart Glasses help the blind and visually impaired

Tech works with iOS, Android and can be integrated into Google Glass

March 15, 2022

2 Min Read

Assistive technology company Envision has unveiled its latest version of Smart Glasses to help those with no or low vision see better with the power of AI.

Showcased at a recent conference from the California State University Northridge (CSUN), the Smart Glasses use AI to organize different types of information from visual cues and verbally relay that information to the user. The wearable tech reads documents aloud, identifies acquaintances, finds missing items in the house, and helps the wearer use public transportation.

The latest version is an enhanced model of the eyeglasses that debuted at the 2020 CSUN conference. Since then, Smart Glasses have been rolled out globally and trialed in over 20 countries.

“By analyzing real time user data and direct feedback from across our communities, we can constantly enrich the Envision experience and innovate our products," said Karthik Kannan, co-founder of Envision.

The improved version incorporates several features with enhanced functionalities.

  • Accurate text reading: Smart Glasses can read and translate digital and handwritten texts from various sources, including computer screens, posters, barcodes, timetables, and food packaging.

  • Optimized Optical Character Recognition (OCR): The OCR engine draws on tens of millions of data points, interpreted by the Envision Glasses and apps, for accurate image capture.

  • Third-Party App Integration: Envision created an app ecosystem, making it easier for its software to integrate with external services, such as outdoor and indoor navigation. It can also recognize over 100 currencies with the Cash Reader app.

  • Ally function: A secure video calling capability allows users to ask for help from contacts, using both Wi-Fi and mobile networks.

  • Language Capabilities: Four new Asian languages were added, bringing the total number of supported languages to 60 when connected to the internet; 26 languages are supported offline.

  • Layout Detection: Smart Glasses can contextualize a document, making it less confusing for the user to read a food menu, newspaper, road sign, or a poster.

Envision’s tech is compatible with Android and iOS systems, and can be integrated into Google Glass.

"Our mission is to improve the lives of the world’s two billion people who are blind or visually impaired by providing them with life-changing assistive technologies, products and services,” said Kannan.

This article was written by Helen Hwang, a freelance reporter for AI Business.
