
Google’s Lookout app uses AR and AI to guide the visually impaired

Another part of Google’s recent focus on accessibility.



Augmented reality (AR) apps have so far been mostly of the Pokémon Go variety: great fun for the average user, but not all that useful. AR can be much more than that, though. What if you could use it to replace senses lost through an accident or disability?

Google has been working hard on its AI and computer vision for years now, and it’s getting spookily good at identifying objects seen by a camera. The company is now rolling those improvements into an Android app called Lookout that can help visually impaired users understand what’s around them.

Lookout is currently designed to work only with Google’s own Pixel range of smartphones. You wear your phone around your neck on a lanyard or keep it in your shirt pocket with the rear camera visible. It’s been trained to recognize things like signage, other printed text, people, and objects such as tables and chairs. It notifies the user of any potential hazards with audio cues such as ‘chair at 3 o’clock.’

There are 253 million people worldwide with some degree of vision impairment. This app should help them move around more independently and bring a level of safety back into their lives. Google says that the core of the app’s functionality is processed on the device, so it will run without internet access. It also supports four different modes that deliver context-specific information depending on what you’re doing.

From The Next Web:

If you’re getting ready to do your daily chores, you’d select “Home” and hear notifications telling you where the couch, table, or dishwasher is. It gives you an idea of where those objects are in relation to you; for example, “couch 3 o’clock” means the couch is on your right.

If you select “Work & Play” when heading into the office, it may tell you when you’re next to an elevator, or stairwell. The fourth mode, Experimental, is meant for testing out Lookout features that are still in development.

The app sounds similar to Microsoft’s Seeing AI, which launched on iOS last year and works as a ‘talking camera,’ audibly describing people and objects as the user points the camera at them. However, Lookout could prove more useful, as its various modes highlight only the objects relevant to what you’re doing, cutting out a whole lot of noise.

Lookout might work well in conjunction with another Microsoft app, Soundscape, which helps people navigate and stay aware of exactly where they are by pairing with a stereo headset and calling out the names of roads and landmarks they pass.

The app will become available to Android users in the US later this year.


Maker, meme-r, and unabashed geek with nearly half a decade of blogging experience at KnowTechie, SlashGear and XDA Developers. If it runs on electricity (or even if it doesn't), Joe probably has one around his office somewhere, with particular focus in gadgetry and handheld gaming. Shoot him an email at joe@knowtechie.com.
