Project Details


AI-enabled app describes objects for the blind

Project URL:
Project Twitter: @aipoly

Organisation Twitter: @aipoly_

  • Mobile
  • Physical Computing

285 million people worldwide have low vision or blindness, a disability costing $139 billion each year in the US alone. Advances in vision technology have been fast-paced in the last 10 years, but solutions remain bulky or reliant on the cloud, which requires a good internet connection. 

Marita Cheng, Alberto Rizzoli and Simon Edwardsson are the founders of a Melbourne-based startup called Aipoly. They are working on the current holy grail of vision assistance – a smartphone app that works even without a signal. After developing the prototype at Singularity University in the US, the co-founders launched a working app in March 2016, with impressive results.

Visually impaired users can point their phone at objects around them, and a voice describes the objects in real time (a choice of a male or female voice is available). The app integrates with Google Translate, which means that Aipoly can ‘speak’ in seven different languages.

Sighted users can teach Aipoly the names of new objects and help add to the app’s vocabulary. The app uses a convolutional neural network (CNN) – a deep learning model that runs entirely on the phone – so no internet connection is required. This makes Aipoly around 10 times faster than cloud solutions for image recognition, without using any bandwidth.
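To give a flavour of what a CNN does on-device, here is a minimal sketch of its core operation – sliding a small filter over an image to detect local patterns, all computed locally with no network access. This is an illustration only, not Aipoly’s actual implementation; the image and filter values are made up:

```python
import numpy as np

def conv2d(image, kernel):
    """Slide a filter across the image, taking a local dot product at
    each position -- the basic building block of a CNN layer."""
    ih, iw = image.shape
    kh, kw = kernel.shape
    oh, ow = ih - kh + 1, iw - kw + 1
    out = np.zeros((oh, ow))
    for y in range(oh):
        for x in range(ow):
            out[y, x] = np.sum(image[y:y + kh, x:x + kw] * kernel)
    return out

# A toy 6x6 grayscale "image": dark on the left, bright on the right
image = np.array([[0, 0, 0, 1, 1, 1]] * 6, dtype=float)

# A 3x3 vertical-edge filter: responds where brightness changes left-to-right
kernel = np.array([[1, 0, -1],
                   [1, 0, -1],
                   [1, 0, -1]], dtype=float)

features = conv2d(image, kernel)
print(features.shape)  # (4, 4) feature map; the edge shows up as nonzero values
```

A real network stacks many such filter layers (with learned weights) followed by a classifier, but every step is ordinary local arithmetic, which is why it can run on a phone without touching the cloud.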

Aipoly can recognise around 1,000 objects so far, and Edwardsson says the company aims to add at least 4,000 more soon, and is working on more complex descriptions such as ‘dog near a lamp post’ instead of just ‘dog’.

Four versions of the app have been released on the App Store, the latest of which includes a teaching tool. Try it via

Image courtesy of Aipoly

Last updated: 12th of October, 2016
