Microsoft has released a new smartphone app — only for iPhone users in the U.S. — that uses computer vision to describe the world for the visually impaired.
Called Seeing AI, the app prompts users to point their phone's camera at people and products to learn more about them. When pointed at people, it can say who they are (if they're in the user's contacts) and how they appear to be feeling; if a person can't be identified, the app instead estimates their age.
When the phone is pointed at a product, Seeing AI can tell the user what it is, a feature made possible by scanning the product's barcode.
The app can also scan and read documents, and it recognizes U.S. currency, which is especially handy for visually impaired users who may have a hard time telling similar bills apart.
The best part is that all of this happens on the device itself, without connecting to a cloud service, so results arrive within a fraction of a second whether or not there's a good internet connection around.
However, the more powerful, experimental features, such as describing an entire scene or recognizing handwriting, do require Seeing AI to connect to the cloud.
The app uses neural networks to do its magic, the same basic technology that powers self-driving cars, drones, and more. It even takes the extra step of telling the user to move the camera left or right to get the target in shot.
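Microsoft hasn't published how Seeing AI implements this guidance, but the general idea is straightforward: once a detector returns a bounding box for the target, compare its center to the center of the camera frame and speak a directional hint. Here's a minimal, hypothetical sketch of that logic (the function name, parameters, and tolerance value are all illustrative assumptions, not Seeing AI's actual code):

```python
def framing_hint(bbox_center_x: float, frame_width: float, tolerance: float = 0.15) -> str:
    """Suggest how to pan the camera so the detected target is horizontally centered.

    bbox_center_x -- x-coordinate (pixels) of the detected target's center
    frame_width   -- width of the camera frame in pixels
    tolerance     -- fraction of frame width treated as "close enough to centered"
    """
    # Normalized offset from frame center: -0.5 (far left) .. +0.5 (far right)
    offset = bbox_center_x / frame_width - 0.5
    if offset < -tolerance:
        # Target sits left of center, so panning left brings it into the middle
        return "move camera left"
    if offset > tolerance:
        return "move camera right"
    return "target centered"
```

A real app would feed strings like these to a text-to-speech engine (VoiceOver on iOS) and apply the same check vertically.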
According to Saqib Shaikh, the tech lead on Seeing AI, the app's most commonly used feature is reading signs and menus. Speaking to The Verge, he didn't say when Seeing AI will be available for Android or to users in other countries. Guess we'll have to wait a bit longer for that information…