MotionSavvy’s UNI tablet can interpret sign language

California-based MotionSavvy has developed a groundbreaking technology that combines the latest in motion sensing with mobile computing to create a sign language interpreter.

The tablet computer, called UNI, comes in a specially designed smart case that relies on a pair of cameras and Leap Motion hardware to track the position of the user's hands and fingers. The collected data is sent to an accompanying app, which translates the hand and finger movements of sign language into audible speech or text displayed on the screen. The app can also convert spoken words into written text for the deaf user to read.
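For readers curious how such a track-recognize-output pipeline might fit together, here is a minimal, purely illustrative sketch in Python. None of the names below come from MotionSavvy's software: HandFrame, classify_sign, and render_output are hypothetical placeholders, and the recognition rule is a stand-in for a real trained gesture model.

```python
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class HandFrame:
    """One snapshot of tracked hand data (coordinates are illustrative)."""
    palm_position: tuple              # (x, y, z) in millimetres
    fingertip_positions: List[tuple]

def classify_sign(frames: List[HandFrame]) -> Optional[str]:
    """Map a short sequence of hand frames to a recognized sign.

    Placeholder logic: a real recognizer would compare the motion
    trajectory against a trained gesture model.
    """
    if not frames:
        return None
    # Illustrative rule: a raised palm is read as the sign for "hello".
    if frames[-1].palm_position[1] > 200:
        return "hello"
    return None

def render_output(sign: str, speak: bool = False) -> str:
    """Show the recognized sign as on-screen text (speech synthesis omitted)."""
    text = sign.capitalize()
    if speak:
        print(f"[speech] {text}")  # a real app would hand this to a TTS engine
    return text

if __name__ == "__main__":
    # Simulated frames standing in for camera / motion-sensor input.
    frames = [HandFrame(palm_position=(0, 250, 0),
                        fingertip_positions=[(0, 260, 0)])]
    sign = classify_sign(frames)
    if sign:
        print(render_output(sign, speak=True))
```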

Sign language has its own "dialects" and "accents," and UNI can even learn each user's individual signing style, adding it to its growing database, as sketched below.
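One way this kind of per-user learning could work in principle is a gesture store that keeps labeled examples from each signer and matches new input against them. The sketch below assumes nothing about UNI's actual implementation; UserGestureDB and its feature vectors are hypothetical.

```python
import math
from collections import defaultdict
from typing import Dict, List, Tuple

class UserGestureDB:
    """Toy per-user gesture store: each sign maps to feature vectors
    contributed by that user, matched by nearest-neighbour distance."""

    def __init__(self) -> None:
        # user_id -> sign label -> list of feature vectors
        self._db: Dict[str, Dict[str, List[Tuple[float, ...]]]] = \
            defaultdict(lambda: defaultdict(list))

    def add_example(self, user_id: str, label: str,
                    features: Tuple[float, ...]) -> None:
        """Record how this particular user performs a given sign."""
        self._db[user_id][label].append(features)

    def recognize(self, user_id: str,
                  features: Tuple[float, ...]) -> str:
        """Return the stored sign whose example is closest to the new input."""
        best_label, best_dist = "unknown", float("inf")
        for label, examples in self._db[user_id].items():
            for ex in examples:
                dist = math.dist(ex, features)
                if dist < best_dist:
                    best_label, best_dist = label, dist
        return best_label

if __name__ == "__main__":
    db = UserGestureDB()
    # Two users signing "thanks" with slightly different hand positions.
    db.add_example("alice", "thanks", (0.1, 0.9, 0.3))
    db.add_example("bob", "thanks", (0.2, 0.7, 0.4))
    print(db.recognize("alice", (0.12, 0.88, 0.31)))  # -> "thanks"
```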

MotionSavvy aims to make UNI fully compliant with the Americans with Disabilities Act for the benefit of the deaf community, so that users can focus on casual, spontaneous conversation. As front-facing cameras in modern phones and tablets continue to improve, the company plans to bring the UNI software to more devices, ultimately making life easier for people with hearing disabilities.