Sign language is a great way for the deaf community to communicate, but unless everyone around you is fluent in the visual language, it won't be much help for conversing with hearing people. You could always fall back on lip reading and writing your answers on a notepad, but then the years you spent learning sign language are all for naught.
A prototype gadget called the HandTalk was recently shown for the first time at the Meeting of the Minds expo at Carnegie Mellon University. The HandTalk is a wearable glove that detects the motions and gestures of sign language and translates them into audio played through the user's cellphone or other mobile device.
The HandTalk works in conjunction with the Gesture Recognition Platform for Deaf Users, developed by Bhargav Bhat, Hemant Sikaria and Priya Narasimhan. The platform was recently coded into a special mobile software application that can so far only recognize around 32 words, but the team hopes to give the software a much larger vocabulary soon.
The HandTalk glove works using built-in sensors along with flexor strips on each digit. A chip detects the precise position of each digit relative to the others and translates the positioning into words. The design team plans to integrate pressure sensors and accelerometers to help expand the device's vocabulary.
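The article doesn't describe the actual HandTalk firmware, but the basic idea of mapping per-digit flex readings to words can be sketched as a simple lookup. Everything here is illustrative: the sensor range, the quantization levels, and the gesture table are all assumptions, not details from the device.

```python
# Illustrative sketch only -- NOT the HandTalk's real algorithm.
# Assumes each flex strip reports a raw value from 0 (fully
# extended) to 1023 (fully bent), one per digit.

def quantize(reading, levels=3, max_value=1023):
    """Bucket a raw flex reading into one of `levels` positions."""
    return min(levels - 1, reading * levels // (max_value + 1))

# Hypothetical gesture table: each key is a tuple of quantized
# positions for (thumb, index, middle, ring, pinky), where
# 0 = extended, 1 = half bent, 2 = fully bent.
GESTURES = {
    (0, 0, 0, 0, 0): "hello",  # open hand
    (2, 2, 2, 2, 2): "yes",    # closed fist
    (2, 0, 0, 2, 2): "peace",  # index and middle extended
}

def translate(raw_readings):
    """Map five raw flex readings to a word, if one is recognized."""
    key = tuple(quantize(r) for r in raw_readings)
    return GESTURES.get(key, "<unknown>")

print(translate([10, 20, 5, 15, 0]))            # open hand -> "hello"
print(translate([1000, 980, 1010, 990, 1023]))  # fist -> "yes"
```

A real system would need many more discrete positions plus the pressure and motion data the team mentions, which is exactly why accelerometers would expand the vocabulary beyond static hand shapes.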
The device is apparently pretty inexpensive to produce. The designers of HandTalk plan to run more extensive testing once the glove is fitted with the additional sensors, which should happen fairly soon.