
Glove-App System Translates Sign Language to Speech


Sign language translation glove and circuit board (Chen lab, UCLA)

30 June 2020. University engineers designed a glove with electronic sensors that sends signals to a smartphone app to translate American Sign Language into audible speech. Researchers from University of California, Los Angeles and Chongqing University in China describe the system and test results in yesterday's issue of the journal Nature Electronics (paid subscription required).

The team from the lab of UCLA biomedical engineering professor Jun Chen is seeking a device that lets people who do not understand American Sign Language comprehend its hand and facial gestures without a separate translator. "In addition," adds Chen in a UCLA statement, "we hope it can help more people learn sign language themselves."

American Sign Language, or ASL, is a complete natural language expressed by movements of the hands and face, and in North America it is the primary language of many people who are deaf or hard of hearing. ASL has its own rules for pronunciation, word formation, and word order, separate from English. Facial expressions and body motions supplement hand gestures to indicate, for example, that a question is being asked rather than a statement made. Fingers and hand shapes spell out words, particularly proper names, and, as with English in North America, regional variations in ASL emerge to accommodate dialects and slang.

Chen’s lab studies the integration of textiles and electronics for wearable systems, particularly when used in the emerging Internet of things connecting billions of sensors and devices. The sign language translator system uses a glove with sensors made of electrically conductive polymer yarns that stretch the length of the glove’s thumb and fingers. The authors say the flexible and stretchable polymers are inexpensive. Small, stretchable sensors are also added to the face between the eyebrows and near the mouth to capture facial expressions.

The sensors are wired to a printed controller circuit board worn on the wrist. The board then sends the signals to a smartphone app, which translates them into audible speech. The translation system is aided by a machine learning algorithm trained by deaf wearers who repeated each hand gesture 15 times. In tests of a proof-of-concept prototype, the glove-app system recognized 660 hand gestures representing letters, numbers, and words with 99 percent accuracy, and a response time of about one second.
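To make the training-and-recognition step concrete, here is a minimal sketch of gesture classification from glove sensor readings. It is an illustrative assumption, not the authors' actual algorithm: the sensor values, gesture names, and the nearest-centroid classifier are all hypothetical stand-ins for the paper's machine learning pipeline, chosen only to show how 15 repetitions per gesture could be averaged into a template and new readings matched against it.

```python
# Hypothetical sketch of glove-gesture classification (NOT the paper's
# algorithm): one stretch sensor per finger, 15 training repetitions per
# gesture, nearest-centroid matching of new readings.
import math
import random

random.seed(0)

REPS_PER_GESTURE = 15    # article: each gesture repeated 15 times in training

# Made-up "true" sensor profiles for three example letters
# (0.0 = finger relaxed, 1.0 = fully bent); real yarn-sensor
# signals would be noisy analog voltages.
PROFILES = {
    "A": [0.1, 0.9, 0.9, 0.9, 0.9],
    "B": [0.9, 0.0, 0.0, 0.0, 0.0],
    "L": [0.0, 0.0, 0.9, 0.9, 0.9],
}

def noisy_reading(profile):
    """Simulate one glove reading: the profile plus small sensor noise."""
    return [v + random.gauss(0, 0.05) for v in profile]

def train(profiles):
    """Average 15 noisy repetitions per gesture into a centroid template."""
    centroids = {}
    for name, profile in profiles.items():
        reps = [noisy_reading(profile) for _ in range(REPS_PER_GESTURE)]
        centroids[name] = [sum(col) / len(reps) for col in zip(*reps)]
    return centroids

def classify(reading, centroids):
    """Return the gesture whose centroid is closest to the reading."""
    return min(centroids, key=lambda g: math.dist(reading, centroids[g]))

centroids = train(PROFILES)
print(classify(noisy_reading(PROFILES["L"]), centroids))
```

In a real system the classified gesture would then be handed to the phone's text-to-speech engine, and a production model would need to separate 660 classes rather than three, which is where a learned classifier earns its keep over simple centroid matching.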

A brief (17 second) video from Chen’s lab demonstrates the system. The university filed a provisional patent application on the technology with Chen and two co-authors listed as inventors. Chen notes, however, that a commercial system would need a faster response time and larger vocabulary.

