Amazon Alexa mod turns sign language into voice commands

Amazon’s Alexa voice assistant can queue up your favorite tunes, give a digest of the day’s news and weather, and recommend movies you’re likely to enjoy. But what if you’re deaf or hard of hearing? Voice recognition systems like Alexa sometimes struggle to pick up the rhythms of deaf users, which presents a challenge for the more than 466 million people around the world with disabling hearing loss.

Luckily, software developer Abhishek Singh has a hands-free solution: an app that allows Alexa to understand and respond to sign language.

“The project was a thought experiment inspired by observing a trend among companies of pushing voice-based assistants as a way to create instant, seamless interactions,” Singh told Fast Company. “If these devices are to become a central way we interact with our homes or perform tasks, then some thought needs to be given to those who cannot hear or speak. Seamless design needs to be inclusive in nature.”

It isn’t an off-the-shelf Echo, exactly. Singh’s stack consists of an Amazon Echo connected to a laptop that handles processing; a webcam that detects signs in American Sign Language (ASL); and an artificially intelligent backend that translates those signs into voice commands Alexa can understand.
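The stack above boils down to a simple chain: webcam frame → sign classifier → voice command → Echo. A minimal sketch of the middle mapping step is below; the sign labels, phrases, and function names are entirely hypothetical (Singh's actual project runs in JavaScript in the browser), but they illustrate how a recognized sign could become a phrase for a text-to-speech step to speak at the Echo.

```python
# Hypothetical sketch of the sign-to-command step, not Singh's actual code:
# a recognized ASL sign label is mapped to a spoken phrase, which a
# text-to-speech stage would then play aloud for the Echo to hear.
SIGN_TO_PHRASE = {
    "weather": "Alexa, what's the weather?",
    "music": "Alexa, play some music.",
}

def sign_to_command(sign_label: str) -> str:
    """Map a recognized ASL sign to the voice command spoken at the Echo."""
    phrase = SIGN_TO_PHRASE.get(sign_label)
    if phrase is None:
        raise ValueError(f"unrecognized sign: {sign_label}")
    return phrase

# In the real setup the phrase would be synthesized to audio; here we only
# show the lookup.
print(sign_to_command("weather"))
```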

Singh trained a machine learning model in Google’s TensorFlow framework (specifically TensorFlow.js, an open-source library that can define and run machine learning models in a web browser) by repeatedly gesturing in front of a webcam until it learned to distinguish between his hand, arm, and finger movements. He then hooked it up to a translation engine that listened for the Echo’s responses, transcribed them, and displayed them as text.
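The training loop described here (demonstrating each sign repeatedly until the model can tell it apart) resembles the few-shot, nearest-neighbour approach of TensorFlow.js's knn-classifier module. The following is a toy sketch of that idea in Python, with small hand-written feature vectors standing in for real webcam image embeddings; none of this is Singh's code.

```python
import math

class KNNSignClassifier:
    """Toy 1-nearest-neighbour classifier over feature vectors, illustrating
    the few-shot training style described in the article. Real inputs would be
    image embeddings extracted from webcam frames, not tiny toy vectors."""

    def __init__(self):
        self.examples = []  # list of (feature_vector, label) pairs

    def add_example(self, features, label):
        """Record one demonstration of a sign."""
        self.examples.append((features, label))

    def predict(self, features):
        """Return the label of the closest stored example (Euclidean 1-NN)."""
        def dist(a, b):
            return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
        return min(self.examples, key=lambda ex: dist(ex[0], features))[1]

# "Train" by demonstrating each sign a few times, as Singh did on camera.
clf = KNNSignClassifier()
clf.add_example([0.9, 0.1], "weather")
clf.add_example([0.8, 0.2], "weather")
clf.add_example([0.1, 0.9], "music")

print(clf.predict([0.85, 0.15]))  # closest to the "weather" examples
```

Adding a new sign is just a matter of recording a few more examples, which is consistent with Singh's remark that extending the vocabulary is relatively easy.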

The setup goes a good deal beyond the accessibility features built into current-generation Echo devices. Tap to Alexa, which launched this week, allows owners of the touchscreen-equipped Echo Show to access Alexa features by tapping on the Show’s screen. And Alexa Captioning transcribes incoming messages and Alexa’s responses on Echo devices with a screen. But neither can read sign language.

Singh’s system is purely proof-of-concept; it only recognizes a handful of signs at the moment. But he told The Verge that it’s relatively easy to add new vocabulary, and that he plans to open-source the code shortly.


Singh’s app isn’t the first to leverage computer vision to translate sign language to text. In 2013, scientists at Microsoft Research Asia used the Kinect, a motion-sensing Xbox peripheral, to recognize signs as PC inputs. And in 2017, Nvidia built a messaging app with a deep neural network that performs ASL-to-English translation.

Source: Google News US Technology | Netizen 24 United States
