Alexa, Google Assistant, Cortana: voice assistants are changing the way we shop, search, communicate and even live. At least for most people. But what about those without a voice? What about those who cannot hear? Around 466 million people worldwide have disabling hearing loss. With the SIGNS Project, we are raising awareness of digital accessibility and inclusion.
Many people with hearing loss use their hands to speak. And that’s all they need to talk to SIGNS. The prototype is an early-stage technology that keeps learning new signs. It’s a smart tool that recognizes and translates German Sign Language in real time and then communicates directly with a selected voice assistant (Alexa, Google Assistant or Cortana). How’s the weather tomorrow? Change the lights to blue. Find an Italian restaurant. Just speak, and SIGNS will answer.
Voice assistants use natural language processing to decipher and react to audible commands. No sound means no reaction. The SIGNS prototype bridges the gap between deaf people and voice assistants.
SIGNS is based on an intelligent machine learning framework that is trained to identify body gestures with the help of an integrated camera. These gestures are converted into a data format that the selected voice assistant understands.
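The recognition step can be pictured as matching observed gesture features against known sign templates. The sketch below is a deliberately simplified stand-in, assuming a nearest-neighbor lookup over toy feature vectors; the real SIGNS prototype uses a trained machine learning model on camera frames, and all names and values here are invented for illustration.

```python
import math

# Hypothetical sketch of the "gesture -> data" step: each known sign is
# represented by a template feature vector (toy values, not real data),
# and an observed gesture is classified as the nearest template.
SIGN_TEMPLATES = {
    "WETTER": [0.9, 0.1, 0.4],      # German sign gloss for "weather"
    "LICHT": [0.2, 0.8, 0.5],       # "light"
    "RESTAURANT": [0.5, 0.5, 0.9],  # "restaurant"
}

def classify_gesture(features):
    """Return the sign label whose template is closest to the observed features."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(SIGN_TEMPLATES, key=lambda label: dist(SIGN_TEMPLATES[label], features))

# A frame whose features resemble the "WETTER" template is recognized as that sign.
print(classify_gesture([0.88, 0.12, 0.42]))  # WETTER
```

In the actual prototype, the feature vectors would come from camera-based body and hand tracking, and the classifier would be a learned model rather than a fixed lookup.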
The voice assistant processes the data in real time and sends its reply back to SIGNS. The answer is then displayed immediately, either as text or as visual feedback.
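The round trip described above can be sketched end to end: recognized signs become a text command, the command goes to the assistant, and the reply comes back for display. This is a minimal illustration under stated assumptions; the assistant is mocked with a dictionary, since the real Alexa or Google Assistant APIs require credentials, and every function name below is invented.

```python
# Hedged sketch of the SIGNS round trip. The assistant backend is a stand-in
# dictionary; a real integration would call the selected voice-assistant
# service (Alexa, Google Assistant or Cortana) over its own API.
MOCK_ASSISTANT_REPLIES = {
    "change lights to blue": "Okay, lights set to blue.",
    "how is the weather tomorrow": "Tomorrow will be sunny, 22 degrees.",
}

def signs_to_command(signs):
    """Join recognized sign glosses into a plain-text command string."""
    return " ".join(signs).lower()

def query_assistant(command):
    """Stand-in for the voice-assistant request; returns the reply text."""
    return MOCK_ASSISTANT_REPLIES.get(command, "Sorry, I did not understand that.")

def display(reply):
    """SIGNS shows the answer as text (or visual feedback) instead of audio."""
    return f"[SIGNS display] {reply}"

command = signs_to_command(["CHANGE", "LIGHTS", "TO", "BLUE"])
print(display(query_assistant(command)))  # [SIGNS display] Okay, lights set to blue.
```

The key design point mirrored here is that sound never enters the loop: the user's input and the assistant's output are both handled visually.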
SIGNS is a smart tool that works on any browser-based operating system that has an integrated camera (laptops, tablets, smartphones). SIGNS can easily be connected to voice assistant services such as Alexa, Google Assistant or Cortana.
Learn how easily you can make use of signs. Just a few simple gestures are enough to be understood. Did you know that most people know more words in a foreign language than in sign language?