Sign languages, like spoken languages, exist in many different forms: over 300 variations are used around the globe. The World Health Organization estimates that by 2050 at least 700 million people will require care due to some degree of hearing loss. Families of deaf people can often establish effective ways of communicating with them; for others, however, there are barriers to understanding and responding to their needs. There is a growing demand in the current market for ways to enable deaf people to communicate in daily life, improve their social lives and, for some, become more confident.
The backbone of the product began to take shape during an experiment at Lowlands Visual Storytelling 2019, led by the Creative Intelligence Lab at Leiden University. In this communication game, the gestural signal was tracked using a Microsoft Kinect, producing a large amount of data. By analysing this experimental sign language data and training machine learning algorithms on it, SLAP could offer a sign language translation engine as a technical service to developers and enterprises around the world, who in turn build applications and services that improve everyday convenience and provide recreational activities for deaf people. We believe that this product can help bring down the barriers of communication.
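To give a concrete sense of the kind of pipeline described above, the sketch below shows how skeletal tracking data could feed a machine learning classifier. It is illustrative only, not the project's actual implementation: the joint count, clip length, dataset loader and class labels are assumptions, and the data here is randomly generated as a stand-in for recorded Kinect clips.

```python
# Minimal sketch (assumed, not the project's actual pipeline): classifying
# gestures from Kinect skeletal data with a standard machine learning model.
# Joint count, clip length and the dummy dataset are illustrative assumptions.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import accuracy_score

N_JOINTS = 25   # Kinect v2 tracks 25 skeletal joints
N_FRAMES = 30   # assume each gesture clip is resampled to 30 frames


def load_dummy_dataset(n_samples=200, n_classes=5, seed=0):
    """Stand-in for loading recorded Kinect clips: each sample is a
    sequence of 3D joint positions flattened into one feature vector."""
    rng = np.random.default_rng(seed)
    X = rng.normal(size=(n_samples, N_FRAMES * N_JOINTS * 3))
    y = rng.integers(0, n_classes, size=n_samples)
    return X, y


X, y = load_dummy_dataset()
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0
)

# Train a simple classifier on the flattened motion features.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)

print("held-out accuracy:", accuracy_score(y_test, clf.predict(X_test)))
```

In practice, a translation engine would replace the dummy loader with real recordings and the flat feature vectors with a sequence model, but the overall flow (track joints, extract features, train, evaluate) stays the same.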
Our customer was Tessa Verhoef. She is an assistant professor at the Media Technology programme, doing research in the field of sign language and how people communicate through gestures instead of words. We had contact with Tessa every two weeks, so that we could keep her up to date with our progress and she could give us feedback. Most of the time her feedback was very positive, as the progress we made was exactly what she wanted. The requirements for our project were very clear, because Tessa had a thorough idea of what she wanted from us. This made it possible for us to work quickly and efficiently.