
UX/UI

CHDH - Communication and learning helper for Deaf and hearing people

Problem

How might we use the Hololens technology to help Deaf and hearing people communicate better?

 

Project developed with  Pimchanok Sripraphan and Vivian Nwoye.

My Role

User research, user journey storyboarding, prototyping, testing, 3D animation, interaction design

2017

Project scope

In a world adapted to hearing people, those with hearing loss face a huge communication barrier. As indicated by research conducted for this project with real users, Deaf people are commonly delayed in education due to the great challenges in the early stages of learning. Not only in education but also in professional life, they are constantly deprived of the equal treatment and opportunities given to hearing people.

The goal of this project is to help the Deaf user access information at a young age and be more independent when communicating in learning or working environments. Built on HoloLens, the holographic glasses developed by Microsoft, this application enables two-way communication by translating speech into sign language displayed in the user's field of vision as an interactive holographic interface, and by providing reply options that are translated into sound output for the hearing person. It also includes a learning mode that uses image recognition to help the user acquire new vocabulary visually.

How it works

The main user is the Deaf person, who would wear the device in specific situations such as at school or at home. The HoloLens is not suited to long sessions because of its weight and battery capacity, so this project aims to aid the user during lectures or while learning vocabulary.

In the Communication Mode it translates speech into sign language, displaying the translation as a hologram in the user's view. It then recognizes the message and provides answer options that are spoken through the glasses.

In the Learning Mode the user can learn and save new words in English and sign language. Using image recognition technology, it identifies objects through the camera and displays their names in sign language and English. To ensure that the new word is accurate, it searches for and shows similar images from the web.

fig D.1 - Holographic interface.

Development process

Our team followed the Human-Centred Design process model of ISO 9241-210, involving the user in as many steps as possible.

We conducted interviews with a Deaf community member and with Deaf and Hard-of-hearing people from Canada to gather information about the user. We also ran online surveys to reach a broader range of participants.

After analysing the collected data and building the user and system requirements, we developed personas, workflows, use cases and storyboards to help us visualize the product's functions.

To start designing we held brainstorming and braindrawing sessions. We built a paper prototype with a plastic sheet attached to goggles to simulate the use of the HoloLens. With it, we conducted two iterations of rapid usability tests.

Next, we wanted to validate our product with Deaf users, but unfortunately we did not have physical access to any. For this reason we produced a demonstration video simulating the use of the app and sent it online to the users in Canada to collect feedback.

Simulation video of the results
