
This project was created during the Seattle AR/VR Hackathon in the fall of 2018.

Over the course of 48 hours, our small team conducted research, identified a problem, designed a solution, and crafted an award-winning proof-of-concept using the Magic Leap One augmented reality headset.

When Ian Page-Echols brought the initial idea of addressing hearing loss, it resonated with me personally because I’ve watched elders in my family become disconnected as their sense of hearing faded. My Opa did not like to wear his hearing aid, but if I typed out the family’s dialog on my laptop in a large font, he became engaged and loved participating with the group.

Hearing loss is serious

Among other findings, our team was surprised to learn that individuals with hearing loss are twice as likely to suffer from depression.

Problem and goal definition

We defined our problem statement:

Elderly people who lose hearing late in life feel disconnected in group settings.

This helped us to define our goal:

Leverage augmented reality technology to increase understanding, participation, and connection by adding speech bubbles into a user’s field of view.

Product/story sketching

We decided relatively early to focus on telling a user’s story rather than building a working prototype. The story that we envisioned: a family member having trouble tracking a dinnertime conversation.

The family member with hearing loss cannot hear what the other two individuals are saying. Once they put the headset on, speech bubbles appear like closed captioning. This allows the family member to interact at the table and provide insight into exactly what ingredient is missing from great-grandma’s chili.

Interaction demo

This video demonstrates the envisioned usage of our system from the perspective of an individual with profound hearing loss.

Next Steps

The team was honored with an award for Outstanding Novelty.

Ian Page-Echols (initial pitch, UX design, Unity development)
Nicole Eskandari (research, UX design, deck development)
Philip Kobernik (story development, UX design)

The story that we told to the testers and judges resonated. We simulated a hearing-loss environment to show how disconnecting it feels, then showed how speech transcription in the field of view can help someone reconnect with the group.

This project is a proof of concept. Further research is needed to assess the needs of individuals with profound hearing loss in group settings, and the efficacy of existing solutions.

Next steps for development of a working prototype:

1. Integrate speech-to-text capability with voice identification

2. Integrate facial recognition

3. From facial recognition data, use mouth articulation to link each speaker’s face with a voice ID

4. Develop dynamic speech bubbles in accordance with existing closed-captioning UX standards
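To make step 3 concrete, here is a minimal sketch of how mouth-articulation timing could link a diarized utterance to a detected face. The data structures and names are entirely hypothetical (not from our prototype): it assumes step 1 yields utterances tagged with a voice ID and timestamps, and step 2 yields faces with time intervals of detected mouth movement.

```python
from dataclasses import dataclass

@dataclass
class Utterance:
    voice_id: str   # from speaker diarization (step 1)
    text: str
    start: float    # seconds
    end: float

@dataclass
class Face:
    face_id: str
    mouth_active: list  # (start, end) intervals of mouth articulation (step 2)

def overlap(a, b):
    """Length of the overlap between two (start, end) intervals, in seconds."""
    return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

def link_voice_to_face(utterance, faces):
    """Step 3: attribute an utterance to the face whose mouth
    articulation overlaps it most in time."""
    best, best_score = None, 0.0
    for face in faces:
        score = sum(overlap((utterance.start, utterance.end), iv)
                    for iv in face.mouth_active)
        if score > best_score:
            best, best_score = face, score
    return best

faces = [
    Face("face-A", [(0.0, 2.0)]),
    Face("face-B", [(2.5, 4.0)]),
]
u = Utterance("voice-1", "Did you add the cumin?", 2.4, 3.9)
speaker = link_voice_to_face(u, faces)
print(speaker.face_id)  # face-B, whose mouth movement overlaps the utterance
```

A working prototype would replace the dummy intervals with live diarization and landmark-tracking output, but the temporal-overlap matching itself could stay this simple.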