Designing for Deaf Accessibility in VR

Accessibility Research / VR & Hardware Prototyping

This research explored how we might bring American Sign Language (ASL) into virtual reality in a way that feels natural, clear, and usable, especially for Deaf and Hard of Hearing (DHH) users. I led a study testing how well people understood ASL when viewed through 360° videos recorded from different body-mounted camera positions: head, shoulder, and chest. An 83.3% comprehension success rate, along with users' strong preference for signing over text, points to a real opportunity to shape more inclusive, embodied communication experiences in VR.

This study contributes to a still-growing body of accessibility research in immersive tech, a space that remains critically under-explored. For me, it was an exercise in designing with accessibility at the center, and it continues to inform how I think about inclusivity in emerging technologies.

OUTCOME

83.3% Overall Success Rate

WHAT I DID

Experimental Research / TLX Survey / Qualitative Research / Synthesis / Hardware & Software Prototype

TEAM

Dr. Roshan Peiris / Ziming Li / Dr. Tae Oh

DURATION

June '24 - Present

Person Signing ASL words to a participant

Participant interpreting ASL words

VR Application