Assessing the Feasibility of Augmented Reality to Support Communication Access for Deaf Students in Experiential Higher Education Contexts
HCI Today summarizes the key points
- This study examines how practical AR is for supporting communication in lab-based classes for Deaf students.
- The research team used AR smart glasses to display interpretation and captions directly in front of the students’ eyes, aiming to reduce gaze shifts compared with conventional screens.
- Results from an experiment with 12 Deaf students suggested that AR tends to help with focus and mobility during activities that require using the hands.
- However, there were also inconveniences such as the glasses’ weight, eye fatigue, and conflicts with hearing aids or cochlear implants, so it was not a solution for every situation.
- Overall, AR can be useful in lab settings, but it needs to be designed to be lighter and more stable to fit classrooms and equipment requirements.
This summary was generated by an AI editor based on HCI expert perspectives.
Why Read This from an HCI Perspective
This article treats AR not as a novelty technology but as an interaction problem: where should information be placed so that it stays safe and does not interrupt learning? In particular, it clearly illustrates the concrete tensions among distraction, mobility, and safety that Deaf students face in busy, hands-on environments like laboratories, where both gaze and hands are constantly engaged. This makes it highly meaningful for HCI/UX practitioners and researchers: it pushes you to think beyond ‘good performance’ and ask whether a system can actually be used in the field.
CIT's Commentary
The core strength of this paper is that it makes clear AR is not a one-size-fits-all solution. In a lab setting, AR can be useful because it reduces gaze switching and keeps information available even while moving. In a classroom setting, however, where people often need to read long text and then look back, tablets may be a better fit. In other words, the interface should change according to the context, not just the technology itself.

Display methods that move along with the head can also conflict with natural behaviors like nodding, potentially causing discomfort. This highlights again why state transparency and user intervention paths matter in safety-critical systems.

An interesting point is that the study offers implications not only for product design but also for research methods. To evaluate the impact of AR adoption, you need to measure more than satisfaction: gaze switching, when users intervene, and failure modes. It is also worth exploring ways to use tools like LLMs to support UX measurement and interview coding.
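The measurement idea above can be made concrete with a simple session logger. The sketch below is illustrative only: the event names (`gaze_switch`, `user_intervention`) and the logging design are assumptions for the example, not taken from the paper.

```python
from dataclasses import dataclass, field


@dataclass
class StudyEventLog:
    """Minimal timestamped event log for an AR usability session.

    Events are (time_in_seconds, kind) pairs; kinds such as
    'gaze_switch' or 'user_intervention' are hypothetical labels.
    """
    events: list = field(default_factory=list)

    def record(self, kind: str, t: float) -> None:
        # Append one event with its session-relative timestamp.
        self.events.append((t, kind))

    def count(self, kind: str) -> int:
        # Number of events of a given kind in the session.
        return sum(1 for _, k in self.events if k == kind)

    def rate_per_minute(self, kind: str) -> float:
        # Events per minute over the span from first to last event.
        if len(self.events) < 2:
            return 0.0
        duration = self.events[-1][0] - self.events[0][0]
        return self.count(kind) / (duration / 60) if duration > 0 else 0.0


# Example: a two-minute session with two gaze switches.
log = StudyEventLog()
log.record("session_start", 0.0)
log.record("gaze_switch", 12.5)
log.record("gaze_switch", 40.0)
log.record("user_intervention", 75.0)
log.record("session_end", 120.0)
print(log.count("gaze_switch"))            # 2
print(log.rate_per_minute("gaze_switch"))  # 1.0
```

A real study would attach richer context to each event (task phase, device worn, failure details), but even counts and rates like these go beyond satisfaction scores in the way the commentary suggests.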
Questions to Consider While Reading
- Q. In contexts with very different demands, such as labs, classrooms, and remote environments, how should AR, tablets, and human support be combined to be the safest and least tiring?
- Q. What interface design is needed to present information stably within the user’s gaze without relying on head-fixed displays?
- Q. What research tools and logging designs are most effective for continuously measuring indicators like gaze distraction and cognitive load in real services?
This commentary was generated by an AI editor based on HCI expert perspectives.
Please refer to the original for accurate details.