Bridging the Interpretation Gap in Accessibility Testing: Empathetic and Legally Aware Bug Report Generation via Large Language Models
HCI Today summarized the key points
- This article introduces the HEAR framework, which turns mobile accessibility test results into narratives from both user and legal perspectives.
- Even when existing accessibility tools detect violations reliably, they often provide only low-level outputs such as JSON logs, making their meaning difficult for non-experts to grasp.
- HEAR explains the real harm behind each problem through UI-context reconstruction with visual evidence, persona injection reflecting disability characteristics, and multi-layer causal reasoning.
- In experiments analyzing 103 cases across four Android apps, the generated reports maintained factuality and consistency while increasing empathy, urgency, and awareness of legal risk.
- However, since HEAR can inherit errors from the detection tools, it is better viewed as a complementary layer that aids understanding and prioritization rather than as a replacement for technical logs.
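The pipeline described above (low-level detection output → persona-framed narrative prompt) can be illustrated with a minimal sketch. The JSON field names, the rule name, and the prompt wording are hypothetical placeholders, not HEAR's actual format, which the article does not specify:

```python
import json

def build_narrative_prompt(violation_json: str, persona: str) -> str:
    """Illustrative sketch: turn a raw accessibility-scanner JSON record
    into a persona-injected narrative prompt for an LLM.
    All field names ('screen', 'element_id', 'bounds', 'rule') are
    hypothetical, chosen only to show the transformation step."""
    v = json.loads(violation_json)
    return (
        "You are writing an accessibility bug report.\n"
        f"Persona: {persona}\n"
        f"Screen: {v['screen']}, element: {v['element_id']} "
        f"(bounds {v['bounds']}).\n"
        f"Detected violation: {v['rule']}.\n"
        "Explain the concrete harm this causes for the persona, "
        "and note any potential legal exposure (e.g. WCAG relevance)."
    )

# Example low-level log entry, as a detection tool might emit it.
raw = json.dumps({
    "screen": "CheckoutActivity",
    "element_id": "btn_pay",
    "bounds": "[48,1720][1032,1864]",
    "rule": "MissingContentDescription",
})
prompt = build_narrative_prompt(raw, "a blind user relying on TalkBack")
print(prompt)
```

The point of the sketch is the interface, not the prompt text: the same JSON record can feed both a developer-facing log view and a decision-maker-facing narrative, which is the dual-layer idea discussed in the commentary below.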
This summary was generated by an AI editor based on HCI expert perspectives.
Why Read This from an HCI Perspective
This is worth reading because it shows how HCI can intervene not in the ‘detection’ of accessibility issues, but in their ‘interpretation’ and ‘prioritization.’ Turning technical logs into an easy-to-understand narrative is useful for reducing perception gaps among PMs, designers, and developers, and it has direct relevance to UX practice because it surfaces both real user harm and legal risk. The especially interesting part is placing LLMs as an intermediate layer—an example of connecting automation with persuasive communication.
CIT's Commentary
From a CIT perspective, this research can be seen as a ‘translator’ for accessibility bug reports. Even when existing automation tools extract strong clues centered on coordinates and code, that information often does not translate into action for decision-makers. HEAR reduces this gap by reconstructing the situation—combining contextual restoration, ability-based personas, and legal context—into a single, persuasive narrative. However, what matters here is that personas should not be used merely as empathy triggers. From an HCI standpoint, we need to manage the balance between emotional persuasiveness and factual consistency, as well as the risk of reducing specific disability experiences to fixed stereotypes. Practically, there is significant room to expand into a dual-layer reporting approach: ‘developer-oriented logs’ plus ‘decision-maker-oriented summaries.’
Questions to Consider While Reading
- Q. Can persona injection be separated in practice from the effect of improving clarity of explanations, and can it be validated with controlled experiments?
- Q. How much does describing legal risk influence prioritization decisions within an organization, and does the effect differ by role within the product organization?
- Q. In the process of making accessibility reports more emotionally persuasive, what design principles help reduce stereotypes about disability experiences?
This commentary was generated by an AI editor based on HCI expert perspectives.
Please refer to the original for accurate details.