Smells Like Fire: Exploring the Impact of Olfactory Cues in VR Wildfire Evacuation Training
HCI Today summarized the key points
- This article summarizes a study examining whether adding smoke odors to VR training can improve evacuation preparedness.
- The researchers divided 18 participants into two groups: one played a VR evacuation game with smoke odors, while the other completed the same task without any scent.
- The group exposed to the smoke odor reported feeling more immersed, perceiving the virtual environment as more realistic, and understanding the evacuation situation better.
- In both groups, participants’ confidence and knowledge improved after the experiment, specifically regarding wildfire evacuation preparedness, which items to bring, and their ability to find those items.
- The study suggests that adding sensory cues such as scent to VR can make disaster-preparedness education more vivid and potentially improve learning outcomes.
This summary was generated by an AI editor based on HCI expert perspectives.
Why Read This from an HCI Perspective
This article shows how VR can go beyond being a mere immersive screen and become an interaction tool that helps people rehearse real actions. In particular, it’s useful for HCI/UX practitioners designing disaster education or high-risk training to examine how sensory stimuli—such as scent—affect users’ sense of immersion and readiness. The key isn’t the technology itself, but how users feel, what they learn, and how that learning transfers to real situations.
CIT's Commentary
What’s interesting about this study is that it doesn’t stop at the idea that ‘adding scent makes it more realistic’; it asks what actual behavioral changes that realism can produce. At the same time, multi-sensory design may increase immersion while also heightening anxiety or fatigue, so it’s important to weigh the benefits against the burdens. And because the sample is small and focused on college students, results may differ in broader settings, such as the Korean disaster-preparedness context, where age ranges and digital proficiency vary widely. Ultimately, what matters is less how strongly the stimulus is applied than how the system is designed so that users understand the situation at the right time and can intervene when appropriate. For AI/VR systems used in disaster training, it’s especially important that the system state is clear and that paths for stopping, retrying, and getting explanations are easy to find.
Questions to Consider While Reading
- Q. If scent cues increase immersion, how can we measure anxiety or avoidance responses and design for them in tandem?
- Q. In disaster training, which interface elements are most important for helping users understand what they are learning ‘right now’?
- Q. When this kind of VR training is introduced into contexts like Naver, Kakao, or domestic public services, which user groups and device conditions should be reconsidered?
This commentary was generated by an AI editor based on HCI expert perspectives.
Please refer to the original for accurate details.