InvestChat: How to Communicate More Easily with Voice, Touch, and Pen in an Investment Dashboard
InvestChat: Exploring Multimodal Interaction via Natural Language, Touch, and Pen in an Investment Dashboard
Key Points Summarized by HCI Today
- This article introduces InvestChat, a tablet app designed to help people easily browse the stock market.
- The app displays multiple screens together and includes an AI-powered chat feature that helps users find investment information.
- The research team tested the app with 12 novice investors to examine their user experience.
- Participants explored more comfortably by combining voice questions, tapping on the screen, and writing with a pen.
- The study shows that using multiple input methods together increases participation, and that voice-based questioning is especially helpful.
This summary was generated by an AI editor based on HCI expert perspectives.
Why Read This from an HCI Perspective
This article frames AI not as a tool that simply delivers answers, but as an interface that users actively explore—by touching, speaking, and iterating together. In high-stakes contexts like investment dashboards, where there is a lot of information and decisions matter, it shows how combining natural language, touch, and pen can improve understanding and engagement. For HCI/UX practitioners and researchers, this is a case that encourages thinking about multimodal design not as just adding features, but as shaping how users interact and how trust is formed.
CIT's Commentary
The core of this study is less the raw performance of the LLM itself than when and how users choose input modalities, and how those modalities complement one another. In domains where outcomes matter, such as investing, it feels natural to explore broadly with voice and then verify details with touch or pen, rather than forcing users into a single input method. In real services, however, the more options you provide, the more complex the interface becomes, and novice users may end up unsure what to trust and when to stop. It is therefore not enough to weigh only the benefits of multimodality; the design must also address how clearly the system state is communicated and how users can revise or reject interpretations proposed by the AI. This direction seems especially important in conservative, trust-critical contexts such as domestic financial services.
Questions to Consider While Reading
- Q. What interface cues did you design to help novice investors decide when to use natural language, touch, or pen?
- Q. When the LLM's summaries or interpretations are wrong, were there sufficient intervention paths for users to easily verify, correct, or ignore them?
- Q. When a multimodal structure is introduced into real financial services, how might the balance between learning burden and exploration freedom change?
This commentary was generated by an AI editor based on HCI expert perspectives.
Please refer to the original for accurate details.