Have you ever felt like the hardest part of UXR is getting to a confident answer?
HCI Today summarized the key points:
- This article explores what you can reasonably say you’ve learned when data diverges in UXR.
- It notes that confusion grows when qualitative and quantitative research, as well as what users say and what they do, don’t align.
- The comments explain that the goal is to learn patterns through experience and build enough confidence to make the next decision, not necessarily to find the perfect answer.
- It also suggests that research methods define the scope of questions they can answer, and that you should set hypotheses first and then compare evidence against them.
- In the end, what matters isn’t finding complete truth, but reducing uncertainty enough to create a responsible basis for decisions.
This summary was generated by an AI editor based on HCI expert perspectives.
Why Read This from an HCI Perspective
This article precisely addresses a common UXR situation: ‘there’s plenty of data, but the conclusion is unclear.’ From an HCI perspective, it encourages us to rethink research not as a process of declaring truth, but as one of managing uncertainty to support the next decision. In particular, it connects several practical concerns—mismatches between qualitative and quantitative findings, the range of questions that the methodology can actually answer, and even how to persuade stakeholders—making it valuable for both practitioners and researchers.
CIT's Commentary
From a CIT perspective, the core of this piece is less about ‘finding the right answer’ and more about making decisions possible. In HCI research, what is observable changes fundamentally with the sample, the task context, and the measurement approach. So when qualitative and quantitative results conflict, that is often not a failure in itself but a strong sign that they are providing evidence from different layers of the question. What matters is to state competing hypotheses explicitly at the study design stage and to define in advance which evidence justifies which decision. This makes interpretation less subjective and lets you communicate it to the team in a more accountable way.
Questions to Consider While Reading
- Q. When qualitative and quantitative data seem to say different things, what criteria do you use to weigh ‘doing more research’ against ‘making a decision’?
- Q. When you set competing hypotheses in advance during the study design phase, how do you structure those questions in day-to-day work?
- Q. When stakeholders don’t trust research results, or repeatedly ignore them, how can an HCI researcher become more persuasive through stronger evidence and better ways of presenting it?
This commentary was generated by an AI editor based on HCI expert perspectives.
Please refer to the original for accurate details.