Transformation in action: What it takes to automate 81% of your customer service while improving CX
HCI Today summarized the key points
- This article explains how Intercom introduced and developed Fin, its AI customer support system.
- Since 2022, Intercom has pursued an AI-first strategy to transform its support operations, and Fin now resolves more than 81% of inquiries.
- To get there, Intercom strengthened knowledge management and conversation design, and created a dedicated AI support team with new roles.
- It further improved support quality by rolling out guidance rules, task handling, and performance analytics features early, so Fin could perform well from the start.
- As a result, response speed and 24/7 coverage improved, and the support team shifted its focus from routine ticket handling to advising on customer success and what comes next.
This summary was generated by an AI editor based on HCI expert perspectives.
Why Read This from an HCI Perspective
This article shows AI not as a mere chatbot feature, but as an interaction system that reshapes the structure of customer support work itself. It is especially useful for HCI and UX practitioners because it covers not only performance outcomes, but also when humans should step in, how knowledge accumulates, and how far agents are allowed to act. The case is worth reading because it ties together more than metrics: trust, conversion, and changes to the operating model are all connected.
CIT's Commentary
What is particularly interesting is that Fin's results cannot be explained by model performance alone. Achieving an 81% resolution rate required multiple elements working together: knowledge management, conversation design, the handoff path to human agents, and a reorganization of roles. In other words, it is not about 'adding' AI; it is about redesigning the entire workflow. In services where safety matters, the more automation you introduce, the more transparent the system state must be, and users should be able to intervene at any time. There is also strong potential to expand the operating approach (initial beta, simulation testing, and post-launch analysis) into a UX measurement framework that can be validated in real products. In the Korean context, expectations for response speed may be shorter than in global markets, messenger culture stronger, and operational constraints tighter, so even the same framework may need different adjustments.
Questions to Consider While Reading
- Q. When customers use AI support, which interface elements most quickly reassure them that a human can step in at the right moment?
- Q. As automated response rates increase, which metrics should be tracked alongside user experience to keep the service from turning into 'fast but dry answers'?
- Q. How should success criteria for AI support be defined differently in Korea's customer support environment compared with global services?
This commentary was generated by an AI editor based on HCI expert perspectives.
Please refer to the original for accurate details.