How Do Chinese Primary School Students Accept a Social Robot for English Speaking Practice? Insights from the Computers Are Social Actors (CASA) Paradigm
A sequential explanatory mixed-methods study on the acceptance of a social robot for EFL speaking practice among Chinese primary school students: Insights from the Computers Are Social Actors (CASA) paradigm
HCI Today's Summary of the Key Points
- This study investigates the extent to which Chinese primary school students accept a social robot designed for English speaking practice.
- Drawing on the Technology Acceptance Model (TAM) and the Computers Are Social Actors (CASA) perspective, the researchers examined students' intention to use the robot.
- By analyzing a survey of 436 participants together with interviews with 12 participants, the study identified which factors influence intention to use the robot.
- Overall, enjoyment and ease of use were the most important factors, while warmth, human-like appearance, and a sense of social presence increased enjoyment.
- The study suggests that for young learners to accept educational robots, strong functionality alone is not enough; the robot must also deliver emotional and social impressions.
This summary was generated by an AI editor based on HCI expert perspectives.
Why Read This from an HCI Perspective
This article examines the acceptance of educational robots not only in terms of how well they perform, but also in terms of how much children actually enjoy them and feel comfortable using them. In HCI and UX, it is important to recognize that emotions and perceived usability can strongly shape behavior. In learning technologies especially, factors such as motivation, confidence, and the quality of the interaction experience may matter more than features alone, making this study relevant to both practice and research.
CIT's Commentary
The core finding of this study is that the success of educational AI or robots depends less on how "smart" they are and more on how well they create an experience that makes learners want to talk with them. For children, whether the robot feels approachable and responds warmly may matter more than whether it looks intelligent. This also implies that the interface itself can directly shape learning attitudes. However, when translating these findings into real products, designers should consider the risk that raising expectations too high creates disappointment or misunderstanding. Rich social cues should therefore be paired with clear communication of what the system can and cannot do, making its status and limitations explicit.
Questions to Consider While Reading
- Q. Will the elements that make children perceive the robot as "friendly" remain effective over long-term use, or will the initial positive reaction fade as they become accustomed to it?
- Q. Increasing social attributes may boost learning motivation, but it can also lead to misunderstandings or overconfidence — how should this balance be designed?
- Q. If we apply these results to Korean education services or kids' AI products, where would it be more appropriate to validate them first: in school settings or in home settings?
This commentary was generated by an AI editor based on HCI expert perspectives.
Please refer to the original for accurate details.