Interactive robots should not be passive companions but active partners – like therapy horses that respond to human emotion – say researchers at the University of Bristol.
Equine-assisted interventions (EAIs) offer a powerful alternative to traditional talking therapies for patients with PTSD, trauma and autism, who may struggle to express and regulate their emotions through words alone.
The study, presented at CHI '25: Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems, held in Yokohama, recommends that therapeutic robots should likewise display a degree of autonomy, rather than one-dimensional demonstrations of friendliness and compliance.
"Most social robots today are designed to be obedient and predictable – following commands and prioritizing the user's comfort. Our research challenges that assumption."
Ellen Weir, lead author, from Bristol's Faculty of Science and Engineering
In EAIs, individuals communicate with horses through body language and emotional energy. If a person is tense or unsettled, the horse resists their cues. When the person becomes calm, clear and confident, the horse responds positively. This "living mirror" phenomenon helps participants recognize and regulate their emotional states, improving both inner wellbeing and social interactions.
However, EAIs require highly trained horses and specialist facilities, making them expensive and inaccessible to many.
Ellen continued: "We found that therapeutic robots should not be passive companions but active partners, like the horses used in EAIs.
"Just as horses respond only when a person is calm and emotionally regulated, therapeutic robots should resist engaging when users are stressed or upset, requiring emotional regulation before they respond. By doing so, these robots could mirror that healing effect."
This approach could transform robotic therapy by helping users develop self-awareness and emotion-regulation skills, just as horses do in EAIs.
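To make the contingent-engagement idea more concrete, the following is a minimal illustrative sketch (not taken from the study) of how a robot might gate its responses on an estimated measure of user arousal; the EmotionEstimate class, the CALM_THRESHOLD value and the behaviour labels are hypothetical assumptions introduced purely for illustration.

    # Illustrative sketch only: a hypothetical gating policy in which a therapeutic
    # robot engages only once an estimated arousal signal indicates calm, loosely
    # mirroring how horses respond to emotionally regulated handlers.
    from dataclasses import dataclass

    @dataclass
    class EmotionEstimate:
        arousal: float  # 0.0 (calm) to 1.0 (highly stressed); how this is sensed is assumed

    CALM_THRESHOLD = 0.4  # hypothetical tuning parameter

    def choose_robot_behaviour(estimate: EmotionEstimate) -> str:
        """Return a behaviour label based on the user's estimated emotional state."""
        if estimate.arousal > CALM_THRESHOLD:
            # Resist engagement: withdraw and wait for the user to self-regulate.
            return "withdraw_and_wait"
        # Engage once the user appears calm and regulated.
        return "approach_and_engage"

    if __name__ == "__main__":
        print(choose_robot_behaviour(EmotionEstimate(arousal=0.8)))  # withdraw_and_wait
        print(choose_robot_behaviour(EmotionEstimate(arousal=0.2)))  # approach_and_engage

In practice, how the user's emotional state is sensed and where such a threshold should sit would themselves be open research questions.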
Beyond therapy, this concept could influence human–robot interaction in other areas, such as robots for social skills training, emotional coaching, or even workplace stress management.
A key question is whether robots can truly replicate – or at least complement – the emotional depth of human–animal interactions. Future research should investigate how robotic behavior can foster trust, empathy and subtle attunement, ensuring that these machines support emotional wellbeing in a meaningful way.
Ellen added: "The next challenge is designing robots that can interpret human emotions and respond dynamically, as horses do.
"We also need to consider the ethical implications of replacing sentient animals with machines. Could a robot ever offer the same therapeutic value as a living horse? And if so, how do we ensure that these interactions remain ethical, effective and emotionally authentic?"
Source: University of Bristol
Journal reference:
Weir, E., et al. (2025). "You can fool me, you can't fool it!": Insights from equine-assisted interventions to inform therapeutic robot design. CHI '25: Proceedings of the 2025 CHI Conference on Human Factors in Computing Systems. doi.org/10.1145/3706598.3714311.