HOW A HORSE WHISPERER CAN HELP ENGINEERS BUILD BETTER ROBOTS

Dr. Eakta Jain’s research on human-horse interaction asks whether this long-standing relationship can teach us something about building robots designed to improve our lives.

She states, “There are no fundamental guiding principles for how to build an effective working relationship between robots and humans. As we work to improve how humans interact with autonomous vehicles and other forms of AI, it occurred to me that we’ve done this before with horses. This relationship has existed for millennia but was never leveraged to provide insights for human-robot interaction.”

Like horses thousands of years before them, robots are entering our lives and workplaces as companions and teammates. They vacuum our floors and help educate and entertain our children, and studies are showing that social robots can be effective therapy tools for improving mental and physical health. Increasingly, robots are also found in factories and warehouses, working collaboratively with human workers and sometimes even called co-bots.

As a member of the UF Transportation Institute, Jain leads the human factors subgroup, which examines how humans should interact with autonomous vehicles, or AVs.

A thematic analysis of Jain’s notes yielded findings and design guidelines that human-robot interaction researchers and designers can apply.

“Some of the findings are concrete and easy to visualize, while others are more abstract,” she says. “For example, we learned that a horse speaks with its body. You can see its ears pointing to where something caught its attention. We could build in similar types of nonverbal expressions in our robots, like ears that point when there is a knock on the door or something visual in the car when there’s a pedestrian on that side of the street.”

Dr. Jain’s findings will be presented at the upcoming ACM Conference on Human Factors in Computing Systems in Hamburg, Germany.

Article originally published on April 24, 2023 in UF News