Robot theory of mind: a new study suggests that robots can learn empathy

A new study describes a robot that can predict how another robot will behave, a first step in developing so-called Theory of Mind.

Researchers at Columbia University’s Creative Machines Lab have designed a robot that can predict how another robot will behave, a first step in developing so-called “theory of mind.” The study appeared on January 11 in the journal Scientific Reports.

A robot theory of mind test

The researchers see these experiments as first steps toward imbuing robots with what cognitive scientists call “theory of mind.” In humans, this ability emerges at about age three, when children begin to understand that other people have different goals, needs, and perspectives.

When children develop theory of mind, it can lead to playful activities such as hide and seek.

But it also enables more sophisticated behaviors, like lying.

Researchers consider theory of mind a distinguishing hallmark of human and primate cognition.

It is essential for complex social interactions like cooperation, competition, empathy, and deception.

Empathy as a form of anticipating others’ behavior

Humans who live or work together quickly learn to predict the near-term actions of the people around them.

This ability to anticipate the actions of others makes it easier to live and work together.

In robots, however, this ability has remained conspicuously absent, which is what motivated the study.

It forms part of a broader effort to give robots the ability to understand and anticipate the behaviors and intentions of others, purely from visual observations.

To conduct the study, the team first built a small robot and placed it in a playpen.

They programmed the robot to look for and approach any green circles it could see.

But sometimes those green circles were blocked from the robot’s view.

A second robot observed this behavior for two hours and began to anticipate the other robot’s intentions.

This “observing robot” was eventually able to predict its partner’s goal and path 98 times out of 100, even though it was never explicitly told about the other robot’s visual handicap.
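To make the setup concrete, the sketch below is a minimal toy illustration of the idea, not the authors’ implementation: a simulated “actor” heads for the green target it can see, while an “observer” fits a simple classifier to raw top-down images of the scene in order to predict the actor’s goal. Everything here, from the grid size to the learning rate, is an illustrative assumption.

```python
# Toy sketch only (assumed setup, not the authors' implementation):
# an "actor" robot heads for the green target it can see, while an
# "observer" learns to predict the actor's goal purely from images of
# the scene, never being told that the occluder blocks the actor's view.
import numpy as np

rng = np.random.default_rng(0)
GRID = 8  # the playpen is rendered as an 8x8 top-down "image"

def make_scene():
    """Randomly place the actor, two green targets, and mark one target as occluded."""
    actor_col = rng.integers(GRID)                      # actor sits on the bottom row
    target_cols = rng.choice(GRID, size=2, replace=False)
    occluded = rng.integers(2)                          # which target the actor cannot see
    return actor_col, target_cols, occluded

def actor_goal(actor_col, target_cols, occluded):
    """Actor policy: approach the nearest *visible* green target."""
    visible = [i for i in range(2) if i != occluded]
    return target_cols[min(visible, key=lambda i: abs(target_cols[i] - actor_col))]

def render(actor_col, target_cols, occluded):
    """Top-down image with channels for the actor, the targets, and the occluder."""
    img = np.zeros((3, GRID, GRID))
    img[0, GRID - 1, actor_col] = 1.0
    for c in target_cols:
        img[1, 0, c] = 1.0
    img[2, 0, target_cols[occluded]] = 1.0
    return img.ravel()

# The observer watches many episodes, then fits a softmax classifier that maps
# the raw scene image to the column the actor will end up approaching.
X, y = [], []
for _ in range(2000):
    scene = make_scene()
    X.append(render(*scene))
    y.append(actor_goal(*scene))
X, y = np.array(X), np.array(y)

W = np.zeros((GRID, X.shape[1]))          # one score per possible goal column
for _ in range(300):                      # plain gradient descent on cross-entropy
    logits = X @ W.T
    logits -= logits.max(axis=1, keepdims=True)
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)
    p[np.arange(len(y)), y] -= 1.0
    W -= 0.1 * (p.T @ X) / len(y)

accuracy = (np.argmax(X @ W.T, axis=1) == y).mean()
print(f"observer predicts the actor's goal in {accuracy:.0%} of episodes")
```

In this toy version, the observer never receives the actor’s policy or the meaning of the occluder; it infers both implicitly from watching scenes and outcomes, which is the spirit of the experiment described above.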

A primitive form of empathy

“Our findings begin to demonstrate how robots can see the world from another robot’s perspective,” said lead author Boyuan Chen.

Chen also said this ability is “perhaps a primitive form of empathy.”

This ability will make robots more useful.

But when robots can anticipate how humans think, they may also learn to manipulate those thoughts.

“We recognize that robots aren’t going to remain passive instruction-following machines for long,” said Hod Lipson, who leads the lab. “Like other forms of advanced AI, we hope that policymakers can help keep this kind of technology in check, so that we can all benefit.”


Study: “Visual Behavior Modelling for Robotic Theory of Mind”
Authors: Boyuan Chen, Carl Vondrick, and Hod Lipson
Published in: Scientific Reports
Publication date: January 11, 2021
DOI: 10.1038/s41598-020-77918-x
Photo: Andy Kelly via Unsplash