Do humans share feelings with robots? A new study suggests they do.

A new study released by researchers in Japan reveals that humans may exhibit similar levels of immediate empathy towards robots in pain as they do towards humans. 

A human and a robot hold hands. (Courtesy of Toyohashi University of Technology)

Human empathy may extend further than we thought. In fact, humans may even have the ability to empathize with manufactured machines.

Researchers at Toyohashi University of Technology, in collaboration with Kyoto University, released a study Tuesday revealing that when shown images of human and humanoid robotic hands in painful situations, humans responded with similar immediate levels of empathy, as evidenced by recorded electrical activity in the brain.

“I think a future society including humans and robots should be good if humans and robots are prosocial,” study co-author Michiteru Kitazaki told Inverse. “Empathy with robots as well as other humans may facilitate prosocial behaviors. Robots that help us or interact with us should be empathized by humans.”

However, humans still showed stronger extended empathy toward other humans than toward robots.

This could be “caused by humans' inability in taking a robot's perspective,” the researchers say. “It is reasonable that we cannot take the perspective of robots because their body and mind (if it exists) are very different from ours.”

While these results represent the first neurophysiological evidence that humans are able to identify with the perceived pain of robots, studies of human-robot empathy are not entirely new.

In 2013, German researchers at the University of Duisburg-Essen conducted two studies measuring human empathy for robots, using different methods.

The first study measured volunteers' skin conductance levels while they watched videos of a dinosaur robot being treated affectionately or abusively.

“When a person is experiencing strong emotions, they sweat more, increasing skin conductance,” explains Live Science. “The volunteers reported feeling more negative emotions while watching the robot being abused. Meanwhile, the volunteers' skin conductance levels increased, showing they were more distressed.”

In the second study, researchers visualized volunteers' brain activity as they watched videos of first a human and then a robot being strangled with a plastic bag. Lead study author Astrid Rosenthal-von der Pütten concluded, "in general, the robot stimuli elicit the same emotional processing as the human stimuli."

As robots are further introduced into human life, it becomes increasingly important to understand human-robot interaction, a phenomenon that may be related to the humanoid nature of robots.

Over 40 years ago, Masahiro Mori developed the "uncanny valley" theory, which suggested that a "person's response to a humanlike robot would abruptly shift from empathy to revulsion as it approached, but failed to attain, a lifelike appearance." This "descent into eeriness" is what he called the uncanny valley.

Mr. Mori’s theory may be disproved in the near future as researchers continue to work towards the development of human-friendly robots that humans can identify with. But while this advance in artificial intelligence suggests human potential to empathize with robots, it’s unfair to assume that robots have the ability to empathize with humans just yet.

“True empathy assumes a significant overlap in experience between the subject of the empathy and the empathizer,” says Skye McDonald, writer for Phys.org. “We are still a long way from fully understanding the complexities of how human empathy operates, so are still far from being able to simulate it in the machines we live with.”
