Part of brain that triggers 'uncanny valley' unease pinpointed
- 11 January, 2018 11:37
Credit: Osaka University
When robots are made to look more human-like, we become more empathetic towards them. Make them too realistic, however, and those positive feelings quickly turn to revulsion, sparking reactions similar to those we would have on seeing a corpse.
The unease we feel towards an android’s not-quite-human appearance and movements is known as the ‘uncanny valley’, a concept first identified in the 1970s.
The effect of a robot’s movements on this response has not been well explored, but the limited research has suggested it arises from a mismatch between the expected natural movement and an android’s actual jerky motions.
Researchers at Osaka University in Japan have now pinpointed the area of the brain triggered by uncanny robot movements by MRI scanning observers of an android and its human model. The results could not only help reduce revulsion to robots, but may help in the understanding of Parkinson’s disease.
The researchers filmed an android developed by Osaka University and Advanced Telecommunications Research Institute International called Geminoid F making 36 facial expressions. They also filmed the real human on which the android was modelled making the same expressions.
“Although the android is almost comparable to a human in appearance and movement, its movement is still slightly jerky and unnatural, due to the limitations of its joint system and actuators,” said Takashi Ikeda, lead author of the study, published last month in Scientific Reports.
The two films were shown to participants while they were in an MRI scanner.
“Visual observation of the android, compared with that of the human model, caused greater activation in the right subthalamic nucleus (STN), which plays an important role in motor control,” Ikeda added.
The STN helps make our movements smooth and fluid, and is also responsible for monitoring and evaluating errors of movement.
The researchers suggest that it monitors the naturalness of movement through visual feedback.
“Thus, when an android moves in an unnatural manner, the STN may detect this unnaturalness because an error signal results from a mismatch between a visual input and an internal model for smooth movement,” the researchers noted.
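The mismatch the researchers describe can be caricatured in a few lines of code. This is purely an illustrative sketch, not the study’s actual model: it assumes a minimum-jerk profile as a stand-in for the brain’s “internal model for smooth movement”, and treats the summed discrepancy between an observed trajectory and that prediction as the error signal.

```python
# Toy illustration (NOT the study's model): an "unnaturalness" error signal
# computed as the mismatch between an observed movement and a smooth
# internal prediction. All names and choices here are hypothetical.

def minimum_jerk(t):
    """Internal model: a smooth minimum-jerk position profile for t in [0, 1]."""
    return 10 * t**3 - 15 * t**4 + 6 * t**5

def mismatch_error(observed, n):
    """Sum of squared differences between observed samples and the
    internal model's prediction at the same time points."""
    return sum((observed[i] - minimum_jerk(i / (n - 1))) ** 2 for i in range(n))

n = 11
# Human-like motion: matches the internal model exactly.
smooth = [minimum_jerk(i / (n - 1)) for i in range(n)]
# Android-like motion: the same trajectory quantised into jerky steps,
# mimicking the limits of a joint system and actuators.
jerky = [round(p * 5) / 5 for p in smooth]

print(mismatch_error(smooth, n))  # zero: no error signal
print(mismatch_error(jerky, n))   # positive: mismatch flags "unnatural" motion
```

In this caricature, the smooth trajectory produces no error while the stepped, android-like one does, which is the sense in which a mismatch between visual input and an internal model could yield the STN activation the researchers report.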
In Parkinson’s disease the STN is affected, leaving some patients with rigid movements.
“Our study attests to commonalities between the movements of the android and a Parkinson’s disease patient,” the researchers wrote. “The android’s movements were rigid and akinesic in a comparable way to the movements of a patient with mild Parkinson’s disease.”
The results may help in “elucidating the pathology” of the disease, the researchers added.
Human-like androids are already in use in medical and healthcare settings – one is currently employed in a pilot study to run mock job interviews with patients with autism spectrum disorder.
It is important in such use cases that users aren’t repulsed by their robot assistants.
“However, a critical issue is to identify the kinetic and visual features that are responsible for uncanny feelings. It is markedly difficult to improve an android with highly complex structure through trial and error,” Ikeda said.
“The findings of the neural mechanisms of uncanny feelings should guide the development of human-friendly artificial social agents in efficient and practical ways,” he added.