Uncanny Valley: Will we ever learn to live with artificial humans?
How Japan's AKB48 has created a new level of artificial human – and what it tells us about the infamous Uncanny Valley.
A new member recently joined the Japanese band AKB48. Pop blogs and a magazine cover story introduced Aimi Eguchi as a sweet 16-year-old from a Tokyo suburb. But there was something strange about this new girl.
Her incandescent looks and sterling voice won Aimi instant attention and the center spot in a candy ad. Yet questions arose when she appeared in two video spots. Aimi seemed stiff and awkward – not nervous, exactly, but unnatural.
"There was a weird reaction to it," says Zac Bentz, a writer and reviewer for the online music store HearJapan.com. "Within a week or two, people were already saying, 'Well, this doesn't look right.' "
She always looked straight at the camera, bore an uncanny resemblance to other girls in the band, and had a peculiar beauty. Unsettled fans pestered the group, trying to figure out more about this girl, but she never appeared in public.
After a few weeks, AKB48 admitted that Aimi was computer generated. They took the nose, eyebrows, hair, and lips of six bandmates and digitally stitched them into a new singer.
Fans had unwittingly tumbled into the "Uncanny Valley," the idea that people feel uneasy about things that appear nearly human but actually aren't real. While Americans often associate the phenomenon with animated movies, Japan has increasingly pushed the bounds in the music world. Aimi is not the first computer-animated character to chase pop stardom, and many assume "she" will not be the last.
"It was the next step in a long history of fabricating singers," says Mr. Bentz. The movement "is very Japancentric still, but it is a huge, huge thing over there."
Mapping the Uncanny Valley
While animators try to climb out of the Uncanny Valley, researchers still struggle to understand the science behind it. The term is now 40 years old, yet evidence for the phenomenon remains mostly anecdotal, says Ayse Pinar Saygin, who this summer took the field a significant step forward. The professor of cognitive science at the University of California, San Diego, led an international team of researchers that discovered that brains perceive humans and impostors in very different ways.
The team wanted to see how subjects would react to Repliee Q2, one of the most realistic androids in the world (pictured above). At certain angles, the machine could easily be mistaken for a Japanese woman. Once it starts moving, however, the illusion is shattered. Repliee has perhaps the most sophisticated set of motors and joints in an android to date. Yet it's still obviously a machine.
But forget conscious-level observations – Dr. Saygin wanted to peer deeper. She had subjects lie in a functional magnetic resonance imaging machine in London, which can track brain activity. Once inside, they watched 12 videos of Repliee performing simple tasks, such as nodding, waving, sipping water, and picking up objects from a table. The subjects then watched the same actions performed by a real woman – in fact, the researcher on whom Repliee is modeled. Finally, they saw a third round of videos, this time starring Repliee with its skin removed to reveal the machine underneath.
The metallic robot and the real woman triggered very similar results. The woman sparked more neural activity, but it mostly occurred in the same regions of the brain.
"It was the android that stuck out," says Saygin. Those scans not only saw a greater response, but watching Repliee also lit up different areas of the brain. The biggest difference lay in the region that connects the visual cortex to the motor cortex. Saygin's team concluded that all this activity must come from the brain getting very confused. The android looked human: Why didn't it move like a human?
"The brain doesn't seem to care about biological appearance or biological motion, per se," she says. "What it wants is for its expectations to be met." The human and robot both played their part. But the android fell somewhere in between – into the Uncanny Valley.
Before taking part in the study, the 20 subjects needed to confirm that they had no experience working with robots and had never visited Japan nor had friends or family there. This precaution was important, Saygin says, because robots play a more accepted role in Japanese culture. For decades, Japan has integrated humanoids into entertainment, manufacturing, and domestic care. A Japanese roboticist coined the term "Uncanny Valley" back in 1970, and the country could be the first to move past it.
"As human-like artificial agents become more commonplace, perhaps our perceptual systems will be re-tuned to accommodate these new social partners," says Saygin's study.
That's good news for Hollywood. After enough exposure to Repliee, high-definition video games, and the dead-eyed conductor from "The Polar Express" movie, maybe Americans will simply acclimate. Perhaps the Uncanny Valley doesn't need to be surmounted – it just needs to be filled in.
"Or perhaps, we will decide it is not a good idea to make [robots] so clearly in our image after all," the study adds.
Rise of the machines
Not everyone felt the heebie-jeebies around Aimi – a testament to how far computer graphics have come.
"I had no [suspicions] at all," says Yoshida Koki, an AKB48 fan in Kyoto, Japan. "On the contrary, I thought she was real."
The marketing stunt will not affect his opinion of the band, says Mr. Yoshida in an e-mail, but he thinks groups still shouldn't try to trick their fans.
"They should tell them," he says. "After all, one way or another it's going to come to light."
Yoshida echoes the opinions of many fans. It seems the band will face no blowback – at least not in Japan, where it's had eight No. 1 singles in the past two years, including one after the Aimi affair.
While this avatar may be the first computerized starlet to temporarily pass as a human, another singer has taken the stage as a proudly fake pop princess. With cyan pigtails down to her ankles and eyes wider than her fists, Hatsune Miku is undeniably a cartoon. Nonetheless, "she" performs for stadiums of ecstatic fans across Japan – and even had a show in Los Angeles this summer.
Miku sings and dances on stage as a hologram. Her moves and appearance come carefully crafted by marketers, but her voice is available for hire. Crypton Future Media sells home software that allows anyone to write songs for Miku to sing. The voice is based on a real singer, then modulated by Yamaha's Vocaloid software to make each word hit the right note.
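Vocaloid's internals are proprietary, but the core trick of "making each word hit the right note" – pitch quantization – can be sketched in a few lines. This is a minimal illustration of the idea, not Yamaha's actual code; the function names and the 440 Hz reference pitch are assumptions for the example.

```python
import math

A4 = 440.0  # assumed reference pitch (A above middle C), in Hz


def nearest_note_hz(freq):
    """Snap a frequency to the nearest equal-temperament semitone.

    Pitch-correction software conceptually does this: measure the sung
    frequency, find the closest "legal" note on the 12-tone scale, and
    shift the audio to land on it.
    """
    # Number of semitones above or below A4, rounded to a whole semitone
    semitones = round(12 * math.log2(freq / A4))
    # Convert back: each semitone multiplies frequency by 2^(1/12)
    return A4 * 2 ** (semitones / 12)


def correction_ratio(freq):
    """Multiplicative pitch-shift ratio needed to reach the target note."""
    return nearest_note_hz(freq) / freq
```

A voice sung slightly flat at 430 Hz, for example, would be nudged up to 440 Hz, while 261 Hz snaps to middle C (about 261.63 Hz). Real systems do far more – they track pitch over time and preserve vibrato – but the note-snapping step is the heart of it.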
"There's actually a huge range of characters on this Vocaloid software," says Bentz. "They each have their own voice and tones" – even back stories.
Crypton's website describes Miku as a 16-year-old (5-foot-2 and 92 pounds) who specializes in pop/dance songs between 70 and 150 beats per minute. The company, one of several that sells Vocaloid characters, also offers 20-year-old Megurine Luka for Latin jazz songs and the adolescent pop-rock duo Rin and Len.
In an age when living stars lean on Auto-Tune – software that corrects or artfully distorts pitch – these Vocaloid personas fit right in, says Bentz.
"People who are really good at it can make a really convincing-sounding [song]," he says.
Miku (or at least Crypton) even released a No. 1 single last year and appeared in a Toyota commercial.
Japan's humanoid acts include androids as well, although machine performers have yet to achieve true stardom. Tokyo's HRP-4 "Divabot" waves, steps, and sings to a Vocaloid beat in front of human backup dancers. This simulacrum hasn't hit the big time, though. It's stuck in the robotics-conventions circuit for now.
Lest Americans tease Japan for its artificial tastes, Bentz reminds people that Western pop stars "are still kinda fabricated and overproduced. But at least they are usually real people."