Robotic Einstein learns to put on a happy face

He smiles, looks sad, shows surprise, and he learned how to make these facial expressions all by himself (sort of).

A new baby on the block? No, it's a robotic head, crafted to look like Albert Einstein. And researchers have programmed it to train itself to figure out how to arch an eyebrow, wrinkle a nose, and furrow a brow, all in the name of mimicking human facial expressions.

The work is being done by researchers at the University of California at San Diego's Machine Perception Lab.


Robotics researchers have long taken inspiration from Kismet, a mechanical head at MIT that is programmed to generate facial expressions and respond with the correct one to a human "care-giver."

The goals of such work are twofold: to devise ways of making humans more comfortable as interactions with machines become more common; and to develop mechanical analogs that scientists can use to test ideas about how babies expand their capacity to respond to their new world.

Kismet, who resides at MIT's artificial intelligence lab, has been the mechanical pioneer in this field. Scientists and engineers there have developed a head only a hardware store-owner could love. But they've devised ways of programming Kismet to react to someone the way a baby might.

Kismet's sound and vision devices allow it to detect sight and sound cues from a human, then change its facial expression -- the direction of its eyes, tilt of its ears, the position of its mouth, for instance -- in ways a human would recognize and that constitute a realistic response to the human's initial smile, compliment, laugh, or scold.

Now, researchers at UCSD have taken facial expressions a step further by having Einstein learn them himself, rather than having them programmed for him. You can find a PDF of the formal research paper describing the results here. Or you can read a plain-English description of their work here.

The team wrote software that helped Einstein interpret the facial expressions in videos of humans. Then, using the images as a reality check, the robotic head used a form of facial "babbling" to gradually organize random movements into specific expressions -- much as a baby might.
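The babbling process can be pictured, very loosely, as a trial-and-error search: jiggle the face motors at random, score the result against a target expression, and keep only the changes that improve the match. The sketch below is purely illustrative -- the motor count comes from the article, but the scoring function, parameters, and hill-climbing loop are assumptions for demonstration, not the UCSD team's actual machine-learning method.

```python
import random

NUM_MOTORS = 27  # motors governing facial expressions, per the article

def similarity(pose, target):
    """Score how closely a motor pose matches a target expression (0..1)."""
    return 1.0 - sum(abs(p - t) for p, t in zip(pose, target)) / len(pose)

def babble(target, steps=5000, rate=0.05, seed=0):
    """Randomly perturb one motor at a time, keeping changes that improve the match."""
    rng = random.Random(seed)
    pose = [0.5] * NUM_MOTORS            # start from a neutral face
    best = similarity(pose, target)
    for _ in range(steps):
        i = rng.randrange(NUM_MOTORS)    # pick one motor to "babble"
        trial = pose[:]
        trial[i] = min(1.0, max(0.0, trial[i] + rng.uniform(-rate, rate)))
        score = similarity(trial, target)
        if score > best:                 # the feedback acts as the "reality check"
            pose, best = trial, score
    return pose, best

# Hypothetical target expression (in the real system, derived from video of humans)
target = [random.Random(1).random() for _ in range(NUM_MOTORS)]
pose, score = babble(target)
```

In this toy version the "reality check" is a simple distance score; in the actual research it came from comparing the robot's face against human expressions captured on video.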

One key is access to a realistic-looking face, but the work also requires some clever engineering: humans have some 44 muscles acting and interacting in subtle ways to generate the range of facial expressions people display.

In Albert's case, researchers installed 31 motors, 27 of which govern facial expressions.

The experiment is one step in a process. This time around, the good Dr. Einstein learned individual expressions. The team reports that it's now working toward a system that allows the android head to combine elements of these expressions to display more-complex expressions, such as happiness, anger, or sadness.
