Looking technology in the eye
Researchers are designing robots with more human characteristics, like skin and moving eyes.
The MIT lab has developed a family of robots, including two famous, but now retired, ones: Cog, which tried to mimic the senses and movements of humans, and Kismet, a mechanical head with large, expressive eyes and facial features devised to socialize with humans.
The lab is working on new robots that improve on Cog and Kismet. The most recent is Cardea, a personal assistant that, among other things, can open doors.
Cardea moves using the base of a Segway Human Transporter. The Segway, a scooter with a wide base and two large wheels, was created by inventor Dean Kamen. MIT and 14 other universities developed robots that could sit atop Segways for a project sponsored by the US Defense Advanced Research Projects Agency (DARPA).
Cardea is being designed to work with humans, and future applications might include an assistant for elder care, a host robot in buildings, and a personal assistant for office work. It has one arm, but eventually will have three and will be able to manipulate objects while in motion. Two of the arms could carry groceries, for instance, and the third could open a door.
As robots become more integrated into people's lives, researchers say, they will need to look more human in order to be accepted, particularly if they work in households or care facilities, or play an emotional role such as companion. To that end, researchers have been refining the physical traits of robots.
The Kismet robotic head, for example, looked mechanical, but its large eyes made it seem cute and friendly. Machines shouldn't look too human, however, scientists say. "There's a 'creep' factor if it looks too real," Edsinger quips.
Jeff Weber, an engineer at the MIT lab, is building a more expressive robot with a smaller head and eyes than Kismet's. It will be able to move its mouth more freely and tilt its head sideways to appear curious. It also will have a mechanical system controlling its arms so that they spring back if pushed down or if they hit something.
"A lot of the technology has to do with interacting with the environment without hurting someone," Weber says. He expects to build a basic head and two arms by this summer.
The University of Tokyo recently reported advances in creating an artificial skin for robots. The skin is made up of several layers, including a plastic film, a rubbery material, and a thin metallic layer. It is flexible and contains 1,000 embedded organic transistors, electronic components that can sense pressure.
The transistors still are not reliable enough for everyday use, says Dr. Sakurai of the University of Tokyo. He expects the skin to be used in products in five to 10 years. By then, a square foot of it could cost about $10, and it may include the ability to sense temperature, he says.
The Korea Advanced Institute of Science and Technology is designing robotic systems for the disabled or elderly, a surgical robot system, an assistant for the disabled in the workplace, and a service robot. The center will receive $1 million annually for nine years from the Korean Science and Engineering Foundation.
In 30 to 50 years, 30 percent of populations in many countries will be over the age of 65, says Dr. Zeungnam Bien, director of the robot welfare research center, in an e-mail. "It will be very important for this class of people to lead independent lives ... and various forms of welfare robotic systems will be the means of sustaining society," he says, adding that such systems will be ready in about 20 years.
Kazuhiko Kawamura, director of the Center for Intelligent Systems at Vanderbilt University, is focusing on how robots learn and communicate among themselves. Vanderbilt also developed a robot for the DARPA project using the Segway machine.
Mr. Kawamura's team is working to get that robot to communicate with another robot, called ISAC (Intelligent Soft Arm Control system), using wireless technologies. His lab also is teaching robots behaviors, such as finding a coffee cup and lifting it.
"We've made great progress in the last 20 years or so in integrating the body of a robot with sensors. We see this in the Sony and Honda robots," he says. "But this century, the challenge will be to integrate the robot's body and mind." That is an area where repetitive learning may someday turn into artificial intelligence, so a robot can learn and think on its own.