Looking technology in the eye
Researchers are designing robots with more human characteristics, like skin and moving eyes.
In a decade or so, people may not have to tidy their house, clean up after the dog, or even nag their spouse to do chores. A friendly, human-like robot will take care of routine tasks, and it won't whine or fight back.
If technologists' predictions bear out, this second coming of robots could be more pervasive than the first in the '60s, when industrial robots revolutionized manufacturing.
Designed to mimic the look and gestures of humans, the new breed of personal robots may eventually have artificial skin and muscles, as well as expressive eyes and faces, and they might speak more naturally.
But for this rapidly evolving field to take off, scientists will first have to improve the quality and reliability of the electronics, and companies will have to find an application that every household must have.
Perhaps it will be a robotic housekeeper, or a companion for the elderly. Right now, no one knows for sure. But one trend is discernible: machine assistants that interact with humans will come to look more like us.
"This will be bigger than the automobile market in 20 years," says Takayasu Sakurai, professor at the University of Tokyo's Institute of Industrial Science, via e-mail. Dr. Sakurai's team has developed artificial skin for robots.
Honda Motor Co., Sony Corp., and other companies have created robots that could be precursors to tomorrow's more personal robots.
Sony's QRIO, short for Quest for Curiosity, can sing and dance. Recently, Sony added the ability to run 15 yards a minute, lifting both feet off the ground for an instant, a first for a robot, the company claims. If QRIO falls, it can look from right to left and back to the front before bending its elbows and knees to push itself upright. The 23-inch-tall QRIO, which looks like a friendly astronaut, also has a video camera, sensors for balance and posture, and a CD player. Sony does not plan to sell it.
In the fall of 2000, Honda debuted ASIMO, which stands for Advanced Step in Innovative MObility and was one of the most advanced walking robots of its time. Twenty-six motors help the 4-foot, 115-pound machine climb stairs and turn corners. Honda is currently working to make the robot more intelligent. It walks at 1 mile per hour but eventually could walk three times as fast.
A question on Honda's website asks: "How much more advanced will ASIMO be in, say, 10 years?" The answer: "In 10 years, maybe ASIMO will be answering tough questions like these by itself."
Researchers may question the optimism and timing of such advanced robots, but many agree the field is moving ahead quickly. "It's amazing what Honda has been able to accomplish," says Aaron Edsinger, a graduate student in the Computer Science and Artificial Intelligence Laboratory at the Massachusetts Institute of Technology (MIT).
Advances in robotics will be driven by potential applications, researchers say. To date, most applications have been in industry, with about 770,000 robots working worldwide now, almost half of them in Japan, according to the World Robotics 2003 report by the United Nations Economic Commission for Europe (UNECE).
But sales of service robots for personal and private use are expected to almost quadruple over the next few years. By the end of 2002, sales of automated assistants, including autonomous lawn mowers and vacuum cleaners such as iRobot Corp.'s Roomba, had topped 600,000, according to UNECE. The UN group predicts that 2.1 million service robots will be sold from 2003 through 2006 and that they will increasingly become everyday tools for mankind. These figures don't even include the humanlike robots that scientists are currently developing.
Rodney Brooks, head of the MIT lab, has said the state of robotics now is where computers were in the late '70s, when they were confined to labs and hobbyists and were clunky and expensive. A Sony executive has reportedly estimated that if QRIO were to go on sale right now, it would cost about the same as a luxury car.
"But that could change in a decade if you drive down prices and find a 'killer' application, like word processing or spreadsheets in the case of computers," Mr. Edsinger says.
The MIT lab has developed a family of robots, including two famous, but now retired, ones: Cog, which tried to mimic the senses and movements of humans, and Kismet, a mechanical head with large, expressive eyes and facial features devised to socialize with humans.
The lab is working on new robots that improve on Cog and Kismet. The most recent is Cardea, a personal assistant that, among other things, can open doors.
Cardea moves using the base of a Segway Human Transporter. The Segway, a scooter with a wide base and two large wheels, was created by inventor Dean Kamen. MIT and 14 other universities developed robots that could sit atop Segways for a project sponsored by the US Defense Advanced Research Projects Agency (DARPA).
Cardea is being designed to work with humans, and future applications might include an assistant for elder care, a host robot in buildings, and a personal assistant for office work. It has one arm, but eventually will have three and will be able to manipulate objects while in motion. Two of the arms could carry groceries, for instance, and the third could open a door.
As robots become more integrated into people's lives, they will need to look more human in order to be accepted, researchers say, particularly if they work in households or care facilities, or play an emotional role such as companion. To that end, researchers have been refining robots' physical traits.
The Kismet robotic head, for example, was plainly mechanical, but its large eyes made it appear cute and friendly. Still, machines shouldn't look too human, scientists say. "There's a 'creep' factor if it looks too real," Edsinger quips.
Jeff Weber, an engineer at the MIT lab, is building a more expressive robot with a smaller head and eyes than Kismet's. It will be able to move its mouth more freely and tilt its head sideways to appear curious. It also will have a mechanism that controls the arms so that they spring back if they are pushed down or hit something.
"A lot of the technology has to do with interacting with the environment without hurting someone," Weber says. He expects to build a basic head and two arms by this summer.
The University of Tokyo recently reported advances in creating an artificial skin for robots. Made up of several layers, including a plastic film, a rubbery material, and a thin metallic layer, the skin is flexible and carries 1,000 embedded organic transistors, electronic components that can sense pressure.
The transistors still are not reliable enough for everyday use, says Dr. Sakurai of the University of Tokyo. He expects the skin to be used in products in five to 10 years. By then, a square foot of it could cost about $10, and it may include the ability to sense temperature, he says.
The Korea Advanced Institute of Science and Technology is designing robotic systems for the disabled and elderly, a surgical robot system, an assistant for the disabled in the workplace, and a service robot. Its robot welfare research center will receive $1 million annually for nine years from the Korean Science and Engineering Foundation.
In 30 to 50 years, 30 percent of the population in many countries will be over age 65, says Dr. Zeungnam Bien, director of the robot welfare research center, in an e-mail. "It will be very important for this class of people to lead independent lives ... and various forms of welfare robotic systems will be the means of sustaining society," he says, adding that such systems will be ready in about 20 years.
Kazuhiko Kawamura, director of the Center for Intelligent Systems at Vanderbilt University, is focusing on how robots learn and communicate among themselves. Vanderbilt also developed a robot for the DARPA project using the Segway machine.
Mr. Kawamura's team is working to get that robot to communicate with another robot called ISAC (Intelligent Soft Arm Control system) using wireless technologies. His lab also is teaching robots behaviors, like finding a coffee cup and lifting it.
"We've made great progress in the last 20 years or so in integrating the body of a robot with sensors. We see this in the Sony and Honda robots," he says. "But this century, the challenge will be to integrate the robot's body and mind." That is the area where repetitive learning may someday give rise to artificial intelligence, allowing a robot to learn and think on its own.