A new step forward for robots
Engineers decode human balance to build walking robots.
Mr. Pratt says that humans can react to stumbles very quickly – faster than most machines. When you fall or lose your balance, it takes only about 0.43 seconds to respond. Current robots take as much as 0.6 seconds. That’s a lot of time when you’re tumbling.
“It’s really a key requirement when you’re talking about push recovery,” he says. To Pratt, the impressive thing about PETMAN and Big Dog – which he did not work on – is the speed at which they can move their legs in several directions.
On top of that, human legs don’t really flex as they swing. They act more like a pendulum, swinging relatively freely until you put weight on them.
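The pendulum picture can be made concrete with a back-of-the-envelope calculation – a simplified model for illustration, not anything from Pratt's work. Treating the leg as a uniform rod swinging freely from the hip gives a natural step time close to what relaxed human walking actually shows:

```python
import math

def swing_period(leg_length_m: float) -> float:
    """Time for one forward swing of a leg modeled as a uniform rod
    pivoting freely at the hip. Simplified illustration only."""
    g = 9.81  # gravitational acceleration, m/s^2
    # Uniform rod pivoted at one end: full period T = 2*pi*sqrt(2L / 3g)
    full_period = 2 * math.pi * math.sqrt(2 * leg_length_m / (3 * g))
    return full_period / 2  # one forward swing is half a period

# A 0.9 m leg swings forward in roughly 0.8 seconds, in the same
# ballpark as an unhurried human step.
print(round(swing_period(0.9), 2))
```

The point of the passive-pendulum view is that the swing costs almost no energy – the leg's own dynamics set the pace, and muscle only intervenes when the foot loads.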
At that point, instinct kicks in. Your feet can land anywhere in a wide area without fear of losing balance, because they shift to keep stable. Yet most robots still have feet more reminiscent of a tripod’s stumps than an animal’s paws. Pratt says the key was figuring out how to program the robots so that they readjust after their feet hit the ground.
His robot, M2V2, was started at the Leg Lab. Since then, he has improved the design and can get the machine to stand on one foot, and even shift side to side when elbowed.
M2V2’s nimble feet came from Bucknell University, where interim dean of engineering Keith Buffinton and his team designed a foot with pressure sensors. The sensors give M2V2 enough information to judge how to adjust its balance. A flat-footed robot can’t do that.
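The basic idea behind a pressure-sensing foot can be sketched in a few lines. The sensor positions, readings, and layout below are hypothetical – this is not Bucknell's actual design or algorithm, just the standard center-of-pressure computation such sensors enable:

```python
from typing import Sequence, Tuple

def center_of_pressure(
    positions: Sequence[Tuple[float, float]],
    forces: Sequence[float],
) -> Tuple[float, float]:
    """Force-weighted average of sensor positions, in the foot's frame.
    A balance controller can compare this point against the edges of
    the foot and shift weight before the robot starts to tip."""
    total = sum(forces)
    x = sum(f * px for (px, _), f in zip(positions, forces)) / total
    y = sum(f * py for (_, py), f in zip(positions, forces)) / total
    return x, y

# Four hypothetical sensors at the corners of a foot (meters).
corners = [(-0.05, -0.10), (0.05, -0.10), (-0.05, 0.10), (0.05, 0.10)]
# More load on the front two sensors: the pressure point moves forward.
print(center_of_pressure(corners, [10.0, 10.0, 30.0, 30.0]))  # → (0.0, 0.05)
```

A flat, rigid foot reports nothing about where the load sits, which is why it can't support this kind of readjustment after touchdown.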
All of this has uses beyond creating humanoid robots. Chris Atkeson, a professor at the Robotics Institute at Carnegie Mellon University, says his goal is to find out why older people tend to fall. If he could properly simulate that in a robot, he could test theories and develop new ways to help. “This is about understanding people,” he explains.
Similarly, in Japan, ASIMO serves a higher purpose than just entertaining during a press conference. Honda spokeswoman Alicia Jones notes that ASIMO’s development helped pave the way for devices designed to assist older people – a big concern in a country with a rapidly aging population.
Mr. Atkeson’s group employs motion-capture technology – similar to what Hollywood uses for realistic computer graphics – to figure out exactly how humans operate. He says there are two things that immediately distinguish humans from current-generation robots.
First is the ability to “damp” motion. If you slap a person’s palm, for instance, the hand will move a little, but only for a second before it goes back into position.
“Humans are well damped,” he says. “We have to find a way to absorb impact energy” in robots.
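What “well damped” means can be shown with a toy mass-spring-damper simulation – illustrative numbers, not measurements from Atkeson's lab. A heavily damped system absorbs a nudge and settles; a lightly damped one keeps oscillating:

```python
def simulate_nudge(damping: float, steps: int = 2000, dt: float = 0.001):
    """Mass-spring-damper hit with an impact velocity:
    x'' = (-k*x - c*x') / m. Returns final displacement.
    Toy model of a hand slapped out of position."""
    m, k = 1.0, 400.0   # mass (kg) and stiffness (N/m), chosen arbitrarily
    x, v = 0.0, 1.0     # at rest position, with sudden impact velocity
    for _ in range(steps):
        a = (-k * x - damping * v) / m
        v += a * dt     # semi-implicit Euler step
        x += v * dt
    return x

# Near-critical damping (2*sqrt(k*m) = 40): the "hand" returns to
# position almost immediately, like a well-damped human limb.
human_like = simulate_nudge(damping=40.0)
# Light damping: still oscillating two seconds later.
robot_like = simulate_nudge(damping=2.0)
print(abs(human_like) < abs(robot_like))
```

The engineering problem Atkeson describes is getting a robot's joints to behave like the first case – soaking up impact energy instead of ringing.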
Second, robot motors stay in low gear all the time in order to generate enough power, so they end up moving very slowly. That’s a big reason robots struggle to match human reaction speed.
Another reason to make machines look humanoid is that, for robots to be useful, they have to move through a world designed for people. Houses aren’t built for wheels or creatures wider than they are tall.
Pratt adds that a bipedal robot could even replace humans in risky jobs such as space exploration, where legs make machines a lot more effective on rough terrain.
On such ground, a legged robot – with either two or four feet – would have had no trouble, Pratt says.