It's touch and go before a touch robot can 'go'

John Purbrick struts over to a workbench in his cluttered ninth-floor office, presses his finger on a small rubbery pad, and watches as an image of the imprint of his finger lights up on a nearby screen.

What he has produced is a ''touch picture'' - a seemingly simple visual image of the imprint of his finger. But it is actually one small part of a technological challenge now being tackled in laboratories around the world: to produce touch-sensitive devices for robots.

''Present robots are set up to go to a certain place and do a certain task,'' says the denim-clad researcher at the Massachusetts Institute of Technology's Artificial Intelligence Laboratory. ''What we hope to do next is provide them with sensors that enable them to obtain information from the environment. . .''

If ''touch-sensitive robots'' sounds like humanoids that would tan at the beach or laugh if tickled, they're not - though a computer program could be written to make a robot chuckle if you brushed against it. Instead, the touch sensors being developed in labs are much simpler: they tell a robot the shape of an object, how hard its grippers are clutching, and whether something is slipping from its grasp.

These are small clues, but many experts believe such clues will be essential to the next generation of factory automation. When combined with ''vision'' and other sensing capabilities, touch sensors are expected to help produce a new kind of ''intelligent'' robot capable of performing tasks ranging from assembling tiny circuit board parts to repairing nuclear power plants.

Rudimentary touch-sensitive devices exist in abundance now. The refrigerator button that pops the light on when the door is open is a simple one. On the factory floor, a few robot grippers have sensors capable of gauging the force with which they grab an object.

But the new generation of tactile sensors being developed more closely mimics, in a primitive way, the capabilities of human touch. The aim is to come up with robots that can ''feel'' the shapes of objects, identify them, and respond to the task at hand.

Development of another ''sense'' for robots, vision, is further along, but still at an early stage. Most industrial robot optics now in use can't tell whether a fork is lying prong side up or down.

At Carnegie-Mellon University, scientists have developed a rubbery sensor pad that can distinguish among six objects, ranging from a battery to a wrench.

John Purbrick's work, though different, achieves similar results. He uses a two-inch-square pad containing 256 electronic sensors protected by caulking material. When an object is pressed against it, the sensors pick up its outline and, through different intensities of light on the screen, show how hard it is being pressed.
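In other words, the pad reduces touch to an array of pressure readings that can be drawn as a picture. Here is a minimal, hypothetical sketch in Python of that idea - the 16-by-16 grid layout, the contact threshold, and the character shading are invented for illustration; the article does not describe MIT's actual software. Readings below the threshold count as ''no contact,'' and harder presses get brighter marks.

    # Hypothetical sketch: rendering a 16 x 16 grid of pressure readings
    # (256 sensors, as on the MIT pad) as a crude "touch picture".
    # The grid size, threshold, and shading are invented for illustration.

    GRID = 16            # 16 x 16 = 256 sensing sites
    THRESHOLD = 0.15     # pressures below this are treated as "no contact"
    SHADES = " .:-=+*#"  # dimmer-to-brighter characters for harder presses

    def touch_picture(pressures):
        """pressures: list of 256 values from 0.0 (no touch) to 1.0 (hard press)."""
        rows = []
        for r in range(GRID):
            row = ""
            for c in range(GRID):
                p = pressures[r * GRID + c]
                if p < THRESHOLD:
                    row += " "   # outside the object's outline
                else:
                    # map pressure to a brightness level, like the light
                    # intensities shown on Purbrick's screen
                    level = min(int(p * len(SHADES)), len(SHADES) - 1)
                    row += SHADES[level]
            rows.append(row)
        return "\n".join(rows)

    # Example: a fingertip-sized blob pressed near the center of the pad
    if __name__ == "__main__":
        readings = [0.0] * GRID * GRID
        for r in range(6, 10):
            for c in range(6, 10):
                readings[r * GRID + c] = 0.8
        print(touch_picture(readings))

Run as written, the example prints a small bright square where the simulated fingertip pressed - a toy version of the imprint that lights up on the screen in Purbrick's office.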

The aim of all this is to produce a generation of more flexible ''steel collar'' workers. At present, the vast majority of robots neither ''see'' nor ''feel''; most can only open and close hand grippers. If the materials they're handling aren't perfectly oriented when coming down the assembly line, the robots can't perform their task.

Giving them the ability to ''feel'' objects would allow robots to handle small assembly jobs - for instance, installing typewriter springs - and dip into bins to pluck out parts. It would also allow them to ''adapt to the workplace environment.'' If the robot picked up a bolt that was upside down, the machine could ''feel'' the problem and correct it.
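A hypothetical sketch of the feedback loop that upside-down-bolt example implies, continuing in Python: every routine and number below is an invented stand-in, not anything reported from the labs. The robot feels the part, guesses its orientation from the width of the imprint, flips it if needed, and then carries on.

    # Hypothetical sketch of touch-based correction; the names and the 40-site
    # contact threshold are placeholders for a real robot's sensing and motion routines.

    def classify_orientation(pressure_map):
        """Guess which way a bolt faces from its touch imprint.
        Assumption for illustration: the flat head presses on many more
        sensor sites than the threaded end, so a wide imprint means head down."""
        contact_sites = sum(1 for p in pressure_map if p > 0.1)
        return "head_down" if contact_sites > 40 else "head_up"

    def handle_bolt(pressure_map, flip_part, insert_part):
        """Feel the bolt, fix its orientation if needed, then continue assembly."""
        if classify_orientation(pressure_map) == "head_down":
            flip_part()      # the robot "feels" the problem and corrects it
        insert_part()

    # Example with dummy actions standing in for real gripper commands
    if __name__ == "__main__":
        imprint = [0.5] * 60 + [0.0] * 196   # 60 of 256 sites in contact: a wide imprint
        handle_bolt(imprint,
                    flip_part=lambda: print("flipping bolt"),
                    insert_part=lambda: print("inserting bolt"))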

Researchers want, ultimately, to put sensors on multijointed fingers, combining the sense of touch with finger dexterity. But there are limits. ''We won't get to the point of duplicating a human hand,'' says Fred Renner, a researcher at Battelle Laboratories. ''But we might get hands that have several fingers, a reasonable amount of dexterity, and enough sensing capability to compensate for deviations on the job,'' such as righting an upside-down bolt.

It won't be until at least the late 1980s that many tactile-sensing robots will inhabit the workplace. Producing a robot hand with enough touch sensitivity and dexterity to screw a nut and bolt together is probably five years away, says Purbrick.
