Microchip Under His Skin

Staff writer of The Christian Science Monitor

When Kevin Warwick walks into his office, doors open, lights switch on, and a digitized voice says, "Welcome, Professor Warwick."

He could have created the same effects by tucking a smart card into the pocket of his tweed jacket. But by opting to implant a silicon chip under his skin last week, Professor Warwick claims to have become the world's first cyborg - "part man, part machine."

His daughter quips that he's crazy, and his wife says that the experiment "turns her stomach," but the soft-spoken chairman of the Cybernetics Department at the University of Reading in England doesn't sound like a sendup from a sci-fi convention.


"I come from a background of machine intelligence, looking at how intelligent machines are likely to be in the future," says Warwick, whose publications include books such as "Neural Network Engineering and Dynamic Control Systems."

"There is a school of thought that says that the way humans can keep up with machines is to have silicon implants helping our intelligence, but it's been a bit science-fictiony. I thought technically we can go at least part of the way in that direction, so I went and had a go at it. What I can do now is fairly limited, but it shows some of the possibilities," he adds.

The idea of man empowered by digital ability has always raised ethical issues and deeper questions about what it means to be human. Those issues get tougher as the distinction between man and machine blurs.

When humans started interacting with computers in a box in the 1950s, they called them tools. When students at the Massachusetts Institute of Technology in Cambridge started wearing their computers in the 1980s, they called themselves "cyborgs." Warwick's experiment brings the computer literally under the skin.

There are good reasons to do this, Warwick says: For example, such chips could connect up with the human nervous system and help people with disabilities. "Imagine yourself directly connected with a computer, with the memory capacity of that computer at your disposal. Imagine being able to visualize with X-rays, ultraviolet rays, ultrasonic rays, infrared rays - to see in every way that a computer can see: That's where the forefront of technology is," he says, in a phone interview.

But the experiment also raises ethical issues he says he's not prepared to answer as a scientist. For example, a couple e-mailed Warwick the day after his experiment was publicized in Europe to ask if he thought such a chip would be appropriate for tracking their child, who is autistic and often wanders off.

"I haven't replied yet. I dare not reply. I really do not know what to answer. You can say, technically, let's put in a chip and maybe that will limit movement for a child, but the infringement on that child's liberty is enormous, even if you are autistic," he says. "There's a danger that human values and judgments will disappear because we will perceive the world very differently."

Has the experiment changed how he perceives the world or himself? "Just being a human being, you feel physically just what you are.... With this chip, I have a link with a part of me that is now separate. It's not just me being a physical thing as I am, it's something else out there. Some people would feel the same thing in a spiritual sense," he says.

The response to Warwick's experiment has been worldwide, according to the Department of Cybernetics at Reading University, which offers information about this experiment on its Web site (www.cyber.reading.ac.uk). Comments range from frank congratulations to concern over what Warwick calls "the Big Brother issue."

"We've set off a cyborg race," he says.

In fact, that race was well under way before an inch-long glass capsule found a temporary home inside Warwick's arm.

Soft side of cyborg revolution

One of the best views from the front lines is MIT's Media Laboratory, which has been pioneering new interfaces between humans and computers since 1980. Here, the "smart" beverage machine knows how much milk to put into a hot drink by reading a tag on the bottom of your mug, and the refrigerator has a scanner that tells you when the milk is getting sour.

Call it the soft side of the cyborg revolution, where computers are on you, not in you. Here, things that think help out "like a great, old-fashioned English butler, anticipating your needs in an unobtrusive way," says Alexandra Kahn, a spokeswoman for the Media Lab. Visiting corporate sponsors in suits wander through resolutely blue-jeaned work groups gleaning ideas for new products.

"People who like computers want to stay connected, and get free of the boxes that complicate their lives. Our goal is to have people use computers in much more natural ways," says Ms. Kahn.

Projects at the Media Lab range from things that think (smart rooms, countertops, toys, computers woven into a denim jacket) to electronic ink and machines that "recognize" human emotions. Recently, the Media Lab developed an enhanced cello for Yo-Yo Ma and a sneaker that digitally sends business cards from shoe to shoe when comparably shod colleagues shake hands. Welcome to the frontier of wearable computing, where men and women are taking computers out of the box, onto their persons, and into their lives in more intimate ways than ever before. And where the big names at the edge of computing aren't Microsoft or IBM but Nike and Levi's. It's all new, edgy, and, more often than not, fun.

Try this one: You're walking down the street. Headed straight at you with outstretched arms is what's-his-name. You know you've met him. You even half-recall that he once saved you from a desperate plight. You just can't recall where, when, why, or, most important, who.

Once, you had two choices - hail a cab or duck down an alley. Instead, call up the display screen embedded in your wearable computer sunglasses, quickly reference your database of "People I once knew" images, and, if the bear hug doesn't knock the glasses onto the pavement, you'll have just enough time to blurt out, "Adam Smith! Long time since senior prom!"

In fact, computers aren't quite fast enough to provide face recognition in all circumstances. "We have one last technical barrier: To recognize a face, you need to get good resolution of the eyes and be sure to get the face in the frame correctly," says Rehmi Post, a research assistant at the Media Lab.

Technical barriers are made to be broken, and the research agenda in the Media Lab is clear. Computers close to the body are better. "Our message is: Don't build another hard plastic package; concentrate on something that will be part of your clothing, so you can come to rely on it for all the digital information you need on a daily basis," says Mr. Post.

But behind all the nifty, gee-whiz stuff are important discussions about how much technology people will accept into their lives. Commenting on the Warwick experiment, Kahn notes: "We haven't even figured out how to manufacture wearables yet. People are still creeped out by the idea of having systems in their homes."

When is humanity in doubt?

MIT sociologist Sherry Turkle has been studying how people interact with computers since the late 1970s. "These conversations show that the question of how you define the boundaries of the body has increasingly been on people's minds as they think about the computer, because people are beginning to sense that the computer is coming closer and closer to the skin," she says.

For example, she finds that when people are asked what they would be comfortable implanting if they had a chip that communicates directly with the brain, many make a distinction between instrumental knowledge, such as calculus or foreign languages, and a course on music or Shakespeare. They'll accept the chip's take on how to speak German but not on how to understand Goethe.

"It's almost as if people are trying to define what they would accept and what they wouldn't accept in terms of what makes them essentially human," she says.

The Warwick experiment is just an extension of the wearables project at MIT, Ms. Turkle says. "In fact, implanting a chip is not very far from wearing it on your glasses or having it in your ear. We find it disturbing, but the question is, will we find it disturbing in 10 years?"

Also to be resolved: What if the applications for close-to-the-skin computing turn out to be not so warm and fuzzy? Researchers at Reading warn that smart buildings could evolve into more than a cheery "good morning" for workers.

"Within businesses, individuals with implants could be clocked in and out of their office automatically," the University of Reading Web site states. "It would be known, at all times, exactly where an individual was within a building and whom they were with.... Is this what we want?"
