21st-Century Kitchen: Telling the Toaster How to Make the Toast
Prediction: You'll talk to your toaster. Future shock: It'll talk back.
Sometime in the next few years, perhaps by 2002, consumers will begin to install kitchens that understand human speech: not just single words like "yes" and "no," but phrases and sentences. You'll order plane tickets from a computer by phone. You'll tell the porch lights to turn on and stay on until 10 p.m.
When those speech-enabled systems arrive, you'll have the breakthroughs of 1997 to thank for them. Thanks to some innovative engineering, you no longer have to pause after each word to have a computer understand you. Now you can speak in complete sentences.
The first use of these natural-language products is dictation. This month, IBM released its version of natural-language dictation, called ViaVoice. Lernout & Hauspie Speech Products will release its natural-language command-and-control system next month.
The really exciting step will come as other manufacturers incorporate the technology into their products. Three areas are getting the most attention right now: cars, consumer electronics, and automated telephone systems. Telephone systems will use it first, partly because consumers will likely accept it there first.
"Today, people feel funny talking to their toaster," says Rick Korfin, director of strategic business development for Belgium-based Lernout & Hauspie Speech Products. "But we talk into the phone."
Talking to a computer through a telephone will become a big business. IBM already has a lab version of an airline-ticket reservation system based on speech recognition. So one day, you should be able to call up an automatic travel agent and order a ticket by saying you want to fly from New York to San Francisco tomorrow after 3 p.m. and prefer to use United Airlines. Such a system will have to go beyond transcribing words to understanding their meaning, and it will have to talk back to give you the requested information. David Nahamoo, manager of human-language technologies for IBM, predicts such systems will begin to enter the mainstream in three years.
Automobile manufacturers are also pushing forward with speech recognition. For example, prototypes already exist where you tell the car your destination and it gives you spoken directions for how to get there. Dr. Nahamoo envisions much more robust technologies within the next five years that could read e-mail to you and search voice-mail for a particular message during your morning commute. "I'm very much more optimistic this year than last year," he adds. The natural-language breakthroughs in the lab and the arrival of consumer dictation programs suggest that most of these speech-enabled applications will come about.
The talk-and-listen toaster, alas, is probably further away. (After all, why cram in a microchip and microphone when a lever and a dial have worked all these years?) But the technology may not reside on individual appliances anyway. It may sit on a central computer that automates your entire home.
Home Automated Living, a Burtonsville, Md., company, is set to release a speech-enabled system that allows people to control their appliances and activate their security systems. Hooked up to a video-cassette recorder, for example, the system lets you tell it to record a certain program at 10 p.m. Its speech recognition is not as sophisticated as the projects in the lab: it spots words it knows rather than trying to understand an entire phrase, and you have to put certain specifics, such as time and date, at the beginning of the sentence. Still, it's a step toward the kind of home where things happen when you speak.
* Send comments to firstname.lastname@example.org or visit my "In Cyberspace" forum at www.csmonitor.com