Computing With the Speed of Light


PICTURE what would happen if, in the next second, every car and truck became a pulse of light. Traffic would zip along at amazing speed. There would be no stoplights or choke points. Cars would pile onto one another to cross a narrow bridge; at intersections, they'd run right through each other. (Light pulses don't interact with one another, so they never crash, even where their paths cross.)

What light will never do for highway traffic, it may accomplish inside computers. Researchers are trying to break down the barriers to a technique called ``optical computing.'' If they succeed, the properties of light will greatly expand the capabilities of today's fastest and most powerful machines.

Optical computing is actually a misnomer. Scientists will probably never build a true optical computer, which would work solely with photons of light instead of electrons. Instead they are working on hybrid systems that marry the two technologies. Some researchers call these machines optoelectronic computers.

Today's electron-based digital systems are very good at many kinds of calculations. But there are some calculations that an optical system can handle far better and faster. These are huge sets of simple equations, each easy enough for a sixth-grader, if he had years to work through them all. Even a supercomputer would struggle mightily, because the amount of computation grows explosively with the number of equations to be solved. An optical system could handle all the data at once, processing the information in parallel, and deliver the answer virtually instantly.
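To make that explosion concrete, here is a toy sketch (not from the article) of what a serial electronic machine must do: solving n simultaneous linear equations by ordinary Gaussian elimination takes on the order of n-cubed arithmetic steps, so the work balloons about eightfold every time the number of equations doubles. The `solve` function below is an illustrative helper, not any real optical or AT&T algorithm.

```python
import random

def solve(eqs):
    """Gaussian elimination on rows [a1..an, b]; returns (solution, step count)."""
    n = len(eqs)
    a = [row[:] for row in eqs]       # work on a copy
    steps = 0
    for col in range(n):
        # partial pivoting: move the largest available coefficient up
        pivot = max(range(col, n), key=lambda r: abs(a[r][col]))
        a[col], a[pivot] = a[pivot], a[col]
        for row in range(col + 1, n):
            factor = a[row][col] / a[col][col]
            for k in range(col, n + 1):
                a[row][k] -= factor * a[col][k]   # one multiply-subtract
                steps += 1
    x = [0.0] * n
    for row in reversed(range(n)):    # back-substitution
        s = a[row][n] - sum(a[row][k] * x[k] for k in range(row + 1, n))
        x[row] = s / a[row][row]
    return x, steps

# Two equations a sixth-grader could do: x + y = 3, x - y = 1
sol, _ = solve([[1.0, 1.0, 3.0], [1.0, -1.0, 1.0]])
print(sol)   # -> [2.0, 1.0]

# The same method on bigger systems: 10 equations take 375 elimination
# steps, 20 take 2,850, 40 take 22,100 -- roughly 8x per doubling.
for n in (10, 20, 40):
    eqs = [[random.uniform(-1, 1) for _ in range(n + 1)] for _ in range(n)]
    _, steps = solve(eqs)
    print(n, steps)
```

A serial machine must grind through those steps one after another; the article's point is that an optical processor could, in principle, work on all the equations at once.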

Computer companies continue to build faster machines by squeezing in more electronics. It's like adding another tollgate lane to a busy turnpike. Just like cars, however, electrons slow down as the highway gets crowded; worse, they begin to affect one another's movements, causing unwanted noise and interference. Photons never have that problem.

Optoelectronic computers can use light to send information from one processor to another. That's a lot like a long-distance telephone call. It starts as an electron-based signal, gets converted into photons, and travels by fiber-optic cable. When it nears its destination, it is converted back to electrons.

A call on the American Telephone & Telegraph Company (AT&T) fiber-optic network actually goes through such a conversion every 40 to 60 miles, whenever it meets an electronic repeater. AT&T plans to replace the $800,000 repeaters, which boost the signal, with cheaper optical amplifiers. That will reduce the number of these conversions and increase the system's reliability.

Merely carrying traffic, though, isn't really computing. Some optical processors also use light to make calculations. In the mid-1990s, AT&T expects to install new telephone switches that will handle signals as photons instead of electrons.

AT&T is not alone in pushing this once-visionary technology. British Telecom is working on it. Japan's government and 13 Japanese companies have launched a 10-year optical research program. OptiComp Corporation, a small company near Lake Tahoe, Nev., plans to unveil a prototype optical computer workstation this summer. It is a hybrid machine, aimed at convincing skeptics and investors that the technology has arrived.

The company, incidentally, claims that its founder, not AT&T, developed the first digital optical processor.

In some applications, the technology has already arrived. The military uses optically based systems for electronic warfare and missile guidance. Optical computers are better than traditional machines at discerning signals and images.

There are less glamorous uses too. An optical system on an assembly line could judge whether labels on, say, ketchup bottles were properly placed.

While optoelectronics is years away from catching up with electronics, its star is shining more brightly these days. One barrier to the technology is computer memory. It's of little practical use to have optics powerful enough to ``read'' a book page by page if memory can hand it out only word by word. Researchers appear to be a couple of years away from memory that can keep pace.
