IN Robert Larrabee's world, a dust particle is a ``bowling ball'' and dandruff is a dreaded enemy. It is a world that measures things not in feet, or inches, but in micrometers -- each one-thousandth of a millimeter. (The diameter of a human hair is about 100 micrometers.) In this world, the battle for technological leadership is being waged on the head of a pin -- and in some major respects, the United States is losing to the Japanese.
It is Dr. Larrabee's task, as head of the microelectronics dimensional metrology group at the National Bureau of Standards, to keep a microscopic vigil on this world. The task is becoming harder as technology for semiconductors -- the brains that make computers run -- advances at a breakneck pace. Cramming ever more information and circuitry into these computer chips is the only way to stay competitive in world electronics, not to mention defense technology. But that poses problems for Larrabee -- and ultimately for industry.
``The dimensions now have gotten so small that it stretches the ability of light to see it,'' he says.
That affects US industry's ability to compete in high-tech -- and that, some fear, could eventually spell trouble for America's defense technology.
Shrinking the computer chip has brought computers into the home and robots into the factory. A chip the size of an average man's thumbnail can run a complex telephone system; such power would have taken a roomful of transistors in the 1960s. Some figure that by the turn of the century, you may be able to fit the computing power of 10 supercomputers into your wristwatch.
But such progress creates a host of new problems for companies making the computer chips. These are constructed on silicon wafers by making component parts of a circuit, such as transistors, as close as possible to each other, often using a process similar to printing a picture from a negative. If two parts of the circuit are too close together, they can short-circuit to each other and make the whole chip useless. If the component parts are too small, they won't be able to carry the required current, and again the chip can fail.
And then there's dust. ``Dust is a great enemy of the people who make these integrated circuits,'' Larrabee says. If a particle of dust falls onto the transistor they're making, ``it's like having a bowling ball sitting on top of something to be printed.''
Semiconductor manufacturers give the word ``clean'' a new definition. The most advanced laboratories have only one dust particle bigger than a micrometer per cubic meter of air, about the same volume as a medium-size refrigerator. Before entering the production area, people don bulky gowns from head to toe and pass through a ``hurricane room'' in which air is blown over them at 100 feet per minute. Even with such precautions, dust, lint, dandruff, or other objects manage to find their way into the room, sometimes ruining a $20 chip.
Such mistakes are very expensive for American semiconductor manufacturers. Howard Bogert, a semiconductor analyst at the computer research firm Dataquest, figures that American chipmakers throw out 15 to 20 percent of all their semiconductors because of defects. ``You're talking billions of dollars,'' he says, perhaps $4 or $5 billion in a $25 billion industry.
For every discarded chip, a manufacturer has to raise the price of the working chips. Here, the American companies are at a competitive disadvantage vis-à-vis the Japanese, who have a higher ``yield'' rate than US companies. The Japanese industry is newer than that of the US, and so are its production processes.
``That gives them an advantage in leading-edge technology,'' Mr. Bogert says, because it's the denser, more circuit-packed semiconductors that are more likely to be fouled up in the production process.
And that's where the National Bureau of Standards comes in. Larrabee and his colleagues can't do anything about dust and lint. But they can give manufacturers a benchmark to check up on their semiconductors at different stages of the manufacturing process.
Obviously it would be too expensive to check a chip at each of hundreds of production stages. But by doing it at several stages, people can take corrective action so that subsequent chips aren't fouled up by the same mistake. This has worked fairly well until recently. But now, the chip ``is getting so small that it doesn't look the way you'd expect it to under an optical microscope,'' Larrabee says.
This gets to the fundamental physics of the universe. The wavelengths of visible light are about 0.6 micrometers, about half the size of the lines in state-of-the-art semiconductors. That means the error in measuring the lines and the spaces between them is so great that the processing of the chips can get out of control.
The answer is to come up with ways to measure sub-micrometer dimensions more accurately. The bureau thinks it can stretch the current optical technology a little more, to several tenths of a micrometer, but future circuits may require more precision. The next technology -- shooting an electron beam at a computer chip and forming an image of the target -- is rife with problems.
But once the bugs are worked out with this or other new techniques, scientists hope to use electrons to measure things on an atomic scale. Noting that there are thousands of atoms in a micrometer, Larrabee observes with some understatement: ``Then we're down to another scale of dimension.''