WHEN historians consider the electronic era, they will take a hard look at the lesson learned from a fingernail-size computer chip known as a DRAM. Americans invented the technology, dominated its markets until the early 1980s, then lost their market share to the Japanese in breathtaking fashion. Today, four of the world's top five chip-equipment companies are Japanese. Of United States corporations, only International Business Machines (IBM) remains a serious threat to Japan's dominance of this key technology.

The lesson: Japan-like commercial arrangements, in which large companies have access to markets, suppliers, and huge amounts of cash, will win the technology race virtually every time. Japanese corporations are investing so much money in chip-making research and development that at AT&T Bell Labs, which developed much of this technology, advanced-lithography chief Richard Freeman worries that the US may have already lost the race.

But technology has a funny way of rewriting history. A new and unexpected technology may allow the US to jump back into the race for the DRAM (pronounced DEE-ram, for dynamic random-access memory). DRAMs store data inside computers. Computer companies want more densely packed chips to make their machines smaller and lighter. Companies now mass-produce 4-megabit DRAM chips with tolerances of 0.8 microns (about 1/125th the width of a human hair). IBM and Toshiba, as well as other Japanese companies, have announced that they will start producing the next generation, 16-megabit chips, later this year.

The 256-megabit DRAM will require levels of precision down to 0.25 microns - so small that even the light waves used to make today's DRAMs are too big to work. So chipmakers face a choice: force-fit the current optical techniques with some very fancy tricks, or shift to X-ray lithography. The Japanese popularized the first approach, known as phase-shifting masks.
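The scale problem above comes down to simple arithmetic. As a rough sketch (the 0.365-micron wavelength is an assumption for illustration - it is the mercury i-line commonly used in optical lithography of this era, not a figure from the article):

```python
# Rough arithmetic behind the feature-size problem described above.
# WAVELENGTH_MICRONS is an assumption: the mercury i-line (0.365 microns),
# a common optical-lithography light source of the period.
WAVELENGTH_MICRONS = 0.365
HAIR_WIDTH_MICRONS = 100.0  # typical human hair width, assumed for illustration

feature_sizes = {
    "4-megabit DRAM": 0.8,
    "256-megabit DRAM": 0.25,
}

for chip, size in feature_sizes.items():
    fraction_of_hair = HAIR_WIDTH_MICRONS / size
    too_small_for_light = size < WAVELENGTH_MICRONS
    print(f"{chip}: {size} microns "
          f"(about 1/{fraction_of_hair:.0f} of a hair width); "
          f"smaller than the light wave itself: {too_small_for_light}")
```

The 0.8-micron feature is still comfortably larger than the wave that prints it; the 0.25-micron feature is not, which is why the light waves used to make today's DRAMs are "too big to work."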
Making memory chips is like taking a picture with a stencil - or mask - on the lens. Light passes through the stencil onto chemically treated silicon wafers, creating a pattern. This pattern, when etched with chemicals, forms electrical circuits. When the openings in the stencil get smaller than the length of the light wave itself, the light diffracts and distorts the pattern that chipmakers want. Several Japanese companies, and now AT&T Bell Labs and IBM as well, are experimenting with masks that include semitransparent material around the edges of the openings. In theory, this material shifts the phase of the light around each opening, canceling out the diffracted light coming through the hole. In practice, that's proving difficult.

The second approach is X-ray lithography. It has several advantages: extremely small wavelengths (0.001 microns) and more tolerance of dust on the production lines (because the X-rays go right through the particles). The challenge of the X-ray procedure is that, unlike with optical techniques, chipmakers can't make big masks and use lenses to reduce the image to the right size. Instead, they have to find ways of creating masks that are the same size as the wafers they want to etch. IBM has invested 10 years and hundreds of millions of dollars in X-ray research.

Since all the big players are pursuing both approaches, no company will gain a huge advantage when one technology wins out, says Daniel Fleming, director of IBM's semiconductor development center. But a small company, Hampshire Instruments, based in Rochester, N.Y., could upset the apple cart. Several US companies, including Motorola and AT&T, are testing Hampshire's innovative X-ray system to see if it will work in a mass-production setting. If it does, the company plans to sell to US companies first, giving them a two- to perhaps four-year head start on the Japanese, says Hampshire Instruments president Moshe Lubin. He predicts DRAM production will shift back to the US.
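The phase-shifting idea described above rests on destructive interference: delay one wave by half a wavelength (a phase shift of pi radians) and it cancels an otherwise identical wave. This minimal sketch is a toy model, not the mask physics itself - it simply adds two equal-amplitude waves, one shifted by pi, and shows the sum vanishes:

```python
import math

def wave(x, phase=0.0):
    """A unit-amplitude light wave, modeled as a sine with an optional phase."""
    return math.sin(x + phase)

# Sample the original wave and a copy delayed by half a wavelength (pi radians),
# as the semitransparent mask material would delay light near an opening's edge.
samples = [i * 0.1 for i in range(100)]
residual = max(abs(wave(x) + wave(x, phase=math.pi)) for x in samples)

# The two waves cancel everywhere, up to floating-point rounding.
print(f"max residual amplitude after cancellation: {residual:.2e}")
```

In the real mask, of course, only the diffracted fringe is targeted, and getting the shifting material's thickness and placement right is exactly what is "proving difficult."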
So the lesson of the DRAM is up for grabs again. Sometimes victory goes to the strongest competitor; sometimes it goes to the nimblest. The trick is to nurture both.