Pasadena, Calif. — A milled aluminum box, 5 feet long, 8 inches high, and 14 inches deep, sits haphazardly on a workbench in the basement of Caltech's computer science building. In front of the box, a series of quietly whirring fans blows a steady stream of air through its component-crammed interior. A gray, ribbonlike cable is its only connection to the outside world.
Like a book, though, you can't judge this computer by its cover. The power encased in this box, or ones like it, could make possible a number of scientific breakthroughs in coming years.
The unprepossessing aluminum box houses a unique computer designed to solve three-dimensional problems. It's the result of collaboration between computer scientists and physical scientists at the California Institute of Technology, and with it researchers hope to attack a number of intractable scientific problems.
At the same time, researchers here may be developing the basis for a powerful United States riposte to the latest Japanese technological threat.
Caltech's computer, dubbed the Cosmic Cube, is fundamentally different from traditional machines. At every beat of its internal clock a conventional computer performs a limited number of operations. This means that at any given time, the vast majority of a computer's circuits are idle - an inefficiency that becomes more burdensome as the machine's power grows. In the past, computer designers have circumvented this by making devices that perform millions of operations per second. But supercomputers have begun to run up against the limitations inherent in this design: The cost of squeezing out a little more speed and power has grown steadily.
Thus, while most people have been finding computer power more affordable, scientists in many disciplines have been stymied because the problems they need to solve are too big for even the world's largest computers to tackle in a reasonable time and at a reasonable cost. A number of these problems involve modeling in three dimensions physical processes, ranging from events that take place in the core of an atom to the evolution of galaxies. It is specifically to tackle these problems that the Cosmic Cube was designed.
''We believe this class of machine is the only way we can get huge increases in computer power at a reasonable price. This is an exciting prospect because it could open up a number of new fields of research,'' says Geoffrey Fox, a cherub-cheeked high-energy physicist who has been a key figure in the computer's development.
While not a physical cube, the machine has an architecture like that of a cubic crystal. It consists of an array of identical microcomputers, each very similar to the one found in the IBM Personal Computer. Each of these microcomputers, called nodes, is connected to six others, just as each junction in a cubic lattice is linked to its six nearest neighbors. The current machine has 64 nodes. The Caltech group will add 200 more to it next summer. Ultimately, they hope to build a machine with 1,024 nodes.
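One standard way to wire 64 nodes so that each has exactly six neighbors is a binary 6-cube: number the nodes 0 through 63 and link every pair whose binary addresses differ in exactly one bit. The sketch below illustrates that addressing scheme; the function name is ours, not taken from the Caltech software.

```python
def neighbors(node, dim=6):
    """Return the binary-hypercube neighbors of `node`: the nodes
    whose addresses differ from it in exactly one bit position."""
    return [node ^ (1 << k) for k in range(dim)]

# Node 0 (binary 000000) is linked to the six powers of two.
print(neighbors(0))        # [1, 2, 4, 8, 16, 32]
print(len(neighbors(37)))  # every node has exactly 6 neighbors
```

Flipping one bit per hop also means any node can reach any other in at most six hops, which keeps messages short-ranged even as the machine grows.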
Partly because the Cube uses computer chips that are the most cost-effective on the market, the 64-node machine has the power of a typical supermini used for scientific work, but at one-tenth of the cost, says Charles Seitz, the machine's designer.
Another secret to the Cosmic Cube's power is that all its microprocessors work at once. The generic name for this approach is concurrent processing. There are a number of projects of this sort. But the Caltech effort is distinguished by its dedication to scientific problem solving.
The most sophisticated of the commercial supercomputer designers is Cray Research Inc., so the Cray computer has become the standard for comparison. A thousand-node Cosmic Cube would have the power of a Cray, Mr. Seitz says, but would cost only $50,000, not $5 million, to produce.
''Of course, this computer will never totally replace a Cray because it is not nearly as versatile,'' he adds. In essence, the Caltech computer sacrifices versatility for efficiency in solving specific types of problems.
''These are problems characterized by very long computation times, not because the calculations are difficult, but because they are being exercised in a very large 'world,' '' Dr. Fox elaborates.
To study complex physical processes with a computer, scientists generally devise a simplified mathematical model. They take the region they are interested in, ranging from the volume of an atom to the extent of a galaxy, and chop it into pieces. They then write mathematical equations describing the various processes that take place in each piece and how each piece affects its neighbors. When coded into and run through a computer, this can give valuable insights. But, because of the limits of computer power, scientists often find themselves limited to two dimensions.
In the Cosmic Cube each microprocessor is assigned the task of calculating what goes on in each piece of such a simulation and communicating pertinent results to adjacent nodes.
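The scheme described above can be sketched in miniature. Here a one-dimensional "world" is chopped into slices, one per node; in each step a node updates only its own cells, and the only outside values it needs are the edge cells of its two neighbors. This is a serial Python illustration of the idea, not the Cube's actual code, and the averaging rule is an arbitrary stand-in for the real physics.

```python
# Serial sketch of domain decomposition: a 1-D diffusion-style update.
# Each "node" owns one slice of the world; the references to world[i-1]
# and world[i+1] at a slice boundary stand in for the values a real
# node would receive from its neighbor, so communication stays local.

def step(world, pieces=4):
    n = len(world) // pieces
    new = world[:]
    for p in range(pieces):              # each loop body is one node's work
        lo, hi = p * n, (p + 1) * n
        for i in range(lo, hi):
            left = world[i - 1] if i > 0 else world[i]
            right = world[i + 1] if i < len(world) - 1 else world[i]
            new[i] = (left + world[i] + right) / 3.0
    return new

world = [0.0] * 8 + [9.0] + [0.0] * 7    # a spike of "heat" mid-world
for _ in range(5):
    world = step(world)                  # the spike spreads outward
```

Because every update touches only adjacent cells, doubling the number of nodes roughly halves each node's work without increasing how far any message must travel.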
In a simulation designed to see if the current theories on the basic nature of subatomic particles correctly predict the mass, size, and electrical charge of the proton, the 64-node Cube runs 62 times as fast as a single one of its processors, Fox reports.
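That figure implies the machine keeps its processors almost fully busy: a 62-fold speedup spread across 64 nodes is a parallel efficiency of about 97 percent. The arithmetic:

```python
speedup, nodes = 62, 64
efficiency = speedup / nodes   # fraction of ideal 64-fold speedup
print(f"{efficiency:.0%}")     # 97%
```

The few lost percent go to the time nodes spend exchanging edge values rather than computing, which is why keeping communication local matters so much in this design.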
A 1,000-node machine should allow geophysicists to model rock structure in three dimensions, aerodynamicists to study the performance of aircraft wings in much greater detail, and chemists to tackle much more complicated chemical reactions than is now possible, the physicist says. But to solve the proton problem completely, which is four-dimensional, will probably require a machine 10 times as large, he adds.
Meanwhile, Seitz is working on an attempt to shrink each node from an 8-by-14-inch board to a single chip.
''Just imagine, the power of a Cray in a cubic foot, running on a thousand watts of power!'' he exclaims, then adds soberly, ''Do you realize that this is the first time since the very early days that universities have been working at the cutting edge of computer design?''
And it's here that the work at Caltech may have importance for the US computer industry's competition with Japan. From the beginning of the computer age, the United States has held a monopoly on supercomputers, gigantic number crunchers that calculate several times as fast as conventional mainframe computers. Although only a few hundred such machines are sold each year, supercomputer development has helped the US keep its lead in computer technology. And some experts predict that demand for these ultimate calculators in industries such as aerospace, electronics, and oil is about to explode.
But leading Japanese computer makers are mounting an all-out challenge to the US lead. In the next few months, Hitachi Ltd. and Fujitsu Ltd. say, they will begin delivery of new supercomputers that rival current US machines. The openly avowed Japanese goal has caught the attention of the US government, which is spending about $100 million for supercomputer development. The Caltech effort, funded by the Department of Energy, is one beneficiary of this heightened sensitivity.