When a cloud of debris floated westward from Chernobyl in the Soviet Union last week, it put a lot of things into grim but needed perspective. Nuclear peril does that: It sweeps aside less important things, forcing a reassessment of what really matters. In this case, it raises profound questions about two characteristics of modern society we sometimes take for granted. One is technology. The other is bureaucracy.

Technology. So far, 1986 has been a tough year for technological prowess. The explosion of the American space shuttle in January, followed by the loss of a Titan rocket last month and a Delta rocket last Saturday, has etched question marks on the future of the space mission, which depends on one of the world's most advanced technologies. With Chernobyl, another complex technology -- the generation of nuclear power -- has been challenged to the core.
What's really being challenged, however, is mankind's reliance on supertechnology -- one of the most pervasive idolatries of the age. On the surface, that may appear to be no more than a blind trust in things and objects, an age-old worship of materialism dressed up in modern clothes. It is that, certainly. But it's also more.
The complex systems of the modern world are far beyond the layman's ken. They are unlike the 19th century's ``modern'' mills and machines: Most of us, however well-read, cannot even begin to explain their most basic principles to our children. We depend, as never before, on experts. So does this mean that those principles are somehow perverse and anti-human? Not at all. It only means that they are complex. It means, too, that if the systems fail, it is not because some abstract thing called ``technology'' has failed. It is because the human resources behind the technology have failed.
Human resources can be organized in various ways. Little technological systems -- like the ones sprouting in America's Silicon Valley, Research Triangle, and other seedbeds of entrepreneurship -- seem to work fine with only a handful of experts. Such people need minimal organization. But systems designed to fling men and women into space or wring power from atoms are vast, complicated, and delicately balanced. They require hundreds of experts. And handling hundreds of people requires a very different sort of organization from handling a handful. That's what we call a bureaucracy.
Bureaucracy. What the Challenger inquiry is revealing -- and what a similar inquiry at Chernobyl would presumably uncover -- is that expertise is no better than the bureaucracy supporting it. If that bureaucracy is sufficiently riddled with turf disputes, information blockages, political pork-barreling, or managerial ineptitude, then no amount of expertise can hold either spacecraft or reactors together.
Unless we grasp this point -- that the real culprit here is bureaucracy and not technology -- we're liable to be led gravely astray. Playing Cassandra over every new technology, we'll tend to resist every new turn of progress and walk backwards into the future. The point is not to scrap the entire space program or deactivate every nuclear reactor. The point is to stop trusting blindly in experts and organizations.
In America, the entire operation of NASA is now getting a thorough grilling -- to see just how much of our trust it deserves. Moscow has a greater challenge. Nuclear clouds are no respecters of boundaries. So the Soviets face intense European pressure to prove that their entire nuclear power bureaucracy is trustworthy.
In the end, the question is not only, ``How do you design a safe supertechnology?'' but ``How do you design a safe superbureaucracy?'' That, simply put, is a central problem facing the world today. And it is exactly there that the differences between communism and the free world stand in stark contrast.
NASA, one of the free world's pre-eminent scientific bureaucracies, may have its flaws. But they'll be corrected -- or, if they can't be, the entire organization will be dismantled. That's how public opinion operates in the West: That's the value of a free press, free thought, and the free sharing of information wherever possible. That's probably the single best guarantee of technological safety we can have.
Can that happen at Chernobyl? The early signs -- especially the clampdown on information from the Soviet Union -- are hardly promising. Everything points to a bureaucracy fighting to save its own skin -- hoarding information, wishing away the problem, fending off any probe that could let in the light, trying to maintain the status quo.
The status quo failed at both Cape Canaveral and Chernobyl. How the two superpowers deal with their failures will show the world, in sharp detail, the fundamental differences between them.
A Monday column