Most people, thinking about contemporary trends, are in agreement on two points. First, we're in the midst of a ``knowledge explosion'' of awesome proportions. Second, we're seeing a significant decline in public and private morality. But try to link the two -- try questioning whether the expansion of information technology can coexist with sound morality, or whether the two are mutually exclusive -- and the argumentative fur begins to fly.
The notion of such a linkage is not new. The legendary Dr. Faustus, to slake his thirst for knowledge and the power it produced, sold his soul to the devil. His intellectual heirs -- mad scientists, as ingenious as they are inhumane -- abound in today's television dramas and popular novels. Meanwhile, the relation of ethics to technology continues to perplex a nation searching for ways to regulate research in biogenetics, medicine, nuclear power, and other fields.
On one side, of course, are those who insist that technology so enhances life that it allows us to do more good deeds than ever. On the other side are those who fear that an increasingly mechanistic view of man is so eroding our basic understanding that we no longer know what ``good deeds'' are. Given the distance between these two views, one suspects that the right questions about technology and morality are not even being asked.
So it's a pleasure to encounter the ponderings of Tufts University professor Daniel C. Dennett, one of a growing cadre of techno-humanists who think hard about both philosophy and technology. In the latest issue of Daedalus (which takes ``Art and Science'' as its overall subject), Professor Dennett squarely addresses what he calls ``the relationship between technology and morality.''
``I wish to consider the possibility,'' he writes, ``that information technology, which has been a great boon in the past, is today poised to ruin our lives -- unless we are able to think up some fairly radical departures from the traditions that have so far sustained us.''
As exhibit A, he raises the case of the highly complex computerized systems now being developed for organizing medical knowledge. These so-called ``expert systems'' -- if they prove workable -- would be fed with the answers to questions asked of the patient by the doctor. They would respond with a diagnosis.
But what effect would such systems have, asks Dennett, on one of America's most venerable symbols of virtue -- the kindly country doctor? Such doctors, he argues, typically ``know their patients well; their personal, intricate, involved knowledge stands them in good stead when they come to diagnose, treat, and advise the members of their communities.'' But since expert systems would be as close as a computer-to-telephone link -- and since even the most rural of doctors has a telephone -- it would be ``a gross dereliction of duty'' to avoid them.
And that, worries Dennett, could change things drastically. No longer will country doctors be able to maintain their old and often proven ways of working. In the face of newly accessible knowledge, there will be little demand for what he calls ``funky, hand-made medical care -- just like Grandma used to get.''
So the country doctors, says Dennett, will ``begin to sink into the role of mere go-betweens, living interfaces between patient and system.'' The art of diagnosis will be replaced by the science of following directions. And the rural doctor, whose insight was once so valuable, will risk becoming what Dennett describes as ``a health-care doorman.''
For Dennett, this is only an example of a much larger issue: the relation of knowledge to morality. He argues that our ancestors, like the country doctor, had few means for learning about ``non-local, non-immediate'' issues. As a result, ``they could plan and act with a clear conscience on the basis of a more limited, manageable stock of local knowledge,'' he says. ``They were thus capable of living lives of virtue -- of a virtue that depended on unavoidable ignorance.''
That's no longer the case. In a single crucial sentence, Dennett sums up the changes. ``Information technology,'' he argues, ``has multiplied our opportunities to know, and our traditional ethical doctrines overwhelm us by turning these opportunities into newfound obligations to know.''
One must quarrel, to be sure, with Dennett's modern version of the ``noble savage'' theme -- his presumption that virtue and ignorance go hand-in-hand. History has too many examples of ignoble savagery -- and of the disasters wrought by ignorance -- to let us equate righteousness with lack of knowledge.
But his central point raises some sobering questions. Have our ``traditional ethical doctrines'' elevated the pursuit of knowledge -- any kind of knowledge -- into an absolute, unquestioned good? Do we really feel that we have ``obligations to know'' whatever can be known?
And if so, are we caught in a classic Catch-22? Do these obligations translate, on the one hand, into a profound sense of guilt concerning all the papers unread, the newscasts unheard, the books unbought? Or, on the other, do we bow to the obligations and so fill ourselves with facts that the quiet, accumulated, intuitive wisdom -- the wisdom of the country doctor, as it were -- is obscured and finally lost? Can either choice -- the presence of guilt, or the absence of wisdom -- lead to the good and virtuous life?
Dennett proposes no answers. But he deserves praise for courageously raising questions which, in the heady and lucrative race to build ever-larger information systems, few have dared to bring up.