NATURE'S mysteries are tricky enough without self-deception to mislead you. Yet even seasoned scientists sometimes fool themselves. They favor data that support their beliefs. They may jump to false conclusions, which they then defend tenaciously. In short, they can have as much trouble as laymen in fitting unwelcome phenomena into their personal world views.
Psychologist Michael J. Mahoney of the University of California (Santa Barbara) has experimented with these well-known peccadilloes of scientists. His findings take the gloss off the scientist's traditional image as a skeptical yet open-minded scholar.
As described in a recent report of Mahoney's work, three of these experiments highlight various aspects of this self-deception.
A set of three numbers -- 2,4,6 -- formed the basis of one test. Groups of psychologists, physical scientists, and Protestant ministers were asked to discover the rule by which the numbers were organized, playing a kind of guessing game. They were to think up other sets of numbers and ask the examiner whether these fit the rule.
The rule was simple -- rank three numbers in ascending order. The experiment was structured to show the thought processes by which different people searched for the right answer.
Mahoney found that the social and physical scientists jumped more quickly to a conclusion and clung to a wrong answer longer than did the ministers.
Another experiment involved 75 experts who referee research papers for publication in a social science journal. Mahoney's team sent them a fictitious paper. In some cases, the reported data and conclusions agreed with accepted theory. In other cases, they challenged that theory.
Mahoney found that the referees' beliefs biased their judgment. They praised the paper if it seemed to reinforce their own views, and criticized and rejected it when their beliefs were challenged.
Finally, the Santa Barbara team sent a different group of referees papers with two different sets of footnotes. In some cases, these footnotes referred to other research papers by the manuscript's author -- papers listed as ``in press,'' awaiting publication. In other cases, the same references were listed under other names.
The referees consistently gave higher ratings to the paper that seemed to be based on other work by its author -- work that had already been accepted for publication elsewhere. Because they thought another scientific journal had already recognized the author, the referees more readily accepted his work for their own journal.
``Recognition begets even further recognition,'' Mahoney observes, even when the work at issue may not deserve it.
Mahoney is interested in learning more about the psychology of scientists and how it influences their work. What he has done so far only begins to explore this important aspect of the development of scientific knowledge.
But it already emphasizes a trap into which scientists often fall when they speak out on public issues. The human tendency toward self-deception can lead them to distort scientific knowledge for the sake of making a political point. For example, suspected or partially proven environmental dangers can be overstated or played down to an extent that neither existing data nor present theory warrants.
Advocates of tough regulation of coal-burning power plants as a source of acid rain, for example, have no trouble finding experts to back them up ``scientifically.'' Their opponents, meanwhile, have expert allies who insist more research is needed to pin down acid rain's cause.
The heat-trapping gas, carbon dioxide (CO2), released when we burn coal and oil, may warm Earth and change its climate. Recently, some scientists studying this possibility have warned that its influence will be felt more quickly than had been expected. Serious warming might come within this century, they say. It's time, they warn, to begin restricting our use of coal and oil.
Yet the computer simulations these scientists use are uncertain and loaded with assumptions. The data behind them are inadequate. Other scientists working with the same theories and data say much more research is needed to elucidate any climatic danger.
And so it goes. Where political, religious, or ideological issues are involved, scientists are as tempted as anyone to shade the facts to fit their position. Which is what you might expect. But, as Mahoney's work illustrates, scientists can also be unconsciously misleading even when they try to be unbiased.
So beware of scientists whose ``findings'' owe more to faith than to fact, especially when they espouse a social or political cause. They may only be fooling themselves.
Robert C. Cowen is the Monitor's natural science editor.