Your beliefs vs. the facts
Bias and self-deception are fierce foes of science. That's why evidence-based debate is so vital.
Twenty years ago, as a college freshman, I knew precisely what it meant to be scientifically literate. In fact, I held an objective measure in the palm of my hand, courtesy of E.D. Hirsch. His book, "Cultural Literacy: What Every American Needs to Know," was a bestselling paperback, and conveniently listed thousands of names, terms, and phrases with which every educated person – he informed us – should be familiar.
After plodding through the entire list over the course of an afternoon, I smugly discovered I could easily define each item of scientific vocabulary. Fuzziness about literary examples such as "Aeschylus" caused me no discomfort, but an inability to rigorously describe "aerobic respiration" in the biochemical sense (not the superficial, then-popular Jane Fonda sense) would have induced severe nerdish embarrassment.
The wrong kind of scientific literacy
Today I teach science and its history at an honors college and am naturally far less confident about how to measure scientific literacy. The students who enter our program possess not only the expected high SAT scores, but also perfect or near-perfect scores on a battery of Advanced Placement exams, particularly in the basic sciences.
A noticeable portion of those students also believe in the literal truth of certain ancient accounts of Earth's history that, to put it bluntly, directly contradict mountains of well-established data from geology, climatology, and biology. Without rehashing the ongoing culture wars surrounding this topic (and certainly without berating my own students), this serves as a useful place to begin tackling the notion of "scientific literacy."
We frequently hear the refrain that if America simply raised the level of its science courses, taught our children more subjects, or gave them more hands-on lab work, it could ensure a citizenry capable of understanding an increasingly complex world. They would then be prepared to make the difficult choices of the 21st century.
However, my incoming students' technical mastery already exceeds what even the most rosy-eyed optimist could realistically dream for America (or the globe) as a whole. In other words, even if a citizenry were to achieve an impressive degree of scientific literacy – construed as raw conceptual competence – it would still be entirely possible for those same citizens to routinely subordinate scientific evidence to their own deeply ingrained cultural suppositions.
More important, the phenomenon of "evidence blindness" is hardly restricted to inexperienced students, or even to ideological segments of the general population. To varying degrees, it can be found across the spectrum, including some very striking examples in the realm of professional science itself.
As noted last year in Seed magazine, leading disciplinary practitioners who feel threatened by unorthodox new findings will sometimes band together to suppress such information, with the explicit intention of blocking its appearance in scientific journals.
While these luminaries undoubtedly convince themselves they are merely upholding the integrity of their fields, the truth is that they (in quintessentially human fashion) are often more interested in preserving cherished beliefs than in encouraging potentially disruptive discoveries.
Over the past few decades, growing evidence from cognitive science has revealed significant limits on the ability of individuals to criticize their own viewpoints. Even the most analytically gifted and experienced among us are susceptible to bias and self-deception to an extent that we (fittingly enough) generally fail to appreciate.