In Search of Shaky Statistics

In grade school, students learn that there are two kinds of numbers: positive and negative. In David Murray's Washington, there are also two kinds of numbers: useful and not useful.

Mr. Murray is the independent prosecutor of science journalism, a numerical watchdog whose Washington D.C.-based research organization, Statistical Assessment Service (STATS), churns out Vital STATS, a monthly newsletter that points up the errors in popular-science reportage and tries to educate journalists about research methods to avoid those mistakes.

Science, says Murray, research director for the organization, is the darling of policymakers, but for the wrong reasons. "It gets them off the hook," he says, allowing them to feign helplessness in the face of overwhelming data. Problem is, for every PhD there's an equal and opposite PhD, and journalists are ill-equipped to sniff out insufficient data or claims tainted with agenda.

Moreover, he says, the culture of science is generally more cautious than that of the headline-hungry media, where asteroids and Armageddon play better than wait-and-see ambiguity.

Since its genesis three years ago, STATS has seen industrial average-style growth. Murray's newsletter now reaches some 1,000 journalists, politicians, and policymakers. "We're now at the point where journalists call us" for help with stories, he says. "We're now more recognized."

Murray has no special background in statistics beyond the usual PhD regimen, but says the polymath training of anthropology gives him a stream of knowledge that's "a mile wide, if not an inch deep." Enough to know when a study smells funny.

The organization's first principle is that policy based on sound science is critical to a democratic society. In its absence, technogogues dictate public decisions through their monopoly (legitimate or not) on scientific literacy, rather than engaging in healthy debate over data. "The mystique of numeracy widens the eyes of policymakers, but at the same time, fewer and fewer people actually command it," Murray says. "That seems to me to be a threat to self-governance."

There's an element of protectionism in STATS's philosophy, too, a sense that science is fragile and needs defending. Forces like religious fundamentalism, political correctness, and bad math not only contribute to the public's confusion, but erode the value of science as a method of inquiry, Murray argues.

Despite receiving much of its money from two conservative foundations, Murray says STATS has no political ax to grind. The group receives no industry or government support, either.

All it takes to fall under STATS's sword is a misstep with numbers, a pseudoscientific statement, a definitive claim where a demurral should apply. The last is STATS's favorite target, and most of the newsletter is devoted to instances, either direct or indirect, where reporters present shaky data as if it were oak-solid.

"Most reporters are, let's face it, incompetent with statistics," says Boyce Rensberger, editor of The Washington Post's Horizon section and a subscriber to Vital STATS. "Many reporters and editors are easily hoodwinked by claims that sound scientific but are not very valid from a statistical or methodological point of view."

Some of Vital STATS's more recent bugbears include:

* The willingness of mainstream reporters to accept overly aggressive claims about the success of needle-exchange programs. The media helped "ratchet up" vague results, Murray says, until they became magically definitive.

* The confusion and nearly uniform lack of accurate reporting about the sperm-count controversy: whether counts are actually declining has yet to be determined, despite widespread reporting to the contrary based on incomplete studies.

At its best, says Murray, STATS serves as a second opinion for reporters before their articles go to print, an objective opinion on research whose source may not be so detached. While that's happening to a degree, and more so recently, Vital STATS continues to fill its columns with scientific illiteracy: bungled graphs, preposterous pie charts, laughably ignorant headlines, and articles based on shoddy research that could have been downplayed or even ignored.

"Sometimes," Murray says, "our best work is when a story doesn't appear."
