This fall, Jeffrey Pollack found himself caught in the middle of the fierce national debate about how to protect children in the uncharted, and sometimes dangerous, world of cyberspace. He ended up coming down on what he once thought was the wrong side.
The conservative congressional candidate from Oregon had advocated mandatory filters on the Internet in schools and libraries. It made common sense. You wouldn't allow a copy of "Penthouse" in the school library. Why let in cyberporn?
Then he heard that his own Web site had been accidentally blocked by Cyber Patrol, one of the nation's largest filtering companies. That prompted him to do some research.
"I found out how much power over our free speech we're giving the people at, say, Cyber Patrol, to make these God-like decisions over what gets blocked and what doesn't," says Mr. Pollack, who lost his bid for Congress. "I've now rewritten my issue statement to say that parents should step up to the plate and accept responsibility."
Pollack's experience is indicative of the challenge the country now faces as it tries to create public policy in a digital era, using 20th-century common sense and logic. Digital technology has transformed the way information is disseminated and gathered.
That's created a new set of quandaries that are playing out from the halls of Congress, where legislation requiring mandatory filtering is pending, to classrooms and homes, where parents and teachers often have to cope with kids who are far more technically adept than they are.
"We can't frame [our Internet policy] in quite the same way we would with other materials - like 'Penthouse' - where the simple act of making choices prevents the interaction," says Paul LeBlanc, president of Marlboro College in Vermont and an Internet policy expert. "More important is what we are teaching kids about their contact with inappropriate content on the Web, because whether it's at school, at home, or at a friend's house, it's going to happen."
Advocates of mandatory filtering, like Sen. John McCain (R) of Arizona, agree that parents are the first line of defense. But he insists they need all the help they can get. And while filtering software may not be perfect, with more than 40,000 porn sites on the Web that can be accessed by searching such innocuous words as "Barbie" or "girl," Mr. McCain says it's better than nothing.
"As we wire America's children to the Internet, we are inviting these dirt bags to prey upon our children in every classroom and library in America," he says. "Parents, taxpayers, deserve to have a realistic faith that this trust will not be betrayed."
Such sentiments fueled earlier efforts to enact laws that would protect children online. But each one has run smack into First Amendment free-speech concerns.
The Communications Decency Act of 1996, which prohibited posting indecent material on the Net, was ruled unconstitutional by the US Supreme Court. The Child Online Protection Act (COPA) of 1998, which banned posting content that is "harmful to minors," has been set aside, pending a court challenge.
McCain hopes that by limiting the scope to schools and libraries and tying the filtering requirement to the acceptance of federal funds, he can overcome the constitutional concerns.
But the opponents of mandatory filtering, who include the American Library Association and the American Civil Liberties Union (ACLU), are already poised to take this new proposal to court if it's passed. They argue that Internet-filtering software is still so primitive that it often overblocks, filtering out legitimate sites, or underblocks, letting in some it shouldn't.
"There hasn't been a dramatic advance in the technology that stops it from overblocking," says Marvin Johnson of the ACLU.
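Keyword matching illustrates why both failure modes are hard to avoid. The following is a minimal sketch, not any vendor's actual algorithm; the word list and page texts are invented for illustration:

```python
# Toy keyword filter: block a page if its text contains any listed word.
# A naive word list both over-blocks (innocent pages caught) and
# under-blocks (objectionable pages that avoid the listed words slip through).
BANNED_WORDS = {"xxx", "porn", "breast"}

def is_blocked(page_text: str) -> bool:
    """Return True if any word in the page matches the banned list."""
    words = page_text.lower().split()
    return any(w in BANNED_WORDS for w in words)

# Over-blocking: a medical page is caught by the word "breast".
print(is_blocked("breast cancer screening guidelines"))  # True

# Under-blocking: an objectionable page using none of the listed words passes.
print(is_blocked("hot singles in your area"))  # False
```

Widening the word list shrinks the second kind of error while inflating the first, which is the trade-off critics of the technology point to.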
A site called Peacefire.org - which advertises "It's Not a Crime to Be Smarter than Your Parents" - drives home the point that filtering technology remains fallible at best. The site, which was first put up by some tech-savvy teenagers in response to the 1996 Internet legislation, gives detailed instructions on how to disable most filtering software.
It regularly analyzes the programs and last month released a study that found that Cyber Patrol and SurfWatch, another leading software filter, each had error rates higher than 80 percent. In other words, of the sites blocked, 80 percent turned out not to be pornographic or obscene.
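The figure Peacefire reports is a simple ratio over a sample of blocked sites. A sketch with hypothetical counts (the numbers below are invented for illustration, not from the study):

```python
# Error rate as Peacefire defined it: among the sites a filter blocked,
# the fraction that turned out not to be pornographic or obscene.
# Sample counts are hypothetical, for illustration only.
blocked_sites = 1000      # sites the filter blocked in a sample
wrongly_blocked = 820     # of those, sites with no objectionable content

error_rate = wrongly_blocked / blocked_sites
print(f"{error_rate:.0%}")  # 82%
```

Note this measures only over-blocking; sites that should have been blocked but weren't don't appear in the ratio at all.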
"The standards drawn up by blocking-software companies are pretty arbitrary," says Bennett Haselton, one of the site's founders.
This summer Mr. Haselton testified before the COPA commission, established by the 1998 legislation to determine the best way to protect children online. In its final report to Congress in October, the commission opted against recommending mandatory filtering and in favor of increasing education and industry efforts to improve child-protection technology.
And the marketplace appears to be reacting. An Elmira, N.Y., company called Exotrope has created an artificial-intelligence program that "scans web content on the fly for pornography."
But so far, the program has failed Peacefire's tests. The testers found that while the software did block pornography, it also blocked pictures of people's faces.
A Virginia company called CornerPost decided the critics were right: There's no way any blocking program can be completely accurate.
So it designed a filter called Chaperone 2000 that automatically alerts parents and teachers if a child tries to break it. The software then automatically updates the list of dangerous sites, and gives parents the option of editing it to their preferences.
So far, Peacefire hasn't put it to the test. But CornerPost's Cholene Espinoza is ready for the challenge. "The only thing they can do is make us better," she says.
(c) Copyright 2000. The Christian Science Publishing Society