RESPONSES TO 10 ARGUMENTS AGAINST THE USE OF INTERNET FILTERS IN LIBRARIES
We have all heard about the dangers of Internet filtering software packages from the antifiltering activists. But even in a unanimous Supreme Court decision calling the Communications Decency Act unconstitutional (see p. 11, this issue), the justices left open the question of the appropriateness of filtering children's access in libraries. The antifiltering arguments are often full of distortions, half-truths, and poor logic. What follows are responses to the 10 most common arguments against filters.
Argument 1: All filters rely on keyword blocking. Filters block out all sites that contain words like "breast," thereby blocking out all information about breast cancer.
It's true that all filters have keyword blocking. But those who oppose filtering often leave out a key fact when they make this argument: The better filtering products allow you to turn the keyword blocking off. Every filtering library I know of turns off the keyword blocking and instead relies on site-selected blocking, which uses a preselected stop list of banned sites chosen by employees of the software company.
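The distinction can be sketched in a few lines of code. This is an illustrative toy only; the word list, stop list, and function names are hypothetical and do not describe any real product.

```python
# Toy filter contrasting keyword blocking (scan page text for banned
# words) with site-selected blocking (check the host against a
# preselected stop list). All names here are hypothetical examples.

BANNED_KEYWORDS = {"breast"}            # hypothetical keyword list
STOP_LIST = {"example-porn-site.test"}  # hypothetical vendor stop list

def is_blocked(host: str, page_text: str, keyword_blocking: bool = False) -> bool:
    """Return True if the page should be blocked under current settings."""
    if host in STOP_LIST:               # site-selected blocking
        return True
    if keyword_blocking:                # optional keyword blocking
        words = page_text.lower().split()
        return any(w in BANNED_KEYWORDS for w in words)
    return False

cancer_page = "information about breast cancer screening"
# Keyword blocking on: a breast-cancer page is wrongly blocked.
print(is_blocked("cancer-info.test", cancer_page, keyword_blocking=True))   # True
# Keyword blocking off, as filtering libraries configure it: it gets through.
print(is_blocked("cancer-info.test", cancer_page, keyword_blocking=False))  # False
# The stop list still blocks listed sites either way.
print(is_blocked("example-porn-site.test", "", keyword_blocking=False))     # True
```

The point of the sketch: turning off the keyword pass removes the "breast cancer" false positives while leaving the site-selected blocking intact.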
Argument 2: Filters routinely block sites relating to sex education, AIDS information, gay rights, and abortion.
It's true that most filters do block some of these things. But here again another key fact is left out: The better filters break down banned sites by subject. Cyber Patrol, for example, has about a dozen categories such as "controversial," "sex education," and "full nudity." When set up so that only pornographic sites are blocked, it does a good, though not perfect, job of blocking out only pornography. And the better filters, like Cyber Patrol, allow you to unblock a site that has been incorrectly blocked. The selection of blocked sites is not perfect, but it is rapidly improving.
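The category-plus-override scheme described above can be sketched as follows. This is a hypothetical illustration, not Cyber Patrol's actual design; the category names are borrowed from the article, but the sites and data structures are invented.

```python
# Toy sketch of category-based blocking with a local "unblock"
# override for misclassified sites. Sites and structures are
# hypothetical; only the category names come from the article.

SITE_CATEGORIES = {                       # vendor-assigned categories
    "example-porn-site.test": "full nudity",
    "example-sexed-site.test": "sex education",
}
BLOCKED_CATEGORIES = {"full nudity"}      # library chooses to block only pornography
LOCAL_UNBLOCKS = set()                    # sites the librarian has manually unblocked

def is_blocked(host: str) -> bool:
    if host in LOCAL_UNBLOCKS:            # the local override always wins
        return False
    return SITE_CATEGORIES.get(host) in BLOCKED_CATEGORIES

print(is_blocked("example-porn-site.test"))   # True: category is blocked
print(is_blocked("example-sexed-site.test"))  # False: "sex education" not blocked

LOCAL_UNBLOCKS.add("example-porn-site.test")  # librarian corrects a misrating
print(is_blocked("example-porn-site.test"))   # False after unblocking
```

The design choice to illustrate is that the library, not the vendor, decides which categories are enforced, and can override any individual rating.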
Argument 3: Filtering involves turning over the selection of materials to outsiders.
Yes, this is true, but it's nothing new to libraries. Librarians have relied on vendors to preselect for them for years. Buying books on an approval plan or 300 full-text magazines on a CD-ROM also involves letting a vendor do a certain amount of selection for the librarian. There's nothing wrong with that.
Argument 4: Filters rely on a secret stop list of banned sites that can't be viewed or changed.
This is a complicated topic since the different filters handle this differently. Net Nanny, for example, allows you to view the stop list and to change it. Cyber Patrol allows you to change the stop list but not to view it. Vendors' reluctance to publish their stop lists is understandable: The stop list is what really makes the software valuable, and a vendor is naturally reluctant to give this information away to competitors. It is preferable that a filter's stop list be viewable, but a stop list that is editable and reasonably accurate is acceptable as well.
Argument 5: By filtering, a library can be viewed as a "publisher" of a Web site and could then be held responsible for its content. Prodigy was found liable for this reason in a court case.
True, Prodigy was found liable in a court case because it exercised some editorial control over the material it "published." The antifiltering activists love to cite this example, but they somehow always forget to mention the Cox-Wyden amendment to the Communications Act. The Cox-Wyden amendment establishes protection for "Good Samaritan" blocking and screening of offensive material. Specifically, the bill establishes that no provider or user of interactive computer services will be held liable for: 1) actions taken voluntarily in good faith to restrict access to material that the provider or user considers to be obscene, lascivious, filthy, excessively violent, harassing, or otherwise objectionable; or 2) actions taken to provide the technical means to restrict access to such material. …