If momentous events have happened in the censorware area since last October, I must have missed them. Libraries struggle with CIPA decisions that must be made by this July. A couple of recent entries into the “filtering” arena claim to be more library-friendly than the traditional operations—not named here because I lack any first-hand experience.
ALA’s website has a good set of “useful sources” on CIPA, prepared by Nancy Kranich and posted September 5, 2003 (quite possibly updated since then). I should be grateful that she includes my CIPA special—but she gets my name wrong (my professional name is consistently Walt Crawford, not Walter, and the longer form has never appeared in association with this publication). That grump aside, it’s a long and useful set of links.
ALA also has a page of “Questions and answers on Children’s Internet Protection (CIPA) legislation,” including the fairly recent “Questions and answers on filter disabling under CIPA,” posted December 3, 2003 and prepared by Thomas M. Susman of Ropes & Gray LLP. (The page also includes a November set of “CIPA questions and answers arising under the LSTA” and links to a PDF Q&A revised shortly after the Supreme Court decision.) Susman offers two scenarios for appropriate disabling, with analytical commentary on each:
Ø The first scenario has all PCs filtered, but with an on-screen option that asks adult patrons whether they want filtered or unfiltered access, with a warning that by requesting unfiltered access the adult agrees to use the Web for legitimate purposes—and with a signed Acceptable Use Policy from the patron stating they want unfiltered access. Clicking on the option is all it takes to disable the filter. Does this scenario comply?
Susman asserts that it does, based on any common sense reading of CIPA and the Supreme Court decision, although a stickler could assert that direct disabling by an administrator is required. Assuming that safeguards are in place to assure that only adults use this option, this scenario should be OK. (That’s a relief, since this is the scenario I suggested last summer.)
Ø One bank of PCs in the library has filters present but not active. The PCs are labeled for use by adults only, and library staff monitors use on a regular basis. Does this scenario comply?
Surprisingly, Susman also argues that this option appears reasonable under CIPA, as long as sufficient safeguards are in place. It’s worth noting that Susman is saying that both options appear reasonable. He is not offering a legal opinion or a recommendation. For some reason, I find the second option a little dicey—but I’m pleased that an actual lawyer is more adventurous.
Derek Hansen (University of Michigan) wrote “CIPA: Which filtering software to use?” posted September 9, 2003 at WebJunction (www.webjunction.org). He notes some key variables when selecting software and that most programs are flexible—although few (if any, at least at the time) include categories specific enough for CIPA. To some extent, Hansen (who participated in the Kaiser study) seems to interpret situations in the most favorable light for censorware makers—for example, his point that even a large percentage of overblocking will still mean most patrons won’t encounter an erroneous block. I have trouble with—and disagree with—Hansen’s sunny conclusion:
Filters are a bit like children. They come in all shapes and sizes. They don’t always do what they are told, although they generally get it right. They are at their best when they are taught to use all of their capabilities. And at times they require some discipline. In short, they’ll never be perfect, but they can be influenced to reach their potential.
The Free Expression Policy Project (www.fepproject.org) offers a “Fact sheet on internet filters” that’s a good deal less upbeat about censorware. The version I most recently downloaded is dated September 26, 2003; there may have been changes. This 8-page listing (which includes almost three pages of footnotes) includes a good brief history of censorware, how filters operate, and specific notes on CIPA and the Supreme Court decision. It’s fair to say FEPP’s six-point summary of “the major problems with internet filters” doesn’t quite match Hansen’s conclusion. Briefly, FEPP says filters operate as prior restraints on expression, reflect a reductive view of human expression, set up barriers and taboos rather than educating youth, frustrate and restrict research in many areas, replace professional judgment with secret decisions made by private companies, and exacerbate the “digital divide” by restricting access for students who don’t have home internet access.
There’s a library filtering table/spreadsheet at filtering.galecia.com that’s worth a look. I assume the author is Lori Bowen Ayre, since her weblog (noted below) resides here. The table offers information on eight different products. I printed off some pages (with difficulty) on October 29, 2003; as with everything else, contents may have changed. Warning: If you’re easily offended, example URLs in the table may set you off, specifically in the IF-2K column. I suspect the websites are a lot worse than the URLs, but those are pretty bad.
“Preserving the freedom to read in an era of internet filtering: Principles for the implementation of CIPA-mandated filtering in public libraries” is two pages long (one printed sheet) with some good advice. Fourteen principles are grouped in four categories:
Ø Tailored blocking: Blocking should be limited to CIPA-specific categories, a range of library-tailored software should be available, certain broad categories of content should be exempted altogether, and libraries should be able to add “white lists” (do-not-block sites) based on community needs.
Ø Right of adult users to avoid filtering: Adults should be able to have filters disabled “anonymously and without explanation”; libraries should provide clear, conspicuous information about disabling; adult users should be given access to an unfiltered computer without explanation and should be able to have the filter disabled at any time; and adults should “have a means to obtain unfiltered access that persists for a period of time, such as a month or a year.”
Ø Transparency: Information about blocking should be available to users and communities—categories, lists of blocked sites, possible adjustments—and any blocking should be plainly indicated at the point of blocking.
Ø Privacy and anonymity: Users should be able to use the internet anonymously; sites visited should not be recorded by filtering software; and requests to unblock should not be recorded in a way that can be linked to the user.
“Public libraries and the Children’s Internet Protection Act (CIPA): Legal sources” was published January 19, 2004. This four-page list is divided into primary sources (CIPA itself, the decision, and related documents), secondary sources (a variety of commentaries including some cited here in previous issues and several I haven’t seen—and yes, Minow includes the CIPA Special), and two sources on related state laws. A solid list of resources, with hotlinks where feasible—and it’s worth noting that Minow includes Janet LaRue’s odd argument that the Supreme Court did not mandate unconditional disabling for adults.
“Why filters won’t protect children or adults,” Library Administration & Management 18:1 (Winter 2004): 14-18, is also available at ala.org. The article discusses the usual problems with filters and offers some recommendations. I believe this discussion overstates the case against censorware—something I never thought I’d say. Consider these sentences:
Only about 1.5 percent of Internet sites are considered pornographic, and of those, the best filters block about 75 percent when set at the highest levels. At the same time, filters block at least 20 percent of the three billion benign Web sites—a whopping 600 million-plus sites.
I find those numbers unbelievable. I haven’t seen all the studies of censorware. Perhaps there is one that finds that the best program really does block only 75% of pornographic sites at its strictest settings, but that’s far below the effectiveness I’ve usually seen. And an overblocking rate amounting to 20% of all “benign” web sites, stated as a minimum, seems way out of line with the studies I have seen. The Kaiser study, for what it’s worth, found an average overblocking of 1.4% of health sites (but 10% of the more controversial sites) at censorware’s least restrictive settings, combined with 13% underblocking at those settings (that is, blocking 87% of porn sites). Those numbers are averages; some software did better.
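The gap between the two sets of claims is easy to quantify. Here is a quick arithmetic sketch; all input figures come from the passages above, and applying Kaiser’s 1.4% health-site overblocking rate to all “benign” sites is my own extrapolation for comparison, not a study result:

```python
# Comparing the LA&M article's overblocking claim with a Kaiser-rate
# estimate. All inputs are the figures quoted above, not new data.

benign_sites = 3_000_000_000  # "three billion benign Web sites"

# LA&M article: "at least 20 percent" of benign sites blocked
lam_overblocked = int(benign_sites * 0.20)
print(f"LA&M claim: {lam_overblocked:,} sites blocked")  # 600,000,000

# Kaiser average (1.4% of health sites, least restrictive settings),
# extrapolated -- loosely -- to all benign sites
kaiser_overblocked = int(benign_sites * 0.014)
print(f"Kaiser-rate estimate: {kaiser_overblocked:,} sites")  # 42,000,000

# The two figures differ by more than an order of magnitude
print(f"Ratio: {lam_overblocked / kaiser_overblocked:.1f}x")  # 14.3x
```

The point of the sketch is simply that “at least 20 percent” and “1.4% on average” cannot both describe typical censorware behavior; the claims differ by a factor of roughly fourteen.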
There’s also the gotcha: CIPA only requires blocking images and does not restrict pornographic sites (which are constitutionally protected speech for adults) in general. Whether or not any filtering programs tested in previous studies could be set to block images and pass text, such programs are now available to libraries (as are programs where customers may obtain unencrypted lists of blocked sites and programs that attempt to establish categories directly related to CIPA). Here and elsewhere, this discussion offers too many generalizations that are falsifiable at the moment.
Karen Schneider notes another issue: Kranich does not address the problem of ALA’s age-neutral policy. “I disagree with this policy both strategically and philosophically, and I believe it is this issue that truly divides the ALA governing bodies from the ALA membership and the public at large.” Karen goes on to say:
When Kranich isn’t attempting to argue for ALA’s age-neutral policy, she does an excellent job of underscoring something I have said since 1996: filters don’t work. Most adults don’t need them; no one, hearing how filters actually function, really wants to be filtered. (Some people want others to be filtered, but that’s a natural human tendency.) Most adults behave responsibly in libraries, and those that don’t should be dealt with through policy and procedure.
Filters don’t work. I agree. I don’t believe censorware can work in a manner appropriate for adults and teenagers in a library setting—that is, block 100% of legally inappropriate material (somehow dealing with the absurdity of treating youth aged 12 to 16 as though they’re children) while passing 100% of legal material. I doubt very much that censorware can even handle access by children (which I’d probably define as kids under 10, and that may be too broad) in a sufficiently sensitive manner. The problems are real. There should be no need to overstate the failures of censorware; they’re sufficient as they are.
Yes, it’s a dumb heading, but I don’t know how else to cluster these weblog entries and other relatively brief and usually informal items.
A November 12, 2003 posting, “Filtering: The low-down truth,” clearly reflects a lot of thinking on Karen’s part. Since the CIPA decision, she’s been asked to write, present, and help libraries make choices—and she’s decided to turn down those requests. “My best advice hasn’t changed in seven years. Filters are bad news.” She feels—correctly, I believe—that offering to advise on the best filter implementation would imply an endorsement of the concept that filtering is a good thing. It’s not. Karen is no more an absolutist on access than I am, as an American Libraries piece made clear, but she’s studied censorware long enough and carefully enough to understand that filters almost inherently block access to constitutionally protected speech.
Two days later, she posted “Educating CIPA,” a “top-ten list about CIPA and filtering.” Go read the post (frl.bluehighways.com/frlarchives/000108.html) if you’re vague on the basics: She distills a lot of information into ten brief paragraphs. It is worth noting, as Jay Currie does in a comment on the post, that point 6 (“Filters hide blocked sites in encrypted lists…”) may not be universally true: IF-2K will supply unencrypted lists to customers. The list begins and ends with crucial points: “1. Filters block Constitutionally-protected speech. This is a fact not disputed in the CIPA decision.” “10. It may seem that every library in the world is filtering, but that’s not the case at all. Many libraries have chosen not to filter… We don’t hear about those libraries because staying low-profile is a strategy.”
“No strings on me: Librarians fight filters” appeared in the November 23 issue and discusses at least one library that is not taking such a low profile. San Jose Public Libraries is ready to give up $20,000 in e-rate subsidies rather than block access—a stand supported by the city council. (As noted, $20K is only about 0.01% of San Jose’s budget.) At the other extreme, Bob Watson (Franklin Park, IL) says that library would filter even without CIPA’s mandate after witnessing the Minneapolis “hostile workplace” battle. Which raises an interesting issue for Franklin Park: Unless those filters can be defeated at any adult patron’s request, which restores the so-called hostile workplace, isn’t there reason to believe that a patron could mount a suit against the library?
Ayre’s contribution, “Breaking the law to comply with CIPA,” is a full-page blog posting that originated no later than January 18, 2004. “I woke up realizing that there is no way to strictly comply with CIPA without breaking the law.” Why? Because a true CIPA filter would block only visual material that’s either child pornography, obscene, or harmful to minors. She considered creating such a list for squidGuard users, which would make a true CIPA filter feasible. “Then the voice of Mary Minow entered my brain saying ‘It is illegal to view—even for research purposes—child pornography, in any form.’” So if she compiled such a list, she could be arrested for doing so.
First Catch-22: Libraries and schools attempting to compile lists of illegal material are violating the law. Second Catch-22: Commercial products also block constitutionally protected speech, putting the library at risk of First Amendment suits. “We really have no option to create a true CIPA block list. We are forbidden by law from compiling it.” Similarly, any commercial company could be arrested (or, rather, its officers could be—the “company as person” oddity in American law only seems to work to the benefit of corporations) for viewing child pornography. “Therefore, it is impossible to strictly comply with CIPA without breaking the law. Wouldn’t that be the definition of bad law?”
As you’ve already read here, Finkelstein has given up his censorware/DMCA research for a variety of reasons. Meanwhile, he continues to comment.
On October 29, he posted a message to Web4Lib and elsewhere (originating with his Infothought mailing list) celebrating the renewal of the DMCA exemption for studying censorware. “The Register of Copyrights has attributed that exemption primarily to me!” That’s true. To quote from the Register’s recommendation: “The Register’s recommendation in favor of this exemption is based primarily on the evidence introduced in the comments and testimony by one person, Seth Finkelstein, a non-lawyer participating on his own behalf.” There’s more, but that’s the key.
A bit earlier on the Infothought weblog, Finkelstein noted the acquisition of N2H2 by Secure Computing, makers of SmartFilter, as “the end of an era (in many ways).” N2H2 and David Burt have long been subjects of Finkelstein’s attention; he documented the pathetic financial state of the company in recent years. The personal significance of the acquisition is that “there’s even more money and resources available for a lawsuit against me.”
On January 14, 2004, Finkelstein commented on CDT’s set of principles (noted above), calling it “mostly a long series of wishful thinking and unrealistic assertions.” It’s hard to argue with his comment on the first proposed principle, “Blocking should be limited to the categories of adult content specifically set out in the CIPA statute”:
Well, I should be granted a million dollars, an A-list blog, and a professorship. It’s not going to happen. The categories set out in the CIPA law are legal categories such as obscenity. No censorware company creates such a minimal blacklist, because these are complex legal determinations.
Going back a month, while Infothought may never be an A-list blog (for what that’s worth), Finkelstein did get a nice interview at GrepLaw.org from Harvard’s Berkman Center (posted December 16, 2003). My printed version runs 22 pages. That’s large type in a relatively narrow column, but this is a long interview. It’s also well worth reading, particularly if your view of Finkelstein is limited to my comments or is colored by either slashdot or the nastiness on Larry Lessig’s weblog when Finkelstein said something impolitic (and, in my view, absolutely correct) about somebody who’s more of a Big Name than he is. Finkelstein’s thoughtful, clear answers to some difficult questions say a lot about who he really is—and, of course, there are loads of links if you want to investigate further. I won’t summarize because that really wouldn’t work in this case—the detailed discussions stand on their own.
Do you believe there are more than 260 million “pornographic web pages”? And that this is up 1,800 percent from 1998? I don’t, but that’s what N2H2 said in a September 30 press release. “In the month of July alone, N2H2 identified web sites comprising over 28 million pages for its filtering database, and N2H2 now has identified over 260 million pages classified as pornography.”
The release includes the kind of quote you’d expect from the director of the “Center for On-Line Addiction”: “Pornography is becoming so pervasive on the Internet that it is now difficult to avoid unwanted exposure, and this makes cybersex addiction more likely, which can lead to a multitude of legal issues for organizations. There is a definite need for tools like filtering software to offer protection for those who want, or are required, to avoid this type of illicit material.” N2H2 is there to help—or was, before what was left of it was purchased.
By now, faithful readers should know that most “pornography” is not “illicit” and even the least computer-literate readers should be able to determine the likelihood that a company with a total of 70 employees or fewer was able to confirm the pornographic nature of 28 million pages during one month. If every employee did nothing other than look at web pages every minute of every day of 40-hour weeks, with no managers, salespeople, or support staff, that comes out to more than 2,270 pages per person per hour, or more than one every two seconds. But then, consider the other proof: “A search in Google on the word ‘porn’ returned over 80 million pages.” Including, to be sure, quite a few issues of Cites & Insights, every commentary on CIPA, and millions of other pages that are not in the slightest bit pornographic, including N2H2’s press release (unless you consider absurd commercial claims to be a form of pornography).
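The back-of-the-envelope arithmetic behind that claim can be reproduced directly. The 70-employee figure and the 28 million pages come from the text above; the 4.33-week working month is my own assumption:

```python
# Sanity-checking N2H2's claim of classifying 28 million pages
# in one month with at most 70 employees.
pages = 28_000_000
employees = 70
hours_per_month = 40 * (52 / 12)  # 40-hour weeks, ~173 hours/month

pages_per_person = pages / employees          # 400,000 pages each
rate = pages_per_person / hours_per_month     # pages per person per hour
seconds_per_page = 3600 / rate

print(f"{rate:,.0f} pages per person per hour")   # 2,308
print(f"one page every {seconds_per_page:.1f} seconds")  # 1.6
```

Even under these absurdly generous assumptions (no breaks, no other duties, no support staff), each employee would need to review a page roughly every second and a half, around the clock during working hours, for an entire month.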
Callister, T.A., Jr., and Nicholas C. Burbules, “Just give it to me straight: A case against filtering the Internet.” Downloaded September 15, 2003, from faculty.ed.uiuc.edu/burbules/ncb/papers/straight.html
Burbules is at the University of Illinois Urbana-Champaign; Callister is at Whitman College. This 16-page (double spaced) article argues that, while the courts have decided schools and libraries can be required to filter access, “with very few exceptions, they should not.” Both writers are professors of education and fathers of young children; both formerly worked as teachers (preschool and elementary school). Naturally, both “want our children to have the educational benefits of the Internet, but to be protected from what is harmful or dangerous, and this is what filters promise to do.”
This is censorware from a different perspective than either libraries or “family” groups. The authors “say up front that parents have every right to impose restrictions on what their own children view or do on the Internet at home, just as they have the right to limit what their children watch on television”—but go on to say that schools and libraries should expose students “to a broader horizon of ideas, experiences, and points of view.” That horizon may also be subject to restrictions—but not the way filters work:
They are indiscriminate, often arbitrary, and they remove decisions about what is and is not filtered from the domain of public deliberation, placing it in the hands of automated procedures and criteria developed by invisible and unaccountable programmers who, for commercial reasons, have a fundamental interest in erring on the side of filtering out more rather than less. From the standpoint of public education, this inevitably leads to abuses and anti-educational effects.
The authors go on to point out why censorware doesn’t “filter” in the beneficial sense; why filters serve more to protect teachers, school administrators, and company profits than to protect students; and some of the issues with the meaning of “harmful.” A good brief discussion of how censorware works is followed by detailed discussions for “six reasons not to filter…Internet access in school”:
Ø Filtering is anti-educational—both because it prevents students from accessing important and relevant material and because it sends an implicit message about the importance of obedience and acquiescence.
Ø Filtering software does not work as it is advertised. Some wonderful examples follow.
Ø Filtering is censorship—and, the authors argue, the evils that censorware attempts to halt aren’t nearly as overwhelming as advocates claim. This is an excellent discussion, although pro-filtering groups will claim it’s wrong. “Our anecdotal experiences aside, according to [OCLC], ‘adult content’ exists on only an extremely small proportion of the Web… So, for those easily offended, a bit of free advice: If a link says, in large, flashing, red, capital letters: CLICK HERE TO SEE HOT TEEN SEX, and you don’t want to see hot teen sex—don’t click! The chances of a young person who is not looking for such material finding it accidentally is negligible.” [Emphasis added.] (As they noted, some kids are eager to look for explicit material—and they’ll find it, filter or no filter.)
Ø Filtering is deceptive, particularly when the filter doesn’t let the user know that a site has been censored or why.
Ø Filtering distorts, as it disrupts the relationships among ideas and beliefs.
Ø There are better solutions. “Attempting to restrict access to the wider Internet because a student might see a dirty picture is like closing libraries because some pervert once exposed himself in the stacks.”
Strongly recommended for a view from the “other CIPA community.”
Corn-Revere, Robert, “United States v. American Library Association: A missed opportunity for the Supreme Court to clarify application of First Amendment law to publicly funded expressive institutions,” Cato Supreme Court Review. (Downloaded in PDF form; I know it’s pages 105-130, but can’t say which volume.)
A brief note on this long article because I don’t know enough to provide intelligent comments. I certainly agree that the Supreme Court didn’t clarify much of anything about the First Amendment in their CIPA ruling. The issues discussed here are important. I suspect that Corn-Revere’s discussion is worth reading for some with a deeper knowledge of the law involved. I was charmed to see one particular comment:
The one remaining distinction [between this case and another in which the court struck down restrictions on speech], that public libraries—unlike legal aid lawyers—do not have a tradition or function of opposing the government simply is beside the point. It is not the mission of a public broadcasting station to oppose the government either, yet funding conditions designed to restrict editorial choice and content have been ruled unconstitutional.
I’d go a little further than that—and did, in the CIPA Special. Quoting that passage:
The declaration that public libraries have no such role is potentially chilling. I know of no good public library collection that does not include materials that challenge practices of the current administration—no matter which “current administration” you choose to name. I would argue that any healthy public library does challenge the Federal government within its active collection, almost by definition. Would it be legal for Congress to say that libraries receiving Federal funds must not collect books and other resources that take issue with the current administration?
Sobel, David L., “Internet filters and public libraries,” First Reports 4:2 (October 2003). A First Amendment Center publication; 17 pages (including two pages of notes). (www.firstamendmentcenter.org/PDF/Internetfilters.pdf)
Sobel is general counsel of EPIC, the Electronic Privacy Information Center. He was co-counsel on Reno v ACLU, the successful challenge to the Communications Decency Act. This paper offers a clear, brief historical background on censorware, the drive for mandated use, and challenges to those mandates. It’s a good overview covering more than the immediate situation; worth reading and worth recommending to others who don’t understand the issues.
All original material in this work is licensed under the Creative Commons Attribution-NonCommercial License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc/1.0 or send a letter to Creative Commons, 559 Nathan Abbott Way, Stanford, California 94305, USA.