Much has been written in the past couple of years regarding search engine companies' responsibilities associated with the delivery of content to the Internet community. Google in particular has been the focus of much of this attention owing to its domestic stand against a federal subpoena to deliver search records associated with underage access to pornographic materials (Mohammed, 2006) and its international decision to filter content delivered to Internet citizens in the People's Republic of China (McLaughlin, 2006).
Google's domestic battle with the Department of Justice has been hailed by some as a stand for privacy rights (Ingram, 2006), although Google in fact justified its refusal to comply with the subpoena by citing the need to protect trade secrets (Rosmarin, 2006; Baker, 2006); others criticized Google for standing in the way of an investigation into children and pornography (Sandoval, 2006). Google's decision to comply with the filtering requirements of the government of the People's Republic of China in order to gain access to the Chinese market is considered a justifiable business decision by some (Thompson, 2006) and ideological relativism by others (BBC News, 2006); both characterizations are probably correct.
More recently, Google was sued by a New York legislator who claimed that child pornography is an "obscenely profitable and integral part" of its business (Broache, 2006). Although the suit was later dropped (Broache, Politician Drops Child Porn Suit Against Google, 2006), child pornography remains a unique issue, both socially and legally, one that calls into question the policies and practices of Google and other search engine and Internet index companies, particularly given Google's demonstrated content-filtering capabilities in the Chinese Internet space.
Google, and most other similar entities, are currently passive and reactive in their policies and practices associated with child pornographic material. This is not good enough. Given the demonstrated effectiveness of current filtering technologies (the problems associated with Google's opt-in SafeSearch notwithstanding (McCullagh, 2004)) and the unique and unambiguous legal status of child pornography, it is quite possible that, by not taking an active and proactive role in filtering child pornography from all search results, Google is guilty of aiding and abetting child pornographers and pedophiles in the commission of illegal acts, and may in fact be guilty of possessing and distributing child pornography (it is well documented that Google's database contains child pornographic images (Foley, 2006)).
This paper examines current child pornography law, relates the practices of Google (and other search engines/indexes) to current law, makes a case for the position that Google's current policies and practices place Google in violation of federal law, identifies and discusses social and technical issues and solutions, and suggests policies and practices that would put Google in compliance as well as address social concerns regarding privacy and law enforcement.
Child pornography is defined in 18 U.S.C. 2256 as "any visual depiction, including any photograph, film, video, picture, or computer or computer-generated image or picture, whether made or produced by electronic, mechanical, or other means, of sexually explicit conduct, where ... such visual depiction involves the use of a minor...or appears to be, of a minor engaging in sexually explicit conduct (1)". Further, United States law makes the knowing distribution (by any means, including via computer networks), reproduction, receipt, sale, and/or possession of child pornography a crime (18 U.S.C. 2252).
Since the Internet is a primary conduit for the sale and distribution of child pornography (Foley, Technology and the Fight Against Child Porn, 2005; Wolak, 2005), 42 U.S.C. 13032 requires electronic communication services to report incidents of child pornographic activity to the National Center for Missing and Exploited Children as soon as they become "aware" of them. Google reactively reports such activity once it has been brought to its attention by concerned Internet citizens, but does not appear to look for such activity proactively. …