During the "Green Revolution" in 2009, the Iranian military posted photos from the protests on a website and invited citizens to identify twenty individuals whose faces were singled out in those photos. (1) It claimed to have arrested at least two of the individuals in the photos shortly after the protests. (2) According to some sources, the Iranian government tried to use face recognition technology to identify protesters, though its technology was still under development. (3) Imagine if the government could simply match these faces against the hundreds of billions of photos available on Facebook. The matches could reveal not only the protesters' names, (4) but also their whereabouts, their contacts, their online conversations with other protesters, and potentially their future plans.
Faces are particularly good for identification purposes because they are distinctive and, in most cases, publicly visible. Other personal features that are in plain sight--like coats and haircuts--can easily be replaced, but significantly altering a face to make it unrecognizable is difficult. And yet most people can remain anonymous, even in public, because they have only a limited set of acquaintances who can recognize them. The use of face recognition technology in social networks shifts this paradigm. It can connect an otherwise anonymous face not only to a name--of which there can be several--but also to all the information in a social network profile.
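The mechanism behind this paradigm shift can be illustrated with a minimal sketch. Face recognition systems of this kind typically reduce each face to a numeric "faceprint" (an embedding vector) and identify an unknown face by measuring its similarity to faceprints already linked to named profiles. The names, vectors, and threshold below are hypothetical placeholders for illustration only, not any actual network's data or algorithm:

```python
import math

def cosine_similarity(a, b):
    # Measures how closely two face embeddings point in the same direction.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def identify(unknown_face, tagged_profiles, threshold=0.9):
    # Compare an unknown face's embedding against embeddings already
    # linked to named profiles; return the best match above the threshold.
    best_name, best_score = None, threshold
    for name, embedding in tagged_profiles.items():
        score = cosine_similarity(unknown_face, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Hypothetical faceprints (real systems use vectors of 100+ dimensions).
profiles = {"Alice": [0.9, 0.1, 0.4], "Bob": [0.1, 0.8, 0.5]}
print(identify([0.88, 0.12, 0.41], profiles))  # prints "Alice"
```

The key point for the privacy analysis is that the matching step requires a database of embeddings already tied to identities; a social network's trove of tagged photos supplies exactly that, which is what turns an ordinary snapshot into personally identifying biometric data.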
Given the risks of face recognition technology when combined with the vast amount of personal information aggregated in social networks, this Article presents two central ideas. First, applying Professor Helen Nissenbaum's theory of contextual integrity, (5) I argue that face recognition technology in social networks needs to be carefully regulated because it transforms the information that users share (e.g., it transforms a simple photo into biometric data that automatically identifies users) and provides this personally identifying information to new recipients beyond the user's control. Second, I identify the deficiencies in the current law and argue that law alone cannot solve this problem. A blanket prohibition on automatic face recognition in social networks would stifle the development of these technologies, which are useful in their own right. At the same time, our traditional privacy framework of notice and consent cannot protect users who do not understand the automatic face recognition process and recklessly continue sharing their personal information due to strong network effects. Instead, I propose a multifaceted solution aimed at lowering the costs of switching between social networks and providing users with better information about how their data is used. (6) My argument is that once users are truly free to switch networks, they will be able to exercise their choice to demand that social networks respect their privacy expectations.
In Part II, this Article begins with a general overview of face recognition technology and how it is implemented on Facebook. Part III of the Article uses the theory of contextual integrity to examine how social networks may violate user privacy when they apply face recognition technology to user photos. It also explains why more traditional privacy theories--epitomized by Warren and Brandeis' right to be let alone--cannot address this problem because they are mostly concerned with the privacy of physical spaces and confidential information. Having identified how face recognition technology violates privacy in this context, Part III explains why a complete prohibition of face recognition technology or related data processing could prevent the development of useful technologies. This sets the stage for my multifaceted proposal. Part IV reviews current laws that could potentially apply to this problem and concludes that they do not offer sufficient privacy protection. Finally, Part V outlines a combination of legal, architectural, market, and norm-driven solutions that I believe could offer adequate privacy protection without unduly stifling innovation. …