Nanotechnology, Privacy and Shifting Social Conventions
MacDonald, Chris, Health Law Review
Nanotechnology promises (or perhaps threatens) to change the way we live. Like other novel technologies, nanotechnology will allow us to do new things, and so will present us with new choices. Importantly, nanotechnology may also influence the very values according to which we will make those new choices. In general, new technologies--even radically new ones--evolve within a more or less stable framework of conventional values, and the apparent novelty of any given technology doesn't automatically warrant skepticism about those values. So new technology doesn't warrant radically new approaches to ethics. (1) But nonetheless, all technologies--and especially paradigm-bending technologies like nanotechnology--have the ability to shape our values. This warrants careful thought.
The nanotechnological application to be explored in this paper is surveillance technology, and the specific values to be discussed are values related to privacy. Privacy, according to Lessig, is to be understood as an ideal that stands in competition with the practices of monitoring and searching. (2) That is, the less one's life is monitored, and the less one's life is subject to being searched, the more privacy one has.
A number of technologies being developed, or envisioned, within the broad category of nanotechnology have significant implications for the extent to which individuals are subject to monitoring and search. Technologies currently being developed or refined, including "smart dust" (3) and RFID (Radio Frequency Identification) tags, (4) are already posing challenges to privacy, to say nothing of the challenges that would be posed if we one day see inexpensive video cameras "with the size and aerodynamic characteristics of a mosquito." (5) Further, one of the less controversial predictions about nanotechnology is that it will lead to important breakthroughs in computer technology, breakthroughs that will help computer manufacturers push past what is otherwise expected to be the end of the current yearly increases in computing power. (6)
Nanotechnology thus means the potential for significantly increased processing power--the kind of processing power that would make it feasible for individuals, corporations, and governments to process the massive quantities of data that can already be gathered by traditional surveillance equipment such as security cameras. As things stand, we have a certain degree of privacy even when in front of a surveillance camera; without powerful biometric software and databases capable of storing and comparing the face in front of Bank Camera A with the face in front of Airport Camera Z, a face is just a face. So if nanotechnology makes good on the promise of significantly improved computational capacity, this too will have an effect on privacy.
Such, then, is the description of the possibilities inherent in nanotechnology for altering the availability of privacy. If our evaluation of the ethical dimension of this aspect of nanotechnology is to proceed in a way that depends less upon invocations of gut reactions than have most debates in biotechnology, we need to bring to bear some theoretical tools.
2. Theoretical Framework: Ethical Conventionalism
Theorists should make clear the theoretical underpinnings of their conclusions, if they wish to avoid giving the impression that they are merely moralizing. Thus I will next make explicit, and take some time to explain, one simple theoretical framework that may help us better understand the shifts in privacy-related values that may accompany the coming of nanotechnology.
My theoretical framework is "ethical conventionalism," or the view that ethical values, standards and principles should be understood in terms of social conventions. According to this view, ethics is about informal, tacit social "agreements" to act in certain ways, agreements that typically evolve in response to particular characteristics of our environment. Technically, social conventions may be understood as regularities of behaviour, maintained through a shared interest in coordination and an expectation that others will do their part. (7)
The term "social convention" includes both strictly coordinative conventions and conventions for behaviour in more complicated situations. Purely coordinative conventions include ones that establish shared answers to questions such as, "Which side of the road shall we drive on?" and "Which hand shall we shake with?" and "What word will we use to designate that kind of plant there?" In such cases, we typically don't care what answer is chosen, so long as we all choose the same answer: it really doesn't matter whether we drive on the left or on the right, so long as we all do the same thing.
The term "social conventions" importantly also includes conventions that establish appropriate behaviour in situations in which our motives are likely to be more complicated. The conventions of war (including the Geneva Conventions) are a good example. All countries want to win when forced to go to war. That is one motive. But countries have also recognized that, regardless of who wins or loses, there are more and less civilized ways to conduct a war. (8) They thus would all prefer that their civilians, hospitals and medical personnel be exempt from attack, and that their soldiers, if taken prisoner, be treated humanely. This constitutes a second motive. It is important to distinguish between these two motives, not as a matter of accounting, but because they point in different directions: the desire to win points to a no-holds-barred strategy, and the desire for a 'civilized' war points to a more limited and constrained form of warfare. Tragically, the more brutal motive will often win out, absent some well-established convention requiring something different. Ignatieff points out that although violations of the Geneva Conventions are not unknown, the power of the Conventions can be seen in the persistence of the perceived need to provide special justification for such violations when they occur. (9) A more mundane example of a convention governing a "mixed motive" situation can be observed the next time you walk into a bank. Each customer prefers to get to a teller as soon as possible, but each is also motivated by an interest in finding an orderly way to adjudicate between the conflicting interests of customers as a group. This partial conflict is mediated, in many countries at least, by a strong social convention that demands queuing. (10)
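The strategic structure just described--warring nations, or bank customers, each torn between self-interest and a shared interest in restraint--can be made concrete with a small simulation of the iterated Prisoner's Dilemma that footnote 10 invokes. The payoff numbers and strategy names below are standard textbook choices, not drawn from this article; the sketch simply illustrates how a conditional convention-following strategy sustains mutual cooperation where unconditional defection does not.

```python
# Illustrative sketch: the "mixed motive" situations described above share
# the structure of an iterated Prisoner's Dilemma. "C" = honour the
# convention (fight a limited war, wait in the queue); "D" = defect.
# Payoff values are conventional textbook numbers, chosen for illustration.

PAYOFFS = {  # (my move, their move) -> my payoff per round
    ("C", "C"): 3,  # both honour the convention
    ("C", "D"): 0,  # I cooperate while they defect
    ("D", "C"): 5,  # I defect against a cooperator
    ("D", "D"): 1,  # mutual defection
}

def tit_for_tat(my_history, their_history):
    """Honour the convention, but mirror the other side's last move."""
    return their_history[-1] if their_history else "C"

def always_defect(my_history, their_history):
    """Ignore the convention entirely."""
    return "D"

def play(strategy_a, strategy_b, rounds=20):
    """Play repeated rounds and return the two cumulative scores."""
    hist_a, hist_b = [], []
    score_a = score_b = 0
    for _ in range(rounds):
        a = strategy_a(hist_a, hist_b)
        b = strategy_b(hist_b, hist_a)
        score_a += PAYOFFS[(a, b)]
        score_b += PAYOFFS[(b, a)]
        hist_a.append(a)
        hist_b.append(b)
    return score_a, score_b

# Two convention-followers fare far better than two defectors:
print(play(tit_for_tat, tit_for_tat))      # (60, 60)
print(play(always_defect, always_defect))  # (20, 20)
```

The point of the sketch is the conventionalist one: neither player is cooperating out of altruism; the cooperative regularity persists because each expects the other to reciprocate.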
It is crucial that conventionalism of the kind discussed here not be confused with naive cultural relativism, the view that a community's traditional standards cannot meaningfully be critically evaluated. Conventionalism, in the Humean sense described here, is not nearly that conservative. Conventionalists see ethics as a piece of artifice; given that values and principles are artificial constructs (rather than handed down by divine command or written into the fabric of the universe), they are themselves pieces of technology that can be engineered to better suit human needs.
3. Social Conventions and Nanotechnology
If ethical rules--conventions--are reactions to our environment, then they must presumably be susceptible to change when that environment changes. For 21st century homo sapiens, our environment includes the technological context in which we live. Thus the answer to the question of what ethical rules we need depends, in part at least, on the technologies with which we live. The advent of nanotechnology may therefore affect the kinds of social conventions we need.
Others, of course, have noted that social conventions and values often follow, rather than lead, in their complicated dance with technology. Historian Daniel J. Kevles, for example, traces the history of developments in reproductive technology, and the way in which successive technological changes not only made possible what was once impossible, but also made reasonable what once seemed unthinkable. (11) Once upon a time, Kevles notes, in vitro fertilization (IVF) was condemned as a dangerous experimental technique with no therapeutic value, and as quasi- or proto-eugenic. Today, IVF is all but taken for granted; more than 150,000 "test-tube" babies have been born, and few would think to cast aspersions on the virtue of the parents or physicians of those babies. But the possibility of IVF--this technological advance--also opened the door for pre-implantation screening of embryos. Notice the dual implication, here, from an ethical point of view. Yes, the acceptability of IVF made it possible to develop pre-implantation screening; but it also rendered pre-implantation screening--and the abortions that might follow a positive result on a test for chromosomal abnormality--a little less unthinkable. In conventionalist terms, this history illustrates two technologies--one medical and one social--co-evolving. As reproductive technologies became more potent, social conventions about what kinds of reproduction would count as permissible became more lax.
What changes, then, might we see in our conventional values related to privacy, in an age of advanced nanotechnology?
Let us now apply our conventionalist framework to the effect that nanotechnology may have on privacy. In terms of privacy, each of us can act in such a way as to respect our neighbours' privacy either more or less. Our motives are complicated here, because while each of us benefits from cooperating to establish shared rules about privacy, each of us also benefits (let us say) from not being bound by those rules. Other things being equal, and absent any social conventions, we would prefer to at least be free to invade our neighbours' privacy when it suits us (whether for prurient reasons, out of curiosity, or even just benevolent nosiness). (12) But for the most part, we don't succumb to the temptation to pry and peek, and we may not (often) even feel the temptation, so well trained are we by the relevant social conventions.
The relevant conventions regarding privacy are conventions relating to how, when, and the extent to which we monitor and search each other. Among such conventions:
Knock before entering
Don't repeat things told to you in confidence
Avert your eyes when someone enters a computer password
Don't search through a neighbour's garbage
These may seem mundane--more a matter of etiquette than ethics. But think of the special significance those last two homely conventions take on in an electronic age, in which both "eavesdropping" and "garbage" have new and complicated definitions. Here we already begin to see the important sense in which technology puts pressure on existing values, either reinforcing or corroding them.
Finally, we must ask, what role might nanotechnology play vis-a-vis our privacy-related behaviour? What difference might be made by the kind of advanced surveillance technology that is now foreseeable in an age of nanotechnology? In general, technologies allow us to do new things; and sometimes we forbear from doing things only because we lack the relevant enabling technology. Notice that the choice not to invade your neighbour's privacy is easier when the means to invade her privacy are ineffectual, rare or expensive. The advantage to be had from invasion of privacy depends in part upon the costs of carrying it out, and upon the quality of the information gained through such an invasion. Cheap, high-quality, unobtrusive surveillance equipment of the kind promised by nanotechnology is likely to lower the costs, and increase the benefits, of invading other people's privacy. We can reasonably expect that the availability of such technology will make it harder to maintain current privacy conventions. You and your neighbours may thus become tempted to shift from a pattern of behaviour under which you both respect each other's privacy to a pattern under which you both invade each other's privacy. After all, you're both likely at least to be tempted to eavesdrop or sneak a peek, once in a while; and besides (or so you may reason), if your neighbour is likely snooping, why shouldn't you too? Nanotechnology, then, may work to corrode extant social conventions--ethically useful social standards--associated with privacy.
Some will argue that there may be a way out of the predicament that the development of nanotech-enhanced surveillance equipment may put us into. Given that the mutually invasive pattern of behaviour is clearly bad (given current values), perhaps we'll find a means--technological, legislative, or social--to dig our way out. But notice one further possibility. One additional way "out" of the mutually disadvantageous outcome is not to change the outcome at all, but instead to change how we feel about it. That is, widespread nano-enabled privacy invasion is (on the conventionalist view) only bad if we think it's bad. If only we weren't so committed to the value of privacy, we wouldn't feel bad about a world in which people used nanotechnology to invade each other's privacy. Under that sort of pressure, it seems plausible enough that bit by bit, perhaps without even realizing it, we would come to accept the lack of privacy. If this happens, technology will have changed not just our options or our behaviour, but our values too.
Nothing offered in the preceding section suggests that there is anything bad about the long-run changes that nanotechnology may bring about in our options, our behaviour, and our values. Indeed, the conventionalist point of view rests upon the assumption that values, as such, are neither good nor bad. After all, "good" and "bad" are themselves just value words. So perhaps if, in a world suffused with nano-enhanced surveillance technology, we shift from thinking that privacy is important to thinking that it's not, things won't be worse, just different.
Still, if we are to be even roughly instrumentalist in our understanding of ethics, we need to think carefully about what it is that our current values get done for us. Values may not be eternal, but neither are they entirely arbitrary: some make our lives go better, and some make our lives go worse. As Michael Mehta warns, "The wide-scale use of [nano-enhanced] surveillance equipment may create a society with lower levels of trust, less social capital and depressed civic engagement." (13) Such outcomes would be antithetical to nearly universal ideas of what makes life good, and are almost sure to remain undesirable, even in the face of changing values. We must think critically, then, about the interplay between values and technology, and apply to novel technologies such as nanotechnology not just our strongest moral intuitions, but our best moral-theoretical tools.
(1) Chris MacDonald, "Nanotech is Novel; the Ethical Issues Are Not" The Scientist 18:3 (16 February 2004) 8.
(2) Lawrence Lessig, "The Architecture of Privacy" (Paper presented at Taiwan Net Conference, Taipei, March, 1998), online: Berkman Centre for Internet and Society at Harvard Law School
(3) Rosemary Clandos, "Privacy May Be Blown Away Like 'Smart Dust' In The Wind" Small Times (16 August 2001).
(4) Mark Baard, "Balancing Utility With Privacy" Wired (21 October 2003), online: Wired
(5) David D. Friedman, Future Imperfect (2003) [draft], online: Patri's World
(6) Eric W. Pfeiffer, "Breakin' The Law: Without Nano, Moore Is No More, Experts Say" Small Times (15 May 2002), online: Small Times
(7) This is the view of convention espoused by David Hume. See especially Hume's Treatise of Human Nature (III.II.V).
(8) George Mavrodes, "Conventions and the Morality of War" (1975) 4 Philosophy and Public Affairs 117.
(9) Michael Ignatieff, The Warrior's Honor (Toronto: Viking, 1998).
(10) In technical terms, the "choice" faced by warring nations or by bank customers is strategically isomorphic with the iterated form of that most famous of social dilemmas, the Prisoner's Dilemma.
(11) Daniel J. Kevles, "Cloning Can't Be Stopped" Technology Review (1 June 2002).
(12) This is not to say that we would all choose to invade our neighbours' privacy, merely that we prefer, in principle, fewer restrictions on our behaviour.
(13) Michael D. Mehta, "On Nano-Panopticism: A Sociological Perspective" Canadian Chemical News 5 (Nov/Dec 2002) 31.
Chris MacDonald, Assistant Professor, Department of Philosophy, Saint Mary's University, Halifax, Nova Scotia.…
Publication information: Article title: Nanotechnology, Privacy and Shifting Social Conventions. Contributor: MacDonald, Chris - Author. Journal title: Health Law Review. Volume: 12. Issue: 3. Publication date: Fall 2004. Page number: 37+. © Health Law Institute.