What Do We Mean by Privacy?

Much like the problem of saying “the Internet of Things” and then assuming that everyone knows what you are talking about, the term “privacy” means very different things to different people. This is as true among experts, practitioners, and scholars as it is in society at large. “Privacy is a concept in disarray,” observes Dan Solove, one of America’s leading privacy law scholars; it is “too complicated a concept to be boiled down to a single essence.”10 Privacy is an economic, political, legal, social, and cultural phenomenon, and is particular to countries, regions, societies, cultures, and legal traditions. This report briefly surveys American and European privacy ideas and mechanisms.

The Concept of Privacy in America and Europe

In 1890, two American legal theorists, Samuel Warren and Louis Brandeis, conceived of the “right to be let alone” as a critical civil principle,11 a right to be protected. Their article began the legal discussion of privacy in the United States and is often referenced in European discussions of privacy as well. Later, in 1967, privacy scholar Alan Westin identified “four basic states of privacy”:

  • Solitude: physical separation from others

  • Intimacy: a “close, relaxed, and frank relationship between two or more individuals” that can arise from seclusion

  • Anonymity: freedom from identification and surveillance in public places

  • Reserve: “the creation of a psychological barrier against unwanted intrusion”12

Westin wrote that privacy is “the claim of individuals, groups, or institutions to determine for themselves when, how, and to what extent information about them is communicated to others.”13 This view appears in European conceptions of privacy as well. In 1983, the German Federal Constitutional Court articulated a “right of informational self-determination,” which included “the authority of the individual to decide [for] himself...when and within what limits information about his private life should be communicated to others.”14 In Europe, privacy is conceived as a “fundamental right” that people are born with. European policies mainly use the term “data protection” rather than “privacy.” Data protection is a narrower concept, applied specifically to policies and rights concerning organizations’ fair treatment of personal data and good data governance. Privacy covers a broader array of topics and is concerned with interests beyond fairness, such as dignity, inappropriate surveillance, and intrusions by the press, among others.

In 1960, American law scholar William Prosser distilled four types of harmful activities that privacy rights addressed:

  • Intrusion upon someone’s seclusion or solitude, or into her private affairs

  • Public disclosure of embarrassing private facts

  • Publicity which places someone in a false light

  • Appropriation of someone’s name or likeness for gain without her permission15

This conception of privacy is, by design, focused on harms that can befall someone, thereby giving courts a basis from which to redress them. But, as the preceding descriptions show, conceiving of privacy exclusively from the perspective of harms is too narrow.

Thinking of privacy harms tends to focus discussion on individuals. Privacy, however, must also be discussed in terms of society. Privacy and data protection, it is argued, are vital for the functioning of society and democracy. Two German academics, Hornung and Schnabel, assert:

...data protection is... a precondition for citizens’ unbiased participation in the political processes of the democratic constitutional state. [T]he right to informational self-determination is not only granted for the sake of the individual, but also in the interest of the public, to guarantee a free and democratic communication order.16

Similarly, Israeli law scholar Ruth Gavison wrote in 1980: “Privacy is...essential to democratic government because it fosters and encourages the moral autonomy of the citizen, a central requirement of a democracy.”17

In this way, privacy is “constitutive” of society,18 integrally tied to its health. Put another way, privacy laws can be seen as social policy, encouraging beneficial societal qualities and discouraging harmful ones.19

In trying to synthesize all of these views, Professor Solove created a taxonomy of privacy that yields four groups of potentially harmful activities:20

  • Information collection

    • Surveillance: watching, listening to, or recording an individual’s activities

    • Interrogation: questioning or probing for information

  • Information processing

    • Aggregation: combining various pieces of data about a person

    • Identification: linking information to particular individuals

    • Insecurity: carelessness in protecting stored information

    • Secondary use: use of information for a purpose other than what it was originally collected for without a person’s consent

    • Exclusion: failure to allow someone to know about data others have about him, and to participate in its handling and use

  • Information dissemination

    • Breach of confidentiality: breaking a promise to keep a person’s information confidential

    • Disclosure: revelation of information that affects how others judge someone

    • Exposure: revealing another’s nudity, grief, or bodily functions

    • Increased accessibility: amplifying the accessibility of information

    • Blackmail: the threat to disclose personal information

    • Appropriation: the use of someone’s identity to serve someone else’s interests

    • Distortion: disseminating false or misleading information about someone

  • Invasion

    • Intrusion: invading someone’s tranquillity or solitude

    • Decisional interference: incursion into someone’s decisions regarding her private affairs

Although this taxonomy is centered on the individual, it should be understood that personal losses of privacy add up to societal harms. One of these is commonly called the chilling effect: if people feel they are being surveilled, or that what they imagine to be private, intimate conversations or expressions are being monitored, recorded, or disseminated, they are less likely to say things that could be seen as deviating from established norms.21 This homogenization of speech and thought is contrary to liberty and to democratic discourse, which relies upon a diversity of views. Dissent, unpopular opinions, and intellectual conflict are essential components of free societies, and privacy helps to protect them.

One important thing to take away from this discussion is that there is no neat split between information people think of as public versus information that is private. In part, this is because there is no easy definition of either. Consider medical information shared with your doctor—it will travel through various systems and hands before its journey is complete. Nurses, pharmacists, insurance companies, labs, and administrative staff will all see information that many citizens deem private and intimate. Here again we see the problem of construing privacy as secrecy. Information is shared among many people within a given context. One theory22 within privacy scholarship says that when information crosses from one context into another—for example, medical information falling into nonmedical contexts, such as employment—people experience it as a privacy violation (see the section “Breakdown of Informational Contexts” later in this report). Advances in technology further complicate notions of the public and the private, and cause us to reflect more on where, when, and what is deserving of privacy protections.

It’s worth noting that the public/private split in American privacy regimes differs from the European conception of data protection, which focuses on restricting the flow of personal data rather than only private or confidential data. The recently enacted General Data Protection Regulation defines personal data as:

any information relating to an identified or identifiable natural person...; an identifiable natural person is one who can be identified, directly or indirectly, in particular by reference to an identifier such as a name, an identification number, location data, an online identifier or to one or more factors specific to the physical, physiological, genetic, mental, economic, cultural or social identity of that natural person23

Much ink has been spilled in comparing the US and European approaches,24 but suffice it to say that there are pros and cons to each. They yield different outcomes, and there is much to be gained from drawing upon the best elements of both.25

It’s essential to remember that privacy costs money. That is, building information systems that incorporate strong security, user preferences, encryption, and privacy-preserving architectures requires investments of capital, time, and know-how: all resources that organizations seek to conserve. This means that, when making devices and services, the preservation of privacy can never be divorced from economic considerations. Businesses must have a reason, an economic justification, for incorporating privacy into their designs: regulatory requirements, product or service differentiation, voluntary adherence to best practices, contractual obligations, and fear of brand damage, among other reasons. There is also a view that managers, developers, engineers, and executives include privacy in their products because it is the right thing to do: good stewardship of personal data is a social value worth embedding in technology. Recent research by Berkeley professors Kenneth Bamberger and Deirdre Mulligan, however, illustrates that the right thing might be driven by business perceptions of consumer expectations.26 Often, there is no easy separation of the economic and social reasons privacy architectures are built into technology, but the point is that, from an engineering or compliance perspective, someone must pay for privacy.

Privacy is not just the law, nor just rules to protect data sharing and storage; it is a shifting conversation about values and norms regarding the flow of information. Laws and rules enact the values and norms we prize, but they are merely “carriers” of these ideas. This means, however, that the current set of privacy rules is not the only way to protect privacy. The topic of the IoT affords an opportunity to reflect: how things have been need not be how they will be going forward. Research shows that people feel vulnerable and exposed by the introduction of new Internet technologies.27 As a wave of new devices enters our intimate spaces, now is an excellent time to review the institutional and technical ways privacy is protected, the values underlying them, and what can be done differently.

10 Solove, D. 2006. A Taxonomy of Privacy. University of Pennsylvania Law Review 154(3):477-560. Available at http://bit.ly/2d3ucsk.

11 Brandeis, L. and Warren, S. 1890. The Right to Privacy. Harvard Law Review 4(5):193-220. Available at http://bit.ly/2d3HVxI.

12 See footnote 5.

13 See footnote 5.

14 Quoted in Rouvroy, A. and Poullet, Y. 2009. The Right to Informational Self-Determination and the Value of Self-Development: Reassessing the Importance of Privacy for Democracy. In S. Gutwirth, Y. Poullet, P. De Hert, C. de Terwangne, & S. Nouwt (eds.), Reinventing Data Protection? (pp. 45-76). Dordrecht: Springer.

15 Prosser, W. 1960. Privacy. California Law Review 48(3):383-423. Available at http://bit.ly/2d3I6ZU.

16 Hornung, G. and Schnabel, C. 2009. Data Protection in Germany I: The Population Census Decision and the Right to Informational Self-Determination. Computer Law & Security Report 25(1): 84-88.

17 Gavison, R. 1980. Privacy and the Limits of the Law. Yale Law Journal 89(3):421-471. Available at http://bit.ly/2cWTFD1.

18 Schwartz, P. 2000. Internet Privacy and the State. Connecticut Law Review 32(3):815-860. Available at http://bit.ly/2dm8yxe; Simitis, S. 1987. Reviewing Privacy in an Information Society. University of Pennsylvania Law Review 135(3):707-746. Available at http://bit.ly/2dtDxYB.

19 See Part 1 of Bennett, C. and Raab, C. 2003. The Governance of Privacy: Policy Instruments in Global Perspective. Burlington: Ashgate Publishing.

20 See footnote 10.

21 For example, recent research has documented how traffic to Wikipedia articles on privacy-sensitive subjects decreased in the wake of the Snowden NSA revelations: http://bit.ly/2cwkivn.

22 Nissenbaum, H. 2010. Privacy in Context. Stanford: Stanford University Press.

23 General Data Protection Regulation, Article 4(1). Available at http://bit.ly/2ddSjoD.

24 See, e.g., Part 1 of Schwartz, P. 2008. Preemption and Privacy. Yale Law Journal 118(5):902-947. Available at http://bit.ly/2ddTYdY; Reidenberg, J. (1999). Resolving Conflicting International Data Privacy Rules in Cyberspace. Stanford Law Review 52(5):1315-71. Available at http://bit.ly/2cPKL7W; Sec. 4.6 of Waldo, J., Lin, H., and Millet, L. 2007. Engaging Privacy and Information Technology in a Digital Age. Washington, D.C.: The National Academies Press. Available at http://www.nap.edu/catalog/11896.html.

25 Rosner, G. 2015. There is room for global thinking in IoT data privacy matters. O’Reilly Media. Available at http://oreil.ly/2ddSY9y.

26 Bamberger, K. and Mulligan, D. 2015. Privacy on the Ground: Driving Corporate Behavior in the United States and Europe. Cambridge: MIT Press.

27 E.g., see the Pew Research Center’s “The state of privacy in post-Snowden America: What we learned,” available at http://pewrsr.ch/2daWMH7, and findings from the EU-funded CONSENT project, “What consumers think,” available at http://bit.ly/2dl5Uf2.

28 44 US Code § 3542.

29 Greenberg, A. 2015. Hackers Remotely Kill a Jeep on the Highway–With Me in It. Wired, 21 July. Available at http://bit.ly/2d3uCyG.
