ISACA Journal
Volume 2, 2014

Columns 

Information Ethics: The Piracy of Privacy 

Vasant Raval, DBA, CISA, ACMA 

Following US Federal Judge Richard Leon’s ruling that the US National Security Agency (NSA)’s bulk collection of telephone metadata is likely unconstitutional, the Wall Street Journal ran an opinion piece that read in part:

[T]he reality of the information age is that we all have less expectation of privacy.1 No one who makes calls and emails on a smartphone, visits an e-commerce web site, uses a credit card, drives with an Easy Pass or otherwise benefits from modern technology can truly believe that he is not entrusting data to third parties about personal behavior.2

The year 2013 was dotted with significant shake-ups in the field of information privacy, beginning with Edward Snowden’s disclosure of vast numbers of NSA documents. “When one considers the global uptake of information trade, it is not unreasonable to expect that informational privacy will be to this century what liberty was in the time of John Stuart Mill.”3 More recent happenings include German Chancellor Angela Merkel’s complaint that the US government tapped her cell phone, an open letter from certain tech companies to US President Barack Obama suggesting a review of the NSA’s data surveillance programs,4 the appearance of the NSA director before a congressional committee, CBS News’ 60 Minutes coverage of the NSA story,5 and Judge Leon’s previously mentioned ruling that the NSA’s practices regarding telephony metadata are likely unconstitutional.6, 7

Nature of Data—A Privacy Perspective

Intent in furnishing data is central to the issue of privacy. The question is: What purpose drives the need to share the data? Our consent, or agreement to provide data, is the first layer of privacy protection. Accordingly, data can be classified as transactional (driven by contract) or voluntary (driven by consent of the provider). Contractual sharing of data is now well established, and expectations of privacy surrounding such data have matured over time. What is causing uninformed debate about privacy is the nature and context of voluntarily supplied personal information.

The growth in voluntarily supplied information has been, and will continue to be, exponential. Information infrastructures (e.g., Google, Facebook, Twitter, LinkedIn) emerge daily and become popular in a short period of time. In our desire to seek services of various kinds—for efficiency, productivity, profit or just fun—we nominally agree to terms and conditions that we may not have read or, if read, may not have fully comprehended. One might say that this, too, is contractual sharing of data. While that is true, the contract covers not a direct exchange of consideration between the company and the patron, but rather the use of the infrastructure, which generates trillions of data points—data that the owner of the infrastructure controls, aggregates and uses for its own economic gain.

Transactional data have a definite life cycle; voluntary data seem to fade into perpetuity. The intentions behind the former are usually understood, even if only tacitly, and apply within the realm of the agreeing parties. In contrast, voluntary data are generally timeless, can be “sliced and diced” using data mining, and can be further masked and shared for the economic gain of the infrastructure owner and its customers.

Soft Volunteerism

Soft volunteerism refers to the act of volunteering information when one does not necessarily want or mean to do so. It is not so much consent to share data as it is a lack of dissent in sharing data. Passivity or inertia on the part of the provider plays an important role in the pull of soft volunteerism. The immediate perceived benefits of seeking the services, and thus benefiting from them, outweigh anything that the user vaguely understands as the cost of accepting the terms and conditions.

Before clicking the “I agree” button on an agreement to use a service, how often have you paused and analyzed the contents of the agreement? The agreement is generally long and filled with legalese, and you feel like you are wasting time getting to the services provided by the company that just popped the agreement onto your screen. According to the prospect theory of decision-making behavior, losses are weighted more heavily than gains.8 And here you are, delaying the immediate benefits of using digital signatures from Co-Sign or the latest version of Adobe Acrobat. So we fall into the vulnerability of allowing an apparently harmless agreement to get in the way of doing what we want to do.
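To make the loss-aversion point concrete, Kahneman and Tversky’s value function can be sketched as follows; the parameter values shown are commonly cited estimates from later empirical work and are used here only as illustrative assumptions:

$$
v(x) =
\begin{cases}
x^{\alpha} & \text{if } x \ge 0 \\
-\lambda(-x)^{\beta} & \text{if } x < 0
\end{cases}
\qquad \alpha \approx \beta \approx 0.88,\; \lambda \approx 2.25
$$

Under these illustrative values, a loss of 10 units is felt at roughly v(−10) ≈ −17, while a gain of 10 units is felt at only about v(10) ≈ 7.6. The sure, immediate loss of time and convenience in reading the agreement thus looms far larger than the diffuse, uncertain cost to privacy of clicking “I agree.”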

According to a 2009 survey conducted by Harris Interactive Inc. on behalf of Microsoft and the National Cyber Security Alliance, about one in five people has entered personal information on a web site that may not have been secure, without knowing whether someone was tracking the personal or login information as it was typed.9 People willingly share personal information when they are nudged to do so. The perceived benefits seem to outweigh any remotely noticed costs of volunteering the information.

Over time, as such actions accumulate, it becomes second nature to experiment with the excess of arguably free services. We may still espouse the normative value of privacy, but privacy, in fact, continues to decline. We voluntarily place into our cars wireless devices from our auto insurance company that track our driving habits, and we walk around with an active cell phone that announces our position, time and potential interests.

What Drives Soft Volunteerism?

Claiming to know our fallibilities better than we do, soft paternalists seek to aid us in making “correct” decisions through persuasion; they guide us toward the alternatives that we would have chosen had we been exercising willpower and foresight.10 Companies and institutions merely organize the context in which people make decisions. It is a form of social engineering of people’s decision making toward a particular outcome, and it preserves the possibility of choice. Indeed, it grants the choice, but with an overwhelming pull toward the recommended alternative, nudging people toward more disclosure while preserving the illusion of free choice.

In part, this apparently seamless data-gathering process is fully enveloped within a technological framework. As part of our daily life, the process has little visibility, if any. Its ubiquitous nature makes it part of our routine; we neither control it nor challenge its presence. Soft volunteerism is rooted in what Gary Marx calls mandatory volunteerism: disingenuous communication that seeks to create the impression that one is volunteering when that really is not the case.11 Here are some examples:

  • Retail store loyalty card
  • Ownership and use of mobile electronic devices
  • Credit card ownership and use

Even routine inquiries to a provider begin with: “This call may be monitored for quality assurance purposes.” Of course, we are not going to hang up; our focus is on seeking a resolution for the question hovering in our mind. We do not say, “Well, I do not agree to the recording of this conversation.” And of course, the company is nudging us toward a complacent “OK.” The absence of dissent takes the place of the explicit presence of consent.

Four Premises

In searching for an appropriate threshold for consent in privacy, Ian Kerr and colleagues focus on the following four requirements:12

  1. The subject must have knowingly consented to the collection, use or disclosure of personal information.
  2. Consent must be obtained in a meaningful way, generally requiring that organizations communicate the purposes for collection, so that the person will reasonably know and understand how the information will be collected, used or disclosed.
  3. Organizations must create a higher threshold for consent by contemplating different forms of consent depending on the nature of information and its sensitivity.
  4. In a principal-agent relationship, consent is dynamic and ongoing. It is implied for as long as the patron grants the privilege to the provider and remains valid only until the consent is withdrawn.

As Kerr and his colleagues point out, a legal definition of consent is hard to find. The common law context suggests that consent is a “freely given agreement.” An agreement, contractual or by choice, implies a particular aim or object.

While it is clear that the force of laws and regulations is necessary, in the end, the behavior of the user matters equally. Concepts and paradigms such as bounded rationality and prospect theory point to the vulnerability of human users in exercising consent. If that is where the failure occurs, privacy issues will only propagate, not abate.

Finally, remember that privacy solutions embedded in technology to empower users to protect their own privacy are only as good as the motivation, knowledge and determination of the user. Not all users are equally technology savvy; not all users consider it worth their time to navigate through the privacy controls on a gadget to feel safe. And generally, all users are creatures of bounded rationality. Even if Apple and Google, for example, give users more tools to exercise control on their own, not much difference in the privacy landscape is likely.

Soft Volunteerism to Soft Surveillance

Soft volunteerism nudges people to share more information, resulting in a huge pool of data across companies and institutions. Whereas hard surveillance, such as a camera watching over a parking lot, is concretely visible, soft surveillance remains buried in the technology, free to roam over available data and metadata. The NSA’s reach across networks could become wider and stronger. Consequently, people would lose trust in their providers, such as Facebook, and the loss of trust would translate into a lack of prosperity for the provider. This is the context for the open letter from technology companies to President Obama: technology companies want to protect their collection of voluntary data.

Going Forward: Is Privacy a Mirage?

I wonder: Is privacy now a mirage? With volumes of data emerging, work and home boundaries fading, and devices becoming more pervasive, is a viable defense of privacy possible? Or is the situation on this front going to continue to worsen? Because companies such as Facebook thrive on the data that users provide for safe storage and proper use, is it likely that the Facebooks of the world will fight for personal privacy and for innovative solutions to protect it? For now, it seems that a business model focused simply on privacy protection may not be financially viable.

Endnotes

1 According to a Pew research survey, 59 percent of Internet users do not believe it is possible to be completely anonymous online, while 37 percent of them believe it is possible. See Pew Research Center, “Anonymity, Privacy, and Security Online,” Pew Research Center’s Internet & American Life Project survey, September 2013.
2 The Wall Street Journal, “The Judge and the NSA,” 18 December 2013, p. A16
3 Kerr, I.; J. Barrigar; J. Burkell; K. Black; “Soft Surveillance, Hard Consent: The Law and Psychology of Engineering Consent,” Lessons From the Identity Trail: Anonymity, Privacy and Identity in a Networked Society, Editors: I. Kerr; V. Steeves; C. Lucock; Oxford University Press ebrary, p. 8
4 Wyatt, Edward; Claire Cain Miller; “Tech Giants Issue Call for Limits on Government Surveillance of Users,” The New York Times, 9 December 2013, www.nytimes.com/2013/12/09/technology/tech-giants-issue-call-for-limits-on-government-surveillance-of-users.html
5 60 Minutes, “NSA Speaks Out on Snowden, Spying,” CBS News, 15 December 2013, www.cbsnews.com/news/nsa-speaks-out-on-snowden-spying
6 Nakashima, Ellen; Ann E. Marimow; “Judge: NSA’s Collecting of Phone Records Is Probably Unconstitutional,” The Washington Post, 16 December 2013, www.washingtonpost.com/national/judge-nsas-collecting-of-phone-records-is-likely-unconstitutional/2013/12/16/6e098eda-6688-11e3-a0b9-249bbb34602c_story.html
7 Subsequently, in a separate hearing of a petition filed by the American Civil Liberties Union, US District Judge William Pauley upheld the legality of the NSA’s data-collection program. The Wall Street Journal, 28 December 2013
8 Kahneman, D.; A. Tversky; “Prospect Theory: An Analysis of Decision Under Risk,” Econometrica, 47(2), March 1979, p. 263-291
9 Microsoft, Browser Security and Privacy Fact Sheet, March 2009, www.microsoft.com/presspass/newsroom/windows/factsheets/03-18BrowserSecurityFS.mspx
10 The Economist, “Soft Paternalism: The State Is Looking After You,” 6 April 2006, www.economist.com/node/6772346
11 Marx, Gary; “Soft Surveillance: The Growth of Mandatory Volunteerism in Collecting Personal Information,” Surveillance and Security: Technological Politics and Power in Everyday Life, Editor: T. Monahan, Routledge, UK, 2006, p. 37
12 Op cit, Kerr. Most of the analysis presented by Kerr, et al., is based on the Canadian Personal Information Protection and Electronic Documents Act (PIPEDA).

Vasant Raval, DBA, CISA, ACMA, is a professor of accountancy at Creighton University (Omaha, Nebraska, USA). He is the coauthor of two books on information systems and security, and his teaching and research interests include information security and corporate governance. Opinions expressed in this column are his own and not those of Creighton University. He can be reached at vraval@creighton.edu.

 

