Policy Vacuums 

 

IT can be molded or shaped. The term “shape” has its origin in the Old English word “scieppan,” which means to create. We, as IT professionals, create numerous objects by shaping IT; broadly defined, these objects include software, infrastructure, applications and networks of computers. As creators of these objects, we are their moral agents. The notion of moral agency is important here because the values we choose, our judgments of what is right and what is wrong, are inherited by the objects themselves. Creating a robot means creating an object with certain characteristics, moral or otherwise, built into it. And once we accept that our creations inherit our value judgments, we must also accept the role we play as moral agents: through the objects we create, we make permanent the values we choose.

As moral agents, our duty is to be well informed, that is, to seek as much relevant information as feasible in order to do the right thing. As producers and communicators of information, we need to be morally right within our constraints and opportunities. For example, we are accountable for the faithful representation of our output, our work should be free of plagiarism, and we should be transparent in our communication. Independent of our roles as users or producers of information, our moral evaluations affect the informational environment (or space); the privacy and confidentiality of personal information and democratic access to information, for example, are part of our broader role in that space. Luciano Floridi suggests that when considering information ethics, we should take all three roles into account. Any consideration of only one of them is microethics; according to Floridi, what we need is an all-encompassing macroethics.1

J. H. Moor suggests that rapid developments in technology require a new approach to ethics. He argues that the “logical malleability” of computers leads to so-called policy vacuums that require careful analysis to fill.2 He extends the argument to other technologies that shape something malleable: life (genetic technology), material (nanotechnology) and mind (neurotechnology). By nature, policy vacuums invite “fighting fires,” because the molded technology is embedded into a business model (e.g., Facebook) or a strategic informational object (e.g., a search engine) before its ethics have been thought through. The approach to ethical dilemmas thus becomes reactive rather than proactive, which can produce surprises that lead to frustration and costly cleanups.

According to Moor, a threefold approach is necessary to bridge the gaps created by policy vacuums. First, a vision of ethics as an ongoing and dynamic activity, not a post mortem, is essential. Second, much more collaboration is required at the design stage among ethicists, scientists and others. Third, more sophisticated ethical analysis is required up front. Collectively, these factors will help narrow the policy gaps, and, over time, the ethical fallout surfacing from objects of new technology should become rarer and less damaging.

Vision of Ethics as an Ongoing Dynamic Activity

In light of the rapid change in technology, any assessment of the ethical dimensions of information is a frozen picture; any change over time can bring new ethical implications to identify, sort out and address. Relying on a one-time exercise, no matter how robust, will not prevent ethical missteps. Therefore, changes should be identified, documented and analyzed not just from a micro perspective (such as their impact on the privacy of customer data), but in the spirit of bridging the policy gap that each change creates. The purpose is not merely to capture an instance of a possible ethical compromise, but to identify and correct the systemic attributes that could trigger ethical missteps.

Collaboration at the Design Stage

The earlier in the value chain we address the conflict, the better the design of informational objects. Therefore, instead of merely gathering feedback on what went wrong, the tactic should be to anticipate possible ethical vulnerabilities and address them before the information object is created. This critical change in perspective is best implemented by engaging a collaborative group of professionals; the information technologist alone may not be able to broaden the inquiry into its other dimensions, including ethics.

Here is an example. The Wall Street Journal recently reported that Google deployed “work-around” code so that the Safari web browser, contrary to its norm of automatically preventing the installation of third-party cookies, would allow a cookie to be set on the user’s computer.3 Google is thus believed to have compromised, to some degree, the privacy of Safari users. The Journal reports that Anant Garg, who developed the work-around code, said he “didn’t consider the privacy angle.”4 Did Garg or Google fail to broaden the inquiry in designing or using the work-around?
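
To make the mechanism concrete, here is a minimal sketch of how such a work-around could operate, reconstructed from public descriptions of the incident; it is not Google’s actual code, and the function and parameter names are illustrative. Older versions of Safari blocked third-party cookies except from sites the user had interacted with, and an automatically submitted form counted as an interaction:

    // Hypothetical reconstruction for illustration; not Google's actual code.
    // Older Safari versions accepted cookies from a third party once the user
    // had "interacted" with it, and a form submission counted as interaction.
    function simulateThirdPartyInteraction(trackerUrl: string): void {
      // A hidden iframe receives the form submission so the visible page
      // does not navigate away.
      const frame = document.createElement("iframe");
      frame.style.display = "none";
      frame.name = "hidden_target"; // illustrative name
      document.body.appendChild(frame);

      // An invisible form posting to the third-party (tracker) domain.
      const form = document.createElement("form");
      form.method = "POST";
      form.action = trackerUrl; // hypothetical ad-server endpoint
      form.target = frame.name;
      form.style.display = "none";
      document.body.appendChild(form);

      // Auto-submitting the form led Safari to treat the third party as one
      // the user had interacted with, so its cookies were then accepted.
      form.submit();
    }

The sketch shows how small such code can be: a handful of document object model (DOM) calls, none of which looks like a privacy decision on its face, which is precisely why a broader, collaborative inquiry at the design stage matters.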

Sophisticated Ethical Analysis up Front

Consider web application security. J. Hanny suggests that a successful application security program will be fully integrated within the system development life cycle (SDLC).5 A. D. Sanders asserts: “Consider security in every SDLC phase, from requirements gathering to operations and disposal, and threat model every software feature.”6 In contrast, we are not yet accustomed to integrating ethical analysis into the information we produce, communicate or use. Perhaps the time for this paradigm shift is now approaching…if we wish to avoid crises.
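
What might integrating ethical analysis into the SDLC look like? Here is a minimal sketch, built by analogy with threat modeling; the record fields and the gate rules are illustrative assumptions, not an established standard:

    // Illustrative only: a design-stage "ethics gate," analogous to a threat
    // model, run for every feature before it ships. The fields and rules are
    // assumptions, not an established standard.
    interface EthicalReview {
      feature: string;
      personalDataCollected: string[]; // personal data elements the feature touches
      consentDocumented: boolean;      // is collection disclosed and consented to?
      retentionPolicy: string;         // how long the data is kept, and why
      reviewers: string[];             // e.g., engineer, ethicist, legal counsel
    }

    function ethicsGate(review: EthicalReview): string[] {
      const gaps: string[] = [];
      if (review.personalDataCollected.length > 0 && !review.consentDocumented) {
        gaps.push("personal data collected without documented consent");
      }
      if (review.retentionPolicy.trim() === "") {
        gaps.push("no retention policy recorded");
      }
      if (review.reviewers.length < 2) {
        gaps.push("review is not collaborative (fewer than two disciplines)");
      }
      return gaps; // a nonempty list blocks the feature, like a failed build
    }

Run at the requirements or design phase, such a gate would make ethical analysis a routine SDLC artifact rather than a post mortem, which is the shift Moor’s threefold approach calls for.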

Finally, the question: Who is responsible for filling the policy vacuums? It is a difficult question to answer; however, one can surmise that the earlier the information object is in the development cycle, the wider the accountability. Policy vacuums at the research stage of a technology could be a community effort, taken up, for example, by the Electronic Frontier Foundation, while policy vacuums at the product-development or use level are likely to be the responsibility of the enterprise. A cohesive effort on all fronts by the community of organizations could lead to best practices, benchmarks and even accreditation standards in the arena of information ethics.

To quote cybernetics guru Norbert Wiener, “[T]he future offers very little hope for those who expect that our new mechanical slaves will offer us a world in which we may rest from thinking.”7

Endnotes

1 Floridi, Luciano; “Information Ethics: Its Nature and Scope,” in Van den Hoven, J.; J. Weckert (Eds.); Information Technology and Moral Philosophy, Cambridge University Press, 2008, p. 40–65
2 Moor, J. H.; “Why We Need Better Ethics for Emerging Technologies,” in Van den Hoven, J.; J. Weckert (Eds.); Information Technology and Moral Philosophy, Cambridge University Press, 2008, p. 26–39
3 The Wall Street Journal, 17 February 2012, p. A1–A2
4 Ibid., p. A2
5 Hanny, J.; “Building an Application Security Program,” Information Security Journal: A Global Perspective, vol. 19, 2010, p. 336–342
6 Sanders, A. D.; “Conficker: Lessons in Secure Software and System Design,” Information Security Journal: A Global Perspective, vol. 19, 2010, p. 95–99
7 Wiener, N.; God and Golem, Inc.: A Comment on Certain Points Where Cybernetics Impinges on Religion, MIT Press, 1964, p. 69

Vasant Raval, CISA, DBA, is a professor of accountancy at Creighton University (Omaha, Nebraska, USA) and the coauthor of two books on information systems and security. His teaching and research interests include information security and corporate governance. Opinions expressed in this column are his own, and not those of Creighton University. He can be reached at vraval@creighton.edu.

