Vasant Raval, DBA, CISA, ACMA
Ethics is a practical social activity, not a utopian concept to be contemplated in the abstract.1 In this column, we take a hard look at the realities of information ethics programs and ponder the question: Can such a program be effective and, if so, under what conditions? Theories and paradigms help us prepare to act, but the real world demands practical ways to detect, manage and, where possible, prevent ethical lapses. Despite their pervasiveness, we know little about how to manage, let alone survive, the aftermath of ethical failures.2
If there is one lesson from history that provides an important beginning, it is this: Unethical behavior happens; the open questions are when and by whom, not if. And this is despite good intentions on nearly everyone’s part, including the perceived violator of our trust, who may believe he has done no wrong.
Just recently, a sudden media surge focused on Edward Snowden, who leaked documents from US National Security Agency (NSA) programs to make the point that US citizens’ privacy is being encroached upon by their government and that their democratic rights are at risk. He is convinced he did nothing wrong, while the US Department of Justice claims that he performed a criminal act by leaking secret documents, breaking a condition of his employment. Many themes emerge from this story: privacy, national security risk and monitoring such risk factors, freedom of speech, inner conviction of what is right and what is wrong, whistleblowers perceived as heroes or criminals, and breaking the law. How do you sort all this out if you are the head of the NSA?
A first realization is that it is impractical to divide the world into good people and bad people and build tactics for appropriate behavior on that division. At times, ordinary, decent individuals commit indiscretions, and habitual violators of ethical precepts may surprise us with remarkably humane deeds. Just about anyone could end up in an indiscretion; to minimize ethical misconduct, groups—formal and informal—could do more to lay out the rules that, when obeyed, result in trust in others. Strong ties with a group (e.g., neighborhoods, communities, societies, businesses) are a powerful force shaping behavior. Therefore, we focus on the drivers that help people embrace and adopt group norms.
Another way to look at this is from the perspective of the self-interest of a member of the community or an employee. Self-interest is often a strong motivation for individuals to get ahead and prosper materially, quickly and at any cost. When people pursue self-interest at the cost of group interest, lapses occur that harm the well-being of the group. In practice, group norms of governance invariably focus on helping members of the group be aware of, and control, their self-interest.
Remember the story of Sergey Aleynikov, the programmer who quit a US $1.2 million-a-year job at Goldman Sachs for greater riches in 2009? Before he left Goldman, he uploaded the firm’s high-frequency trading code—the secret sauce—to a server overseas and then downloaded it onto a pen drive, intending to replicate it at his new employer.3 Temptations born of self-interest can wreak havoc in the IT space; Goldman’s intellectual property, and the substantial future revenues from it, were at risk—all because of one person’s self-interest.
Beyond any doubt, the environment of the entity (i.e., workplace, family) takes on the color and spirit of its leader. No code of ethics will work unless the environment—the tone at the top—supports it unconditionally. People learn vicariously and do what you do, not what you say. Since the culture of the entity overlaps with the character of its leader, it is hard to separate the two; the tone at the top is, after all, a derivative of the leader’s moral cognition. If followers do not see integrity in their leader, it is likely that they will not take the written word seriously. A self-sacrificing leader has a better chance of making the environment drive ethical behavior than a self-interested leader.4
Honesty and integrity are the pillars of an ethical climate. Without honesty, stakeholders doubt that they are engaging with leadership in open discussion—that there is transparency in communication. Leader behavior consistent with that communication is key to reinforcing honesty and creating an effective ethical climate. Managers who do not act according to the code either introduce noise into the communication process or provide direct evidence that a convention does not really exist.5
When the tone at the top reinforces the code and related rules by force of day-to-day behavior, these become a convention, a custom or common law of the organization. Stakeholders follow the code because it is customary, expected and well-known to those responsible for abiding by it. To make the code and related rules a strong custom, their nature and importance, including the consequences of failure to act accordingly, must be routinely and clearly communicated. A convention formed in this way can be quite effective in motivating people to follow the set norms.
How a convention takes root can be seen by examining the elements of convention. These elements include the utility of justice, conditional motives, the usual force of passions, intelligibility, moral approbation and language.6 The utility of justice emphasizes the influence of how others are acting. Others’ behavior reinforces expected behavior and, thus, in a collective sense, creates an environment of predictability and trust. Of course, any single individual’s act is not enough; the assumption is that everyone exhibits the desired behaviors, i.e., cooperates for the good of the collective. In cooperating with others, there is the force of self-interest—in the sense that you and your possessions are protected from violators. No harm is normally expected because the convention draws all actors to respect others’ rights—their duties—to hold the society in a predictable balance. Intelligibility has to do with the recognition of what others are doing, much like creating a brand reputation that implicitly declares the traits of the brand. Once intelligible, the force of convention influences the collective and makes everyone want to be part of the convention. When a (summary) rule becomes a practice rule, members of the society are expected not to violate the rule when pursuing their self-interest. Moral approbation imposes consequences of indiscretion on violators of rules. Finally, for the transition from paradigms to practice, the convention should become part of the common language of the entity. Without this transition, the convention remains opaque and may be subject to misinterpretation.
When good people behave in pathological ways that are alien to their nature, they are suffering from ethical blindness.7 Ethical blindness suggests the temporary inability of a decision maker to see the ethical dimension of a decision at stake. It is assumed here that people deviate from their own values and principles. Ethical blindness is context-bound and, thus, a temporary state. Ethical blindness is unconscious; the person suffering such blindness cannot access or does not use those values when making a decision.8 Doing the right thing begins with an awareness that an ethical question exists and this presupposes that the person is not ethically blind at the time.
Awareness of an ethical dilemma is the initial stage in which the questions of right and wrong first emerge. If you are not aware, you cannot recognize the problem and, therefore, cannot address it. However, awareness of the dilemma does not automatically mean that you will arrive at the right behavior. Moral awareness is a rational process that allows the person facing the dilemma to interpret it consciously. Unfortunately, people suffer from bounded rationality (bounded ethicality): Individuals miss the moral components of an ethical decision not because they are morally uneducated, but because they are cognitively limited and cannot make perfectly accurate decisions.9 A person facing an ethical dilemma should be committed to ethical conduct and should be able and willing to wade through the process of doing the right thing. This is where the leadership of the organization and the tone at the top come into play. Moral commitment should be motivated by strong ethical leadership, constant communication of the convention with relevant examples, and leadership’s willingness to provide help—all of these play a part in generating the right behavior by members of the organization.
As the NSA story suggests, restoring trust after an ethical lapse can be a nightmare. Doing nothing is not an option; immediate and swift action is necessary to restore trust, reinforce the convention and communicate the consequences of the violation. People learn from concrete examples of what happened and how the leadership dealt with the wrongdoing. Failure to enforce compliance will marginalize the convention and lead people to interpret the code according to their own beliefs. Although compliance is often no more than a corrective action and cannot be pursued without detecting the wrong in the first place, it still serves the important role of laying down consequences and confirming the rules of behavior.
Technological changes bring new risks and rewards. For example, bring your own device (BYOD) was not a significant question 10 years ago; today, it is an opportunity that comes with myriad risk factors. Nanotechnology, big data and virtual currencies are ushering in a whole host of questions, both technology- and business-related. Business models are changing at an unprecedented pace.
Change implies progress in tandem with new uncertainties. For change to result in a net positive, those uncertainties need to be identified and harnessed while leveraging the obvious benefits. Because of its proximity to the change, the entity that experiments with a new technology is morally responsible for leading the search for guidance on desired behavior. For example, enterprises such as Google and Facebook should set the tone as exemplars on issues of privacy, IBM and Amazon should help develop benchmarks for cloud services, and Apple and Samsung should show the way to right conduct in the deployment of mobile devices. If those at the frontier, who are fortunate enough to experience the new environment first, do not lead, regulators will likely step in.10 Because regulators may know little about the change and its consequences, the resulting compliance rules may prove largely counterproductive.
Some technological changes can be monitored and evaluated by a group of firms instead of a single corporation. The problems evident in the use of virtual currency (e.g., the case of Liberty Reserve) can be avoided if the players in the world of virtual currency unite and, at least initially, agree to self-regulate. Firms in the financial services industry know about programmed, fast trading and should lead the search for moral guidance. Nanotechnology start-ups should take responsibility for developing guidelines for their industry, for they know the most about their products, processes and the impact on norms of behavior for the common good.
Regardless of the locus of responsibility and leadership in the search for answers, two considerations are important. First, the entire value chain—from product research to after-market—should be carefully examined to ensure reasonable completeness of the solutions proposed. Second, whether the technology is mature, a new deployment of an existing technology, or one not previously explored, a systematic search using frameworks such as COBIT 5 can yield relatively stable and fruitful solutions.
1 Bird, F. B.; The Muted Conscience: Moral Silence and the Practice of Ethics in Business, Greenwood Publishing, 2002
2 De Cremer, D.; A. E. Tenbrunsel; M. van Dijke; “Regulating Ethical Failures: Insights From Psychology,” Journal of Business Ethics, 95:1-6, 2010
3 Albergotti, R.; “Questions Linger in Goldman Code Case,” Wall Street Journal, 14 June 2013, p. C1
4 Mulder, Laetitia B.; Rob M. A. Nelissen; “When Rules Really Make a Difference: The Effect of Cooperation Rules and Self-sacrificing Leadership on Moral Norms in Social Dilemmas,” Journal of Business Ethics, 95:57-72, September 2010
5 Kline, William; “Hume’s Theory of Business Ethics Revisited,” Journal of Business Ethics, 105:163-174, 2012
6 Ibid. Derived from Humean ethical precepts, Kline discusses these elements in depth.
7 Zimbardo, P.; The Lucifer Effect—Understanding How Good People Turn Evil, Random House, USA, 2007
8 Palazzo, G.; F. Krings; U. Hoffrage; “Ethical Blindness,” Journal of Business Ethics, 109:323-338, 2012
9 De Cremer, D.; A. E. Tenbrunsel; M. van Dijke; “Regulating Ethical Failures: Insights From Psychology,” Journal of Business Ethics, 95:1-6, 2010
10 Linton, J. D.; S. D. Walsh; “Introduction to the Field of Nanotechnology Ethics and Policy,” Journal of Business Ethics, 109:547-549, 2012
Vasant Raval, DBA, CISA, ACMA, is a professor of accountancy at Creighton University (Omaha, Nebraska, USA). The coauthor of two books on information systems and security, his areas of teaching and research interests include information security and corporate governance. Opinions expressed in this column are his own and not those of Creighton University. He can be reached at [email protected].
The ISACA Journal is published by ISACA. Membership in the association, a voluntary organization serving IT governance professionals, entitles one to receive an annual subscription to the ISACA Journal.
Opinions expressed in the ISACA Journal represent the views of the authors and advertisers. They may differ from policies and official statements of ISACA and/or the IT Governance Institute and their committees, and from opinions endorsed by authors’ employers, or the editors of this Journal. ISACA Journal does not attest to the originality of authors’ content.
© 2013 ISACA. All rights reserved.