ISACA Journal
Volume 2, 2014 

Features 

Why Computer Ethics Matters to Computer Auditing 

Wanbil W. Lee, DBA 

Unethical use of the computer1 is harmful to all—perpetrators and victims—and more than likely backfires on those who make use of computer facilities for unethical exploits. Failing to identify ethical issues surrounding the computer or neglecting these issues runs the risk of adverse consequences, and, especially in the case of computer auditing, it means a dangerously incomplete audit decision.

“Hardware, software and people are all sources of difficulties; people [the computer users] who appear to be cognizant of the risks involved [being] the most troublesome.”2 It is fair to say that people are the major source of risk and any decision to abuse or not to abuse depends entirely on a person’s sense of morality. Furthermore, “People need something to awaken their sense of responsibility because information regarding computing risks and safe practices does little to significantly change behavior.”3 Computer ethics is that “something,” and the connection between computers and ethics is obvious.

Computer auditing aims to discourage, detect and prevent information risk by enforcing compliance and governance rules and standards, whereas computer ethics attempts to minimize this risk by warning people of the potential adverse consequences of unethical actions. This shared aim of risk aversion unites computer ethics and computer auditing.

An audit decision without ethical consideration will lead to intangible, adverse consequences in the short and long term. Information systems (IS) auditors should regard the neglect of ethical issues associated with the use of information systems as a different type of risk that could be averted through a daily, routine check—a different kind of antirisk mechanism (compared with the other established antirisk countermeasures, e.g., passwords, cryptographic algorithms, antivirus software). IS auditors could also use the concept behind computer ethics to perform a dual function: a change-agent function that persuades people to rectify behaviors and attitudes, and an incubator function that cultivates trusting relationships with clients and maintains a professional reputation in the industry.

Connecting the Computer and Ethics

Computer ethics has been given different labels including cyberethics, information ethics and, one of the more noticeable recent terms, sociotechnical computer ethics. The latter is attributed to the three recommendations in response to the three mistakes of science, technology and society.4 In this article, “computer ethics” is chosen for the reason that the core issues involved are rooted in the computer no matter how sophisticated and complicated the modern facilities, e.g., the Internet, social media.

Like ethics, computer ethics may mean different things to different people. In one often quoted definition, “Computer ethics is the analysis of the nature and social impact of information and communication technology, and the corresponding formulation and justification of policies for the ethical use of such technology.”5 In this article, it is taken to mean ethics in cyberspace.6 This simple definition implies that computer ethics is concerned with issues of ethical consequences arising out of using the computer and its peripherals, particularly the Internet, and that, while many of the threats are not possible without the computer, the computer becomes the culprit (or an instrument) and the victim (or a target).

The uniqueness of this relatively new field has been hotly debated and widely reported in literature.7 The uniqueness camp argues that things such as software, web sites and online video games never existed before the arrival of the computer and these new things created new ethical dilemmas that are beyond the existing ethical rules or standards and justify the creation of a branch of applied ethics.

Connecting Computer Ethics and Computer Auditing

Of the several implications that computer ethics has for computer auditing, the chief is this: computer ethics supplies rules and regulations derived from ethical principles and theories in an attempt to prevent or minimize abuse in cyberspace, while computer auditing primarily offers preventive and, to some extent, detective techniques aimed at ensuring compliance with rules, regulations and governance. This shows the common interest, and the differing objectives, of computer ethics and computer auditing.

Moreover, failing to identify, or excluding, the ethical issues surrounding the computer when performing risk assessment, despite sound cost and benefit justification, is a different type of risk (vis-à-vis the risk arising out of access control malfunction, identity theft and so on), as it can lead to potentially adverse consequences, such as loss of trust or collapse of the business. For example, in assessing softlifting (illegal copying of software for personal use) or unauthorized access to sensitive information, the IS auditor should consider the damage due to personal use of sensitive information, not just concentrate on technical access control mechanisms. Considering the consequences in total may yield a different evaluation of the damage. The difference may be attributed to the exclusion of the cost of losing the confidence of the data owners (internal and external clients) and of impairing the provider-client relationship, as well as payment for compensation and expenses for legal proceedings. Ethics should, therefore, be treated as an additional type of risk that IS auditors check routinely. Checking for potential ethical issues by performing an ethical analysis, for example, may reveal risk that leads to ethical consequences but would otherwise be missed in the traditional antirisk checks and audits. The fact that an ethical analysis adds a step or steps to the routine makes computer ethics a different kind of antirisk mechanism, as alluded to earlier.
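The difference such a total evaluation makes can be sketched numerically. The following is a minimal illustration; every cost figure and category name is hypothetical, invented for the sketch, not taken from the article.

```python
# Hypothetical sketch: the damage from, e.g., unauthorized access to sensitive
# information looks very different once ethical consequences are costed in.
# All figures and category names below are invented for illustration.

def total_damage(cost_components):
    """Sum every cost component of an incident, tangible and intangible."""
    return sum(cost_components.values())

incident = {
    "technical_recovery": 20_000,      # fixing the access control malfunction
    "compensation_and_legal": 50_000,  # compensation payments, legal expenses
}

# A purely technical assessment stops at the tangible items.
technical_only = total_damage(incident)

# An ethical analysis adds the costs that are usually excluded:
incident["lost_data_owner_confidence"] = 150_000       # internal and external clients
incident["impaired_provider_client_relationship"] = 80_000

in_total = total_damage(incident)
print(technical_only, in_total)  # 70000 300000
```

The point is not the invented numbers but the shape of the calculation: the excluded, intangible items can dominate the tangible ones.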

The variety, frequency and incidence of abuses are still increasing, despite the extant antirisk mechanisms, computer laws, and even computer auditing and information governance. This increase in abuse arguably can be attributed to several major reasons:

  • Computer-related risk is, as alluded to earlier, treated as a technical issue, missing its managerial, sociotechnical aspects.
  • New tools are rendered impotent by their inherent limitations, external factors, unexpected changes and the ever-lurking predators almost as soon as they become available.8, 9
  • The law is useful as a reference framework for remedy and can be a forceful deterrent by virtue of its power to punish, but it is by nature too slow to combat rapidly developing wrongdoing: creating new laws is a complex and lengthy process, and people are, in general, reluctant to pursue legal action or do so only as a last resort.
  • The audit-based mechanism is limited as a deterrent: audit tools are empowered to detect deviation from a set performance policy and to verify that compliance is properly enforced and results are consistent with the set standard, but in physical, cost-benefit terms only.

As revealed in an exploratory study,10 which ran an annual survey from 2006 to 2012, knowledge of ethics tends to yield an effect of reducing people’s inclination toward abusive behavior, thus a reduction in the new type of risk (or threats to abuse the development and use of information). The participants were part-time, final-year students for an undergraduate award in computing at The Hong Kong Polytechnic University, aged from early 20s to early 40s in regular full-time employment in the computer industry, with two to 15 years of experience ranging from technical to managerial positions. The survey was carried out at the beginning and at the end of a compulsory course that included the topics of ethics and professionalism, using the same questionnaire. Three questions to which a yes/no indication was sought were set for the beginning of the course, and five for the end of the course. The annual number of participants over the entire period of the survey averaged 95 students. The responses to each question (summarized in figure 1) have been consistent over the seven consecutive years (p > .05) and indicate that less than 10 percent of students were aware of computer ethics. Further, more than 60 percent claimed that they were not sure if they carried out their work ethically and, conversely, about 30 percent claimed that they thought they carried out their work ethically.

Computer ethics aims to cultivate good behavior in cyberspace; image and reputation depend on proven fairness (in conducting business and personal affairs) and on the confidence others place in us. There will be no fairness or confidence unless people consistently honor promises and stand by the consequences of their words and deeds by closely adhering to ethical principles.

It is a common interest and a set of different objectives that connects computer ethics and computer auditing. As a soft, humanistic methodology based on ethical principles, computer ethics aims to supplement computer auditing.11

Ethical Analyses and Decision

An ethical decision is simply a binary choice of right or wrong, or good or bad. It can, however, be complex because of the inevitable bias, prejudice and subjectivity of the decision maker, who is never free from interfering and competing factors, nor is any decision maker free of value judgment. To guide decision makers through the labyrinth of often conflicting ethical views, various models have been contemplated and developed. Most notable are the menu-like checklists, including the four-step approach,12 the Five-step Principled Reasoning,13 the Seven-step Path for Making Ethical Decisions14 and the Ethical Matrix.15

Briefly, the checklists cover more or less the same ground and guide users through the predetermined steps, with the major steps being:

  • Analyze the situation.
  • Make a defensible ethical decision.
  • Describe steps to resolve the situation.
  • Prepare policies and strategies to prevent recurrence.
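For concreteness, the four major steps above can be sketched as a trivial walk-through. The function and field names below are my own scaffolding, not part of any of the cited checklists.

```python
# Minimal sketch of a menu-like checklist walk-through. The four steps mirror
# the list above; the data recorded at each step is purely illustrative.

def run_checklist(situation):
    """Guide a user through the four major checklist steps, recording each."""
    record = {"situation": situation}
    # 1. Analyze the situation: identify stakeholders and the ethical issues.
    record["analysis"] = {"stakeholders": [], "ethical_issues": []}
    # 2. Make a defensible ethical decision, with its justification.
    record["decision"] = {"choice": None, "justification": None}
    # 3. Describe steps to resolve the situation.
    record["resolution_steps"] = []
    # 4. Prepare policies and strategies to prevent recurrence.
    record["prevention"] = {"policies": [], "strategies": []}
    return record

record = run_checklist("surveillance function requested beyond its approved purpose")
```

The value of such a scaffold is that no step can be silently skipped: the record is incomplete, and visibly so, until every field is filled in.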

The ethical matrix was originally designed for making ethical decisions in the field of food and agriculture and, since its release, has been adapted for other fields, for example, business16 and information engineering.17 The matrix is made up of three columns and as many rows as the particular case needs. A row is allocated to a stakeholder; the three columns correspond to three common ethical principles—well-being, autonomy and fairness—and the cells contain the concerns of the stakeholders (the main criterion that should be met with respect to a particular principle).

The following case illustrates the matrix method in action: A high-tech facilities distributor replaced its existing offline help-desk platform with an online monitoring facility, and the new system enables help-desk staff to see exactly what is on the users’ screens and respond to users’ requests for help quickly. The fast response time and the elimination of the help-desk staff’s traveling time (to reach the users) have increased user satisfaction and operational efficiency. The executive vice president (EVP) is impressed, particularly with the surveillance function, and has asked the chief information officer (CIO) to have a copy of the system installed in her office, as she has been searching for a tool to deal with drug dealing allegedly occurring on company premises. The first-cut analysis identifies four interest groups (the firm, the staff, the EVP and the CIO) and reveals two major sets of concerns for each interest group with respect to the ethical principles (summarized in figure 2):

  1. Watching (by the help-desk staff) over the users’ screens without the users knowing it has the potential to violate personal privacy in the work place. This certainly alarms the staff (for fear of privacy invasion) and the firm (for concern over corporate image and personnel welfare).
  2. Entertaining the EVP’s request implies using the surveillance function for spying on the staff and deviates from the original, approved purpose for acquiring the new system. This disturbs the CIO (with respect to professionalism and deontological issues), the EVP (on her duty to protect the corporate image from possible damage), the staff (about being exploited by one or a few rotten apples) and the firm (about staff morale and company policy).
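Sketching the matrix for this case makes its structure concrete. Below, a dict of dicts holds one row per stakeholder and one column per principle; the cell wording is my own condensation of the concerns described above, not the actual contents of figure 2.

```python
# Ethical matrix for the help-desk case: rows are stakeholders, columns are
# the three common ethical principles, cells hold each stakeholder's concerns.
# Cell wording condenses the two sets of concerns described in the text.

PRINCIPLES = ("well-being", "autonomy", "fairness")

matrix = {
    "firm": {
        "well-being": "personnel welfare and staff morale",
        "autonomy": "company policy on use of the monitoring facility",
        "fairness": "corporate image if covert surveillance becomes known",
    },
    "staff": {
        "well-being": "fear of privacy invasion in the workplace",
        "autonomy": "screens watched without the users knowing",
        "fairness": "being exploited because of one or a few rotten apples",
    },
    "EVP": {
        "well-being": "duty to deal with alleged drug dealing on premises",
        "autonomy": "repurposing a system beyond its approved use",
        "fairness": "duty to protect the corporate image from damage",
    },
    "CIO": {
        "well-being": "professional standing",
        "autonomy": "deviation from the original, approved purpose",
        "fairness": "professionalism and deontological obligations",
    },
}

# A completed matrix has every cell filled for every stakeholder.
complete = all(set(row) == set(PRINCIPLES) for row in matrix.values())
```

The point of the structure is the completeness check at the end: the matrix forces the analyst to consider every stakeholder against every principle before reaching a decision.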

Figure 2

Conclusion

Some unethical actions, whether unintentional or otherwise, can get one into trouble with the law, while other unethical actions, though legal, can ruin reputations, even careers. Furthermore, some unethical actions can be carried out faster with the computer while other unethical actions may have been difficult or infeasible without the computer, but are now possible. As the usage of computers proliferates, individuals and organizations are becoming increasingly vulnerable.

The more professionals, including IS auditors, appreciate ethical principles, the more capable they are of discerning good from bad. As a result, they improve their chances of arriving at a desirable and defensible position, for example, against allegations of being unprofessional. This will help them do the right thing in cyberspace,18 build up their professional reputation and win the trust19 of their clients.

Reflecting on the ethical analysis and decision-making models, the menu-like checklists provide a well-structured, user-friendly guide through a series of structured steps: analyzing the dilemma with respect to explicitly specified theories/principles and, in the end, taking certain actions or answering certain questions. The ethical matrix, though designed for making ethical decisions in the field of food and agriculture, can be adapted for other fields with appropriate adjustment.

Risk is not only a tangible, technical issue, but also a sociotechnical one—not only a technical task, but also a managerial concern. Hence, cost-benefit analysis (which is based on bottom-line, economic reasoning) and risk analysis (which relies on probabilistic, computational arguments) are inadequate on their own. IS auditors should, therefore, look to a tripartite analysis model, a new decision analysis paradigm that comprises cost-benefit, risk and ethical analyses.
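As a closing sketch of how the tripartite model differs from the usual pair of analyses, the toy function below screens a decision through all three; the inputs, scoring scheme and figures are hypothetical, invented purely for illustration.

```python
# Toy sketch of the tripartite analysis paradigm: cost-benefit, risk and
# ethical analyses must all pass before a decision is approved. The inputs
# and thresholds are invented for illustration only.

def tripartite_assessment(net_benefit, expected_loss, unresolved_ethical_issues):
    """Screen a decision through all three analyses of the tripartite model."""
    verdicts = {
        "cost_benefit": net_benefit > 0,          # bottom-line, economic reasoning
        "risk": expected_loss < net_benefit,      # probabilistic, computational argument
        "ethics": not unresolved_ethical_issues,  # any unresolved issue blocks approval
    }
    verdicts["approve"] = all(verdicts.values())
    return verdicts

# Profitable and low-risk, yet blocked by the ethical analysis:
result = tripartite_assessment(
    net_benefit=100_000,
    expected_loss=20_000,
    unresolved_ethical_issues=["covert monitoring of staff screens"],
)
print(result["approve"])  # False
```

The design choice worth noting is that the three verdicts are conjunctive: a decision that passes the cost-benefit and risk analyses alone, as in the example, still fails the tripartite screen.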

Endnotes

1 “Computer” is used in this article to encompass the computer-based information system and its associated hardware, software, networking, enabling facilities and people.
2 Neumann, P. G.; Computer Related Risk, ACM Press/Addison-Wesley, USA, 1995
3 Lee, Wanbil W.; Keith C. C. Chan; “Computer Ethics: A Potent Weapon for Information Security Management,” Information Systems Control Journal, JournalOnline, vol. 6, December 2008, www.isaca.org/journalonline
4 Johnson, D. G.; Computer Ethics, 4th Edition, Prentice Hall, 2009
5 Moor, J. H.; “What Is Computer Ethics?,” Metaphilosophy, vol. 16, no. 4, October 1985, p. 266-275
6 The Computer Ethics Society, iEthics, www.iEthicsSoc.org
7 Op cit, Johnson
8 Hitchings, J.; “Deficiencies of the Traditional Approach to Information Security and the Requirements for a New Methodology,” Computers & Security, vol. 14, no. 5, 1995, p. 377-383
9 Lee, Wanbil W.; Information Security Management: Semi-intelligent Risk-analytic Audit, VDM Verlag, 2010
10 Op cit, Lee and Chan, 2008
11 Op cit, Lee, 2010
12 Kallman, E. A.; J. P. Grillo; Ethical Decision Making and Information Technology, McGraw-Hill, 1996
13 Josephson Institute of Ethics, “Five Steps of Principled Reasoning,” 1999, www.ethicsscoreboard.com/rb_5step.html
14 Josephson Institute of Ethics, “Making Ethical Decisions: A Seven-step Path,” 2002, http://blink.ucsd.edu/finance/accountability/ethics/path.html
15 Mepham, B.; M. Kaiser; E. Thorstensen; S. Tomkins; K. Millar; Ethical Matrix Manual, LEI, The Hague, 2006
16 Fejfar, Anthony J.; Social Responsibility of Business, Business Ethics, and the Ethical Matrix, 2006, www.lulu.com
17 Lee, W. W.; Becky Shiu; “Ethics Conducive to a Positive Image of the Information Engineer,” The Journal of the Hong Kong Institution of Engineers, vol. 40, December 2012, p. 13 and 15, http://iEthicsSoc.org/publications.html
18 Op cit, Johnson
19 Schneier, Bruce; Liars and Outliers: Enabling the Trust That Society Needs to Thrive, John Wiley & Sons, 2012

Wanbil W. Lee, DBA, is principal director of Wanbil & Associates, president of The Computer Ethics Society and adjunct professor in computing at the Hong Kong Polytechnic University (Kowloon, Hong Kong). He has devoted more than five decades to serving the field of computing in the banking, government and academic sectors through various roles in Australasia and Hong Kong. His current teaching and research interests focus on ethical computing and information security. He is a member of several learned societies and sits on committees/boards of some of these bodies, editorial boards, and advisory committees of the Hong Kong government.


Opinions expressed in the ISACA Journal represent the views of the authors and advertisers. They may differ from policies and official statements of ISACA and from opinions endorsed by authors’ employers or the editors of the Journal. The ISACA Journal does not attest to the originality of authors’ content.