Character Traits of an IT Professional 


In this issue’s column, we focus on the individual professional and strive to describe him/her in terms of virtuous traits that help in his/her professional role. It is beyond debate that a professional is not just an aggregation of cognitive knowledge and skills; a professional is a more holistic person with many more traits. While we have some knowledge of how organizations foster ethical practices (e.g., codes of ethics),1 we have not yet attempted in this column to take a peek at the individual professional: What would it take for an IT professional to be virtuous in his role? This question is too complex to explore fully in a column, but an attempt can be made to gain a basic understanding of this landscape. Admittedly, we aim to generate more thought; this is not a thorough treatise on virtue ethics of an IT professional at work.

Since we are now talking about individuals, not organizations, we can bring into our discussion not just the externalities of the world of work and its rules of behavior, but also a broader picture of who we are: our inner being, the traits that play a key role in defining our norms of behavior at work.

Virtue ethics focuses on the moral character of the person, in contrast to the emphasis on rules (in deontology) or on the consequences of action (in utilitarianism). The term “virtue” derives from the Latin word virtus, which over time came to denote moral excellence or worth. Virtue ethics encompasses three broad spheres: virtue, the expression of a stable disposition or character; practical wisdom, or prudence, exercised by an individual; and eudaimonia, the sense of well-being or flourishing. Our interest is in examining the second sphere, prudence, the origin of role-based traits.


A person’s disposition is his/her underlying constitution, which bears upon the individual’s decisions at all levels: physical, mental, moral and spiritual, for example. The disposition cannot be turned on or off; it influences all aspects of life—both private and at work. Disposition is permanent (e.g., glass is fragile), while an occurrence exhibiting the disposition (glass, upon the impact of a stone, is shattered) is occasional.2 You do not see a habitual smoker smoking all the time. On the other hand, a single observed action is, by itself, insufficient to conclude anything about the character of the person. Finally, disposition is not a habit; it is the source that drives actions from within. Proponents of virtue ethics would argue that the moral education that molds one’s disposition is more important than the inculcation of rules, for rules may not have the requisite variety or complete context within which exceptions may be made. This broadening of norms of behavior is often seen as extending into the wider territory of morality, beyond a somewhat bounded definition of ethics.3

A person’s character establishes a reason for action beyond any rules. An honest person is wired to practice nothing but honesty, even in acts where rules are vague or absent; where no one is looking; and where no laws, regulations or policies are violated.


A virtue ethics framework for a professional includes meta-virtues (moral or scientific virtues) as well as role-constituted (prudential) traits. Meta-virtues (e.g., integrity) are truly the foundation of one’s character. However, their explication in terms of the practice of a professional produces specific character traits, called role-constituted traits (e.g., integrity expressed in the level of transparency in communication). Thus, meta-virtues not only enhance the good of the professional practice, but also help delineate other practical traits, i.e., traits that are practically visible in action. Such role-constituted or prudential traits emerge from what is called phronesis, or the virtue of practical thought, prudence or wisdom. Any prudent behavior warrants a grasp of particulars (or the context), cleverness in dealing with the situation, insights beyond the surface and a deeper understanding of the consequences of one’s action. The absence of prudence may not make one unethical, but would in all likelihood make one ineffective. A professional wants to be ethical and effective. Preserving the privacy of information, for example, does not mean blocking private data from all eyes. The latter is a drastic solution devoid of prudence.

So, the next question is: Where can I find a list of role-based traits pertinent to my professional role? This would help me see what traits are warranted in me as an IT professional. According to Shannon Vallor, “… no account of moral action can be complete without attending to the specific role of virtues in directing and motivating such actions, and…the applied context of IT ethics is a uniquely suitable domain for illustrating that necessity.”4 Vallor sketches several role-based traits relevant in the context of the applications of social networking technology: patience, honesty, empathy, fidelity, reciprocity and tolerance.5 Taking patience, for example, Vallor explains that patience is one of the most important virtues for sustaining close relationships and it develops through communicative activities such as listening. As an enduring element of one’s own character, it allows “a feeling on the part of others that you are willing to connect with them on their terms and not just yours, that your interest in them does not end with their ability to keep you constantly pleased or fascinated.”6 If you were a manager responsible for social networking technologies and their applications, you in all likelihood would be interested in patience as a trait, how it plays out in a social network, how to nurture user patience and what design features would facilitate these aims. Most IT managers at Twitter or LinkedIn, for example, would benefit from a deeper comprehension of patience as a trait.

The best way to understand the role of virtue ethics, particularly role-based traits, in the information systems domain is to view every system as essentially an allocation of tasks between humans and machines based on their respective comparative advantage.7 Thus, all such systems are sociotechnological systems, in which “the reciprocal relationships of causation between technological structure and human agencies [occur, which] can account for the way the affordances of technological systems and the motivations and capacities of human users come together to determine moral outcomes.”8

Technology as an Enabler

As a technology professional, most of the time you are probably looking at technology as an enabler of ideas, visions, hopes and aspirations. These end goals can be good or bad. For most of us, the overwhelming perception—if not conviction—is that we are working on delivering something good—good for the business and good for the consumer of the end products and services that the technology will enable. The third sphere of virtue ethics—eudaimonia, the sense of flourishing—also comes into play here as we think of the well-being of the company, the community and societies in general. You would see this thinking in practice if, for example, you were working on an innovation in health care.

With specific reference to technology as an enabler, one example is the small telematics device promoted by auto insurance companies.9 If an auto driver opts to install the device in his car, he gets feedback regarding his driving behavior. The insurance company would reward him with a usage-based insurance premium, which may be much lower than a premium determined in the traditional way, based on broad measures of the driver’s risk (the insurer’s loss potential).

As an IT manager responsible for driving this technology project, you would work with managers on the business side as well as on the technology side. You would want to capture the most relevant traits of a good driver. For this, you may need to capture the local geography, turns and roundabouts, signal lights, and stop signs along the way. And yet, you are concerned that the telematics device does not reveal the exact location of the driver, that it does not capture GPS map locations along the way. Striking a balance between what is relevant for the business decision and what is fair treatment of the driver (e.g., in terms of privacy of locations visited) is what you would seek. This product would likely raise considerable concern about the driver’s privacy because of the huge amount of shareable data. You would want to minimize these concerns and yet be truthful and transparent in your communication. These are all signs of the role-constituted traits that you need to engage to deliver a holistic solution to the telematics project. Such traits may not be apparent from reading the code of ethics; nevertheless, their consideration is central to an IT manager acting as a moral agent in the project. If the only forces at work were marketing and revenue forecasts, the manager would likely fall far short of delivering goods that satisfy the sense of well-being (eudaimonia).
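The data-minimization balance described above—capturing driving behavior without retaining location—can be sketched in code. The following is a minimal, hypothetical illustration (the event types, penalty weights and class names are invented for this sketch and do not come from any actual insurer’s system): a trip summary records only aggregate event counts, deliberately discarding any GPS fix at the moment of ingestion, and derives a risk score from what remains.

```python
from dataclasses import dataclass, field

# Hypothetical event types a telematics device might report.
HARD_BRAKE = "hard_brake"
SPEEDING = "speeding"
SHARP_TURN = "sharp_turn"

# Illustrative penalty weights per event type (assumed, not actuarial).
WEIGHTS = {HARD_BRAKE: 3, SPEEDING: 5, SHARP_TURN: 2}

@dataclass
class TripSummary:
    """Retains only aggregate counts; raw GPS fixes are never stored."""
    miles: float = 0.0
    event_counts: dict = field(default_factory=dict)

    def record_event(self, event_type: str, gps_fix=None):
        # The location (gps_fix), if supplied by the device, is
        # deliberately discarded here: only the count survives.
        self.event_counts[event_type] = self.event_counts.get(event_type, 0) + 1

    def risk_score(self) -> float:
        """Weighted events per 100 miles; lower is better."""
        if self.miles == 0:
            return 0.0
        penalty = sum(WEIGHTS.get(e, 1) * n
                      for e, n in self.event_counts.items())
        return round(penalty / self.miles * 100, 2)

trip = TripSummary(miles=50.0)
trip.record_event(HARD_BRAKE, gps_fix=(41.26, -96.01))  # fix is dropped
trip.record_event(SPEEDING)
print(trip.risk_score())  # (3 + 5) / 50 * 100 = 16.0
```

The design choice of discarding the GPS fix at the ingestion boundary, rather than filtering it later, is one concrete way the prudence discussed above becomes visible in an architecture: the business still gets a usable risk signal, while the driver’s movements never enter storage at all.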

Motivations and Capacities of Human Users

Technology as an enabler is only a beginning. The reality is that the human user, motivated by incentives and fears, could respond differently from what was originally expected. The voluminous amount of data collected by the telematics device is a cause for concern; the company might be legally required to provide such data to courts of law, for example. Some may reject the idea outright because of “big brother” fears, while others are tempted by the discounts they might earn. Some may feel that the system would help them become better drivers, as they respond to feedback from the system and change their driving habits. Drivers may also ponder the long-term consequences. For example, if a vast majority were to become better drivers over time, would this result in lower income to the insurance company, which may prompt increases in premiums? Or if a driver cannot improve his habits, would it mean that his premium may increase in the long run, although there is no penalty imposed at this time?

Putting out a good product that delivers its stated objectives is challenging; the bigger challenge, however, is anticipating and responding to user behaviors. While aiding the company to prosper financially through business growth, the bigger idea is to reward safe drivers and coach, or punish, the bad ones. In the end, the number of lives that can be saved could spell the insurance industry’s moral bottom line. In that tally, one hopes, the managers of the project will have played a pivotal role—using the role-constituted traits implicit in the company’s generic-looking code of ethics.


1 Raval, V.; “Where the Rubber Meets the Road,” vol. 5, ISACA Journal, 2013
2 Ryle, G.; The Concept of Mind, Hutchinson & Company, UK, 1949
3 Some may think of ethics as avoiding the problem, thinking more in terms of complying with the rules of the organization (including the code of ethics) and the laws of the land. The assumption is that behavior that satisfies the rule book is ethical.
4 Vallor, S.; “Social Networking Technology and the Virtues,” Ethics and Information Technology, vol. 12, 2010, p. 160
5 Ibid., p. 164
6 Ibid., p. 165
7 Raval, V.; “A Conceptual Model of a Curriculum for Accountants,” The Journal of Information Systems, Fall 1988, p. 132-152
8 Op cit., Vallor, p. 158
9 Scism, L.; “State Farm Is There: As You Drive,” Wall Street Journal, 5 August 2013, p. C1-C3

Vasant Raval, DBA, CISA, ACMA, is a professor of accountancy at Creighton University (Omaha, Nebraska, USA). The coauthor of two books on information systems and security, his areas of teaching and research interests include information security and corporate governance. Opinions expressed in this column are his own and not those of Creighton University. He can be reached at


The ISACA Journal is published by ISACA. Membership in the association, a voluntary organization serving IT governance professionals, entitles one to receive an annual subscription to the ISACA Journal.

Opinions expressed in the ISACA Journal represent the views of the authors and advertisers. They may differ from policies and official statements of ISACA and/or the IT Governance Institute and their committees, and from opinions endorsed by authors’ employers, or the editors of this Journal. ISACA Journal does not attest to the originality of authors’ content.

© 2013 ISACA. All rights reserved.

Instructors are permitted to photocopy isolated articles for noncommercial classroom use without fee. For other copying, reprint or republication, permission must be obtained in writing from the association. Where necessary, permission is granted by the copyright owners for those registered with the Copyright Clearance Center (CCC), 27 Congress St., Salem, MA 01970, to photocopy articles owned by ISACA, for a flat fee of US $2.50 per article plus 25¢ per page. Send payment to the CCC stating the ISSN (1526-7407), date, volume, and first and last page number of each article. Copying for other than personal use or internal reference, or of articles or columns not owned by the association without express permission of the association or the copyright owner is expressly prohibited.