An important attribute of IT governance is the development and maintenance of the capability to perform key IT processes. The IT function, working with the rest of the organization, must build a variety of capabilities to meet organizational and strategic objectives. Formalized methods for the identification and development of process maturity have been in existence for many years through the work of the Software Engineering Institute’s (SEI) Capability Maturity Model (CMM) and Capability Maturity Model Integration (CMMI). Achievement of process maturity is also a core element of the IT Governance Institute (ITGI)’s Control Objectives for Information and related Technology (COBIT), with a somewhat modified version of CMM playing a key role.
Process maturity in COBIT is measured on a scale of zero to five (zero being nonexistent, one being initial and ad hoc, two being repeatable but intuitive, three being defined, four being managed and measurable, and five being optimized). At level one, the COBIT framework notes that “there is evidence that the enterprise has recognized that the issues exist and need to be addressed. There are, however, no standardized processes; instead, there are ad hoc approaches that tend to be applied on an individual or case-by-case basis.” At the other end of the continuum, at level five, COBIT notes that “processes have been refined to a level of good practice, based on the results of continuous improvement and maturity modeling with other enterprises.”
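For illustration only (this lookup is not part of COBIT itself), the six-point scale can be sketched as a simple table in Python; the labels follow the level names given above, and the function name is an assumption:

```python
# COBIT's generic maturity scale, level 0 through level 5, as a lookup.
# Labels follow the maturity model summarized above.
COBIT_MATURITY = {
    0: "Nonexistent",
    1: "Initial/Ad Hoc",
    2: "Repeatable but Intuitive",
    3: "Defined",
    4: "Managed and Measurable",
    5: "Optimized",
}

def describe_level(level: int) -> str:
    """Return the COBIT maturity label for an integer level 0-5."""
    if level not in COBIT_MATURITY:
        raise ValueError(f"Maturity level must be 0-5, got {level}")
    return COBIT_MATURITY[level]
```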
Chief information officers (CIOs) and other executives know that it does not make economic sense to be at level five maturity for every IT process, because the benefits of being at level five for every process would not justify the costs of achieving and maintaining that level of maturity. One would expect process maturity levels to vary with different IT processes, IT infrastructures and industry characteristics. For example, level two may be adequate for one IT process, but inappropriate for another, more critical IT process.
Differences in maturity come from factors such as the risks facing the organization and the contribution of processes to value generation and service delivery. IT managers must ask, “Where should we be for our key processes?” or, at least, “How do we compare to our peers?”
ITGI provided funding for a large-scale international field study to develop quantifiable IT governance benchmarks. The study used COBIT’s IT processes and its definition of process maturity as the foundation for data collection. Fifty-one IT organizations were visited in Asia, Europe and North America, and process maturity data were collected from the owners of each process described in COBIT. Where data were collected from more than one person for a given process, the between-person variation was typically within one level of maturity. These data are, of course, self-reported and subject to bias, and the responses could not be validated independently; however, the number of level zero and level one responses received indicates that the respondents were candid in the information provided. When a manager was interviewed on several processes, it was particularly important to ensure that maturity levels were measured correctly by asking probing follow-up questions, such as, “How is management’s awareness of this process communicated to the IT organization?” and “What are some of the tools and technologies supporting this process?” These detailed process maturity data were coupled with demographics about the organization and the IT function.
COBIT groups IT activities into 34 processes within four logical domains. Based on prior evaluations of process performance, five of the COBIT processes were divided into subprocesses because of the complexity and importance of the individual process (e.g., DS5 Ensure systems security) or because of markedly different concepts embedded within the process (e.g., the data classification and enterprise architecture concepts within PO2 Define the information architecture). As a result, a total of 41 processes were used for the project, as shown in figure 1.
These processes encompass the complete life cycle of IT investment, from strategic planning to the day-to-day operations of the IT function. COBIT recognizes that fulfilling the objectives of the enterprise requires development of systematic capabilities to deliver results on each of the IT processes. These capabilities require a combination of human, software and hardware resources bound together in a policy and procedure structure. Each of these resources requires careful monitoring through a collection of metrics and review to ensure that any given process is continuing to meet ongoing demands. As a result, COBIT recognizes that there are a number of dimensions or attributes of process maturity (figure 2).
In addition to collecting maturity levels, a separate questionnaire was used to interview the CIO to collect IT governance and demographic information for the organization. A wide variety of issues that had previously been identified as relevant to the study of IT governance was investigated. These included:
- The nature and extent of strategic and tactical alignment between IT and the rest of the organization
- The structure of the IT function (e.g., centralized, decentralized or so-called federal modes)
- Adoption of IT governance processes and frameworks and demographic data on size, industry, spending, etc.
The organizations studied were relatively large, because the study sought organizations actively involved in all the COBIT IT processes. They averaged 172 IT staff and 3,132 clients or workstations (with a maximum of 15,000 clients). Most organizations in the study had mixed IT environments, with 98 percent using Wintel servers and 94 percent using UNIX. Mainframes were also used to varying degrees by a large minority (34 percent) of the sample organizations. Most organizations (74 percent) had centralized IT governance, only 6 percent indicated a decentralized structure, and 20 percent indicated a federal structure. Although the extent of outsourcing varied widely, 76 percent and 65 percent of respondents outsourced some aspects of their software and hardware functions, respectively. In terms of IT governance and IT management frameworks, COBIT and the IT Infrastructure Library (ITIL) were used in the organizations studied; however, only 16 percent and 10 percent were intensive users of COBIT and ITIL, respectively. Only three organizations (6 percent) said that they thoroughly followed both COBIT and ITIL.
Figure 3 illustrates the results for each of the 34 processes in box plots.1 Three key points are evident in the figure. First, even large organizations can have level zero for some processes. Second, levels four and five are achievable. Third, taken as a whole, the Monitor and Evaluate (ME) domain had the lowest median levels and the lowest consensus (widest distribution of responses).
For the Plan and Organize (PO) domain, PO7 Manage IT human resources had the highest median level, while PO2 Define the information architecture and PO8 Manage quality shared the lowest. Both PO2 and PO8 require systemic change and significant emotional and monetary investment to achieve higher levels of maturity.
For the Acquire and Implement (AI) domain, AI5 Procure IT resources had the highest median level and AI4 Enable operation and use had the lowest. The level for AI5 is not surprising, given the maturity of corporate acquisition and procurement processes with which the IT function interacts. Conversely, AI4 extends to zero, which means that some interviewees believed that their IT shop had not even achieved the most basic level one maturity.
For the Deliver and Support (DS) domain, DS12 Manage the physical environment had the highest median level and DS1 Define and manage service levels had the lowest. Both DS2 Manage third-party services and DS4 Ensure continuous service have whiskers extending to zero. On the other hand, DS6 Identify and allocate costs, DS7 Educate and train users, and DS12 have whiskers that extend almost to five.
For the Monitor and Evaluate (ME) domain, the median levels were relatively low compared to the other domains. Based on the interviews, many organizations had considerable difficulty in setting up systematic and formal monitoring processes. ME1 Monitor and evaluate IT performance had the highest median, and the other processes were clustered near level two. The ME processes had the widest distribution (lowest consensus). For ME2 Monitor and evaluate internal control and ME3 Ensure compliance with external requirements, the whiskers extend from zero to nearly five.
Individual Attributes Across Processes
As mentioned earlier, COBIT divides process maturity into six attributes. Reviewing the general maturity levels for the six attributes across processes, the extremes are quite dramatic. The awareness attribute was in the top third for 68 percent of the processes and in the lowest third for only 5 percent. The goal attribute was in the lowest third for 68 percent of the processes and in the top third for only 2 percent. The other relatively high-level attribute was responsibility, which was in the top third for 51 percent of the processes. The other relatively low-level attribute was tools, which was in the bottom third for 56 percent of the processes.
Overall Organizational Performance
Next, the researchers explored the consistency in maturity across processes. First, each organization’s maturity was ranked from 1 to 51 for each process. Second, the average rank across all processes was calculated for each organization, along with the variation in those ranks. Figure 4 illustrates the outcome: there are four distinct groups. The eight organizations in group D had consistently high levels across all processes, while the five organizations in group A had consistently low levels. The bulk of the organizations fell into group C, ranked in the middle without a great deal of variation. Group B organizations also rank in the middle on average but, in contrast to group C, show considerable variation across processes. No specific demographic consistently predicted the group into which an organization would fall; for example, each group included both large and small organizations and a mix of industries.
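The ranking procedure described above can be sketched in a few lines of Python. The function names and the tie-handling convention (tied organizations receive the average of their ranks) are assumptions for illustration, not details taken from the report:

```python
from statistics import mean, pstdev

def rank_column(values):
    """Rank one process's maturity levels across organizations.
    Rank 1 = highest maturity; ties receive the average of their ranks."""
    order = sorted(range(len(values)), key=lambda i: -values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # Extend j over the block of organizations tied at this level.
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average 1-based rank for the tied block
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def profile_organizations(maturity):
    """maturity: one list of process maturity levels per organization.
    Returns (average rank, variation in ranks) for each organization."""
    n_procs = len(maturity[0])
    cols = [rank_column([org[p] for org in maturity]) for p in range(n_procs)]
    per_org = [[cols[p][o] for p in range(n_procs)]
               for o in range(len(maturity))]
    return [(mean(r), pstdev(r)) for r in per_org]
```

Plotting each organization's average rank against its rank variation, as in figure 4, is what separates the consistently high (D), consistently low (A), stable middle (C) and volatile middle (B) groups.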
Additionally, the research report includes an analysis of the maturity results for different characteristics of the organizations, including country, industry, size of IT operations, IT spending as a percentage of revenue, alignment of business and IT goals, level of outsourcing, and IT governance structure.
Behind the Averages
Averages never tell the whole story; drilling down to the numbers behind them is required. The wide distribution of responses was intriguing, considering that all 51 organizations were mature in the sense that they had a long history. Figure 5 shows the distribution of maturity levels for the six attributes. For the tools and goals attributes, 3.0 percent and 2.8 percent of responses, respectively, indicated level zero. Combining levels zero and one, 28.5 percent and 28.0 percent selected those two levels for the tools and goals attributes, respectively, followed by 14.5 percent for policies and 14.3 percent for skills. Yet, many organizations were operating at levels four and five. At the high end, 36.5 percent selected level four or five for the awareness attribute and 30.1 percent selected four or five for the responsibilities attribute. Even for the attributes with high frequencies of level zero and one responses, the frequencies of level four and five responses were respectable: 21.9 percent for the tools attribute and 17.7 percent for the goals attribute.
While figure 5 summarizes the levels for each attribute, looking at the levels from a process perspective gives similar mixed results. For 33 processes, at least one person responded with a level zero for at least one attribute. The highest frequency of level zero was five (out of 51 organizations) for the tools attribute for PO4, and the highest frequency of level four was for the tools attribute for PO3, PO5 and ME3 and for the goal attribute for PO2 and ME2.
What does this all mean? For one thing, it indicates that maturity levels of four and five are achievable. On the other hand, given the low levels for some processes and for specific attributes within some processes, the first reaction might be to say that organizations should focus more resources on those processes and attributes to increase their maturity levels. However, one could argue that the levels of these processes evolved over time to a sufficient (or adequate or satisfactory) level. This is a satisficing strategy, in which the goal is an adequate level rather than an optimal one. While the authors do not promote this strategy, satisficing does appear to be the dominant strategy for many organizations and should not be rejected out of hand. In the extreme, this strategy is pejoratively called firefighting. Only with a self-assessment balanced by a careful risk assessment can organizations determine what their target levels should be, whether adequate (satisficing) or optimal.
Not every organization in the study has achieved the appropriate level for every process, whatever its current levels are. Intuitively, organizations could not justify the cost of pushing every process to level five; yet, because levels four and five are achievable by a wide cross-section of organizations, the question remains: At what levels should we be?
The complete report can be downloaded from www.isaca.org/deliverables. It provides more detailed maturity model information and explains how to conduct a self-assessment to compare an organization to the 51 organizations included in this study.
1 Box plots depict numerical data through five-number summaries: lowest observation, lower quartile (Q1), median, upper quartile (Q3), and highest observation. The interquartile range (IQR) is calculated by subtracting Q1 from Q3. The end points of the “whiskers” are 1.5 IQR lower than Q1 and 1.5 IQR higher than Q3. Outliers are shown as dots on the plots. A relatively long box indicates low consensus and a relatively short box indicates high consensus.
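Under the convention described in this footnote, the whisker end points can be computed directly from the five-number summary. The following Python sketch (the function name is an assumption) uses the median-of-halves method for the quartiles, which may differ slightly from the plotting software used in the report:

```python
from statistics import median

def five_number_summary(data):
    """Five-number summary plus 1.5 * IQR whisker end points,
    following the box-plot convention described in the footnote."""
    s = sorted(data)
    n = len(s)
    q1 = median(s[: n // 2])         # lower quartile (lower half)
    q3 = median(s[(n + 1) // 2 :])   # upper quartile (upper half)
    iqr = q3 - q1                    # interquartile range
    return {
        "min": s[0], "q1": q1, "median": median(s), "q3": q3, "max": s[-1],
        "whisker_low": q1 - 1.5 * iqr,   # 1.5 IQR below Q1
        "whisker_high": q3 + 1.5 * iqr,  # 1.5 IQR above Q3
    }
```

Observations falling outside the whisker end points would be drawn as outlier dots, as in figure 3.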
This article provides a brief synopsis of a 110-page research report funded by ITGI titled IT Governance and Process Maturity.
Roger Debreceny, Ph.D.
is the Shidler College distinguished professor of accounting in the Shidler College of Business, University of Hawaii at Manoa (USA). He teaches accounting and auditing. Prior to becoming an academic, he held senior finance positions in Asia. He has published widely in academic and professional journals, and is a past chair of the COBIT Steering Committee of the IT Governance Institute. He can be reached at firstname.lastname@example.org.
Glen L. Gray, Ph.D., CPA
is a professor in the accounting and IS department of the College of Business and Economics at California State University in Northridge, California, USA. He has conducted research projects funded by ISACA, The Institute of Internal Auditors, the American Institute of Certified Public Accountants and Big Four accounting firms, and has made presentations at academic and professional conferences worldwide. He can be reached at email@example.com.
ISACA Journal, formerly Information Systems Control Journal, is published by ISACA, a nonprofit organization created for the public in 1969. Membership in the association, a voluntary organization serving IT governance professionals, entitles one to receive an annual subscription to the ISACA Journal.
Opinions expressed in the ISACA Journal represent the views of the authors and advertisers. They may differ from policies and official statements of ISACA and/or the IT Governance Institute and their committees, and from opinions endorsed by authors, employers or the editors of this Journal. ISACA Journal does not attest to the originality of authors’ content.