
IT Governance and Process Maturity

IT Governance Institute®

The IT Governance Institute (ITGI™) (www.itgi.org) is a non-profit, independent research entity that provides guidance for the global business community on issues related to the governance of IT assets. ITGI was established by the non-profit membership association ISACA in 1998 to help ensure that IT delivers value and its risks are mitigated through alignment with enterprise objectives, IT resources are properly allocated, and IT performance is measured. ITGI developed Control Objectives for Information and related Technology (CobiT®) and Val IT™, and offers original research and case studies to help enterprise leaders and boards of directors fulfil their IT governance responsibilities and help IT professionals deliver value-adding services.

Disclaimer
ITGI has designed and created this publication, titled IT Governance and Process Maturity (the ‘Work’), primarily as an educational resource for control professionals. ITGI makes no claim that use of any of the Work will assure a successful outcome. The Work should not be considered inclusive of all proper information, procedures and tests or exclusive of other information, procedures and tests that are reasonably directed to obtaining the same results. In determining the propriety of any specific information, procedure or test, control professionals should apply their own professional judgement to the specific control circumstances presented by the particular systems or information technology (IT) environment.

Reservation of Rights
© 2008 ITGI. All rights reserved. No part of this publication may be used, copied, reproduced, modified, distributed, displayed, stored in a retrieval system or transmitted in any form by any means (electronic, mechanical, photocopying, recording or otherwise) without the prior written authorisation of ITGI. Reproduction and use of all or portions of this publication are permitted solely for academic, internal and non-commercial use and for consulting/advisory engagements, and must include full attribution of the material’s source. No other right or permission is granted with respect to this work.

IT Governance Institute
3701 Algonquin Road, Suite 1010
Rolling Meadows, IL 60008 USA
Phone: +1.847.660.5700
Fax: +1.847.253.1443
E-mail: [email protected]
Web site: www.itgi.org

ISBN 978-1-60420-070-6
IT Governance and Process Maturity
Printed in the United States of America


Acknowledgements

ITGI wishes to recognise:

Researchers
Roger S. Debreceny, Ph.D., FCPA, Shidler College of Business, University of Hawaii at Manoa, USA
Glen L. Gray, Ph.D., CPA, California State University at Northridge, USA

ITGI Board of Trustees
Lynn Lawton, CISA, FBCS, FCA, FIIA, KPMG LLP, UK, International President
George Ataya, CISA, CISM, CGEIT, CISSP, ICT Control SA, Belgium, Vice President
Yonosuke Harada, CISA, CISM, CAIS, InfoCom Research Inc., Japan, Vice President
Howard Nicholson, CISA, CGEIT, City of Salisbury, Australia, Vice President
Jose Angel Pena Ibarra, CGEIT, Consultoria en Comunicaciones e Info., SA & CV, Mexico, Vice President
Robert E. Stroud, CA Inc., USA, Vice President
Kenneth L. Vander Wal, CISA, CPA, Ernst & Young LLP (retired), USA, Vice President
Frank Yam, CISA, FHKCS, FHKIoD, CIA, CCP, CFE, CFSA, FFA, Focus Strategic Group, Hong Kong, Vice President
Marios Damianides, CISA, CISM, CA, CPA, Ernst & Young LLP, USA, Past International President
Everett C. Johnson Jr., CPA, Deloitte & Touche LLP (retired), USA, Past International President

IT Governance Committee
Tony Hayes, FCPA, Queensland Government, Australia, Chair
Sushil Chatterji, Edutech Enterprises, Singapore
Kyung-Tae Hwang, CISA, Dongguk University, Korea
John W. Lainhart IV, CISA, CISM, CGEIT, IBM Business Consulting Services, USA
Hugh Penri-Williams, CISA, CISM, CCSA, CIA, Glaniad 1865 EURL, France
Gustavo Adolfo Solis Montes, CISA, CISM, Grupo Cynthus, Mexico
Robert E. Stroud, CA Inc., USA
John Thorp, CMC, I.S.P., The Thorp Network Inc., Canada
Wim Van Grembergen, Ph.D., University of Antwerp Management School, and IT Alignment and Governance (ITAG) Research Institute, Belgium

CobiT Steering Committee
Robert E. Stroud, CA Inc., USA, Chair
Gary S. Baker, CA, Deloitte & Touche, Canada
Rafael Eduardo Fabius, CISA, Republica AFAP SA, Uruguay
Erik Guldentops, CISA, CISM, University of Antwerp Management School, Belgium
Jimmy Heschl, CISA, CISM, CGEIT, KPMG, Austria
Debbie A. Lew, CISA, Ernst & Young LLP, USA
Greta Volders, Voquals, Belgium

ITGI Affiliates and Sponsors
ISACA chapters
American Institute of Certified Public Accountants
ASIS International
The Center for Internet Security
Commonwealth Association for Corporate Governance Inc.
FIDA Inform
Information Security Forum
Information Systems Security Association
Institut de la Gouvernance des Systemes d’Information
Institute of Management Accountants Inc.
ISACA
ITGI Japan
Norwich University
Socitm Performance Management Group
Solvay Business School
University of Antwerp Management School
Aldion Consulting Pte. Ltd.
Analytix Holdings Pty. Ltd.
BWise B.V.
CA Inc.
Consult2Comply
Hewlett-Packard
IBM
ITpreneurs Nederlands B.V.
LogLogic Inc.
Phoenix Business and Systems Process Inc.
Project Rx Inc.
Symantec Corp.
TruArx Inc.
Wolcott Group LLC
World Pass IT Solutions

The researchers wish to recognise:
We extend our sincere thanks to the 51 organisations and the more than 300 IT professionals who made this project successful. We gratefully acknowledge the generous financial support for the project from ITGI and the Shidler College of Business at the University of Hawaii at Manoa.

Our thanks also go to Scott Coolidge and his colleagues at Ernst & Young, including Warren Butuin, for providing some leads. Particular thanks go to Jimmy Heschl, KPMG and the CobiT Steering Committee, for assistance and guidance on methodology.

The following additional individuals provided much support to the project and we are especially grateful for their assistance:
Solomon Anastacio, CISA, Manila Electric Company, Philippines
Sushil Chatterji, Edutech, Singapore
Steven De Haes, University of Antwerp, Belgium
Calix Enggay, CISA, CISM, CISSP, GSEC, Aboitiz Transport Systems Co., Philippines
Erik Guldentops, CISA, CISM, University of Antwerp Management School, Belgium
Abdul Hamid, CPA, London
Gary Hardy, CGEIT, IT Winners, South Africa
Tony Hayes, FCPA, Queensland Government, Australia
Everett C. Johnson, CPA, Deloitte & Touche LLP (retired), USA, Past International President
John W. Lainhart IV, CISA, CISM, CGEIT, IBM Business Consulting Services, USA
Debra Mallette, CISA, CSSBB, Kaiser Permanente, USA
Isa Ojeda, CISA, Philippines
Mymy Rapes, ATSC Inc., Philippines
Gina Santos, CISA, CPA, Aboitiz Transport System Corp., Philippines
Maxwell J. Shanahan, CISA, FCPA, Max Shanahan & Associates, Australia
Dirk E. Steuperaert, CISA, IT In Balance BVBA, Belgium
John Thorp, CMC, I.S.P., The Thorp Network, Canada
Wim Van Grembergen, Ph.D., University of Antwerp Management School, and IT Alignment and Governance (ITAG) Research Institute, Belgium
Alexander Zapata, CISA, Grupo Cynthus S.A. de C.V., Mexico


Table of Contents

Executive Summary
    Specific Processes and Attributes
    Research Method
    Study Sample
    Research Findings
    A Closer Look at Process Maturity Levels
    A Closer Look at Processes and Attributes
    Associating IT Governance With Process Maturity
    The Self-assessment Process
    Concluding Comments

1. Introduction
    Process Maturity Model in CobiT
    Organisation of the Report

2. Study Method and Data Collection
    Method
    Collecting Process Maturity Levels
    Collecting IT Governance and Demographic Data
    Study Sample
    IT Governance
    Level of Outsourcing

3. Research Findings
    Overall Results
    The Big Picture
    Process-level Maturity Within Each Domain
    Disaggregated Attribute Data by Process Maturity Domain

4. Process Maturity by Domain and Process
    Plan and Organise Domain
    Acquire and Implement Domain
    Deliver and Support Domain
    Monitor and Evaluate Domain
    Overall Enterprise Performance
    Chapter Summary and Concluding Comments

5. Associating IT Governance With Process Maturity
    Geographic Location
    Industry Classification
    Size of IT Organisations
    IT Spending as a Percentage of Revenue
    Alignment of Business and IT Goals
    Level of Outsourcing
    IT Governance Structure
    Concluding Comments on Statistical Analysis

6. Next Steps in the Self-assessment Process
    Collecting Data
    Comparing Results
    Improving Process Maturity Levels

7. Conclusion
    From 0 to 5 and Everything in Between

Appendix 1—Maturity Attribute Table

Appendix 2—Details of Maturity Levels

List of Figures

References

Other Publications

Executive Summary

Developing and maintaining the ability to perform key IT processes are important aspects of achieving IT governance. One comprehensive approach to IT governance is the CobiT framework, which groups IT activities into 34 processes within four logical domains. These processes encompass the complete life cycle of IT investment, from strategic planning to the day-to-day operations of the IT function. A key concept in CobiT is the determination and systematic enhancement of process maturity, which is measured on a six-level scale (0 to 5):
• 0 Non-existent
• 1 Initial/ad hoc
• 2 Repeatable but intuitive
• 3 Defined
• 4 Managed and measurable
• 5 Optimised

CobiT recognises that there are a number of dimensions or attributes of process maturity. These include management’s ‘awareness and communication’ of the process and the ‘policies, standards and procedures’ and ‘skills and expertise’ supporting the process.

Chief information officers (CIOs) and other executives know that it does not make economic sense to be at level 5 for every IT process, because the benefits would not justify the costs of achieving and maintaining that level. In addition, target maturity levels would be expected to vary with individual IT processes, IT infrastructure and industry characteristics. Differences in maturity come from factors such as the risks facing the enterprise and the contribution of processes to value generation and service delivery. IT managers must ask the following question for each key process: Where should we be? Or, at least: How do we compare to our peers? Unfortunately, there are few data to help answer these questions, which is the motivation for this study.

The objectives of this study are to:
• Collect process maturity data from a wide variety of enterprises to develop preliminary benchmarks for each maturity attribute/IT process combination
• Collect IT demographics to perform an initial analysis of process maturity measures vs. IT demographics as a starting point for benchmarking profiles for different demographic combinations
• Provide guidance for enterprises to conduct their own self-assessments to compare their process capability maturity measures with benchmarks

Specific Processes and Attributes

As mentioned earlier, the CobiT framework provides complete IT governance coverage with its four domains and 34 IT processes. Based on prior evaluations of process performance, five of the CobiT processes were divided into subprocesses because of the complexity and importance of the process (e.g., DS5 Ensure systems security) or because of markedly different concepts embedded within the process (e.g., the data classification and enterprise architecture concepts within PO2 Define the information architecture). As a result, a total of 41 processes were used for the project, as shown in Figure 1.


Figure 1—41 CobiT IT Processes

Plan and Organise (PO) Domain
PO1 Define a strategic IT plan.
PO2A Define the information architecture—Architecture.
PO2D Define the information architecture—Data classification.
PO3 Determine technological direction.
PO4O Define the IT processes, organisation and relationships—Organisation.
PO4P Define the IT processes, organisation and relationships—Processes.
PO5B Manage the IT investment—Budgeting.
PO5V Manage the IT investment—Value management.
PO6 Communicate management aims and direction.
PO7 Manage IT human resources.
PO8 Manage quality.
PO9 Assess and manage IT risks.
PO10PG Manage projects—Programme.
PO10PJ Manage projects—Projects.

Acquire and Implement (AI) Domain
AI1 Identify automated solutions.
AI2 Acquire and maintain application software.
AI3 Acquire and maintain technology infrastructure.
AI4 Enable operation and use.
AI5 Procure IT resources.
AI6 Manage changes.
AI7 Install and accredit solutions and changes.

Deliver and Support (DS) Domain
DS1 Define and manage service levels.
DS2 Manage third-party services.
DS3 Manage performance and capacity.
DS4 Ensure continuous service.
DS5NF Ensure systems security—Network and firewall.
DS5P Ensure systems security—Policy.
DS5U Ensure systems security—User access.
DS5V Ensure systems security—Virus.
DS6 Identify and allocate costs.
DS7 Educate and train users.
DS8 Manage service desk and incidents.
DS9 Manage the configuration.
DS10 Manage problems.
DS11 Manage data.
DS12 Manage the physical environment.
DS13 Manage operations.

Monitor and Evaluate (ME) Domain
ME1 Monitor and evaluate IT performance.
ME2 Monitor and evaluate internal control.
ME3 Ensure compliance with external requirements.
ME4 Provide IT governance.


To define a process’s maturity more specifically, instead of having just one maturity level for each process, CobiT recognises that there are a number of attributes of maturity. The six maturity attributes in CobiT are shown in figure 2.

Research Method

The principal research method for this project was a field study in which enterprises in several countries were visited. A wide variety of resources were used to identify CIOs who were willing to participate in the research project. For each enterprise, a total of 492 data points were collected. These represent both ‘as is’ (the current maturity level) and ‘to be’ (expected level in 12 months) maturity levels for each of the six attributes within the 41 processes (2 × 6 × 41 = 492).
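To make the shape of this data set concrete, here is a minimal Python sketch (illustrative only; it is not the instrument used in the study, and the process list is truncated for brevity) of one way to hold the ‘as is’ and ‘to be’ ratings for every attribute/process combination:

    # Illustrative data structure for one enterprise's assessment
    # (hypothetical; the study's collection instrument is not reproduced here).
    ATTRIBUTES = ["Awareness", "Policies", "Technology",
                  "Skills", "Responsibility", "Goals"]
    # The study used the 41 process codes of figure 1; a few are shown here.
    PROCESSES = ["PO1", "PO2A", "PO2D", "PO3", "DS5NF", "ME4"]

    def empty_assessment(processes):
        """Nested dict: process -> attribute -> {'as_is': level, 'to_be': level}."""
        return {p: {a: {"as_is": None, "to_be": None} for a in ATTRIBUTES}
                for p in processes}

    assessment = empty_assessment(PROCESSES)
    assessment["PO1"]["Awareness"] = {"as_is": 3, "to_be": 4}  # example rating

    # With all 41 processes, each enterprise yields 2 ratings x 6 attributes
    # x 41 processes of data.
    print(2 * 6 * 41)  # 492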

The enterprise identified the process owners for each of the 41 processes. At the start of each interview, the researcher provided a description of the project and the interview protocol. The interviewees (the CIO or other senior IT manager) were given a copy of the generic process maturity attribute table. The columns listed the six attributes and the rows provided the scale from 0 to 5. Each cell included a brief description of the characteristics for each level of a specific attribute.

Taking one process at a time, the interviewer introduced the process control objectives and summarised the detailed control objective (from the CobiT documentation) for the interviewee. The interviewees read the attribute descriptions from the process maturity attribute table to themselves and then stated the ‘as is’ and ‘to be’ process maturity levels for each of the six attributes for that process. Regarding the ‘to be’ data, the point was stressed to the interviewees that they should not give a higher level just because they hoped capability would improve in the future. Instead, the point was to recognise that the maturity level might be expected to change in the future only if the IT function had a specific initiative in progress and funding had been obtained.

The interviewers also sought additional information and challenged managers on a subset of the processes. Examples of validation questions included ‘How is management’s awareness of this process communicated to the IT organisation for this process?’ and ‘What are some of the tools and technologies supporting this process?’ An established pattern was then extended to the other processes for the particular interviewee. This registration and validation process had to be handled judiciously because data collection could take an unacceptable amount of time if every data point were challenged.



Figure 2—Attributes of Process Maturity

Attributes                            Abbreviated References Used in This Publication
Awareness and communication           Awareness
Policies, standards and procedures    Policies
Tools and automation                  Technology
Skills and expertise                  Skills
Responsibility and accountability     Responsibility
Goal setting and measurement          Goals


A separate comprehensive questionnaire was used to interview CIOs to collect IT governance and demographic information for the enterprise. A wide variety of issues were investigated that have been identified previously as relevant to the study of IT governance.

Study Sample

A total of 51 enterprises from North America, Asia and Europe participated in this study. Because enterprises that were actively involved in the 41 IT processes were sought, the 51 enterprises were rather large. They averaged 172 IT staff members and the largest had 690 IT employees. In terms of numbers of clients (or workstations), the average was 3,132 and the highest was 15,000. Most enterprises in the study had mixed IT environments, with 98 percent using Wintel servers and 94 percent using UNIX. Mainframes were used to varying degrees by a large minority (34 percent) of the sample enterprises. Nearly three-quarters (74 percent) of the enterprises had centralised IT governance, 6 percent indicated a decentralised structure, and 20 percent indicated a federal structure. The majority of the enterprises in the study outsourced some aspects of their IT activities. Although the range of outsourcing varied widely, 76 percent of respondents outsourced some aspects of their software functions and 65 percent outsourced some aspects of their hardware functions.

In terms of IT governance and IT management frameworks, both CobiT and the IT Infrastructure Library (ITIL) were used in the enterprises studied. However, only 16 percent of the enterprises were intensive users of CobiT and only 10 percent were intensive users of ITIL. Only three enterprises (6 percent) said that they thoroughly followed both CobiT and ITIL.

Research Findings

Figure 3 illustrates highly aggregated data for the four process domains in the form of box plots.1 Four points are evident in the figure. First, maturity levels were expected to increase by approximately 0.5 in the next 12 months. Second, the Monitor and Evaluate (ME) domain had the lowest maturity levels and the lowest consensus of all the domains. The ‘as is’ responses for the Deliver and Support (DS) domain had the widest range of responses, extending from 0 to almost 5, which illustrates the third and fourth points: even large, mature enterprises can be at level 0, and level 5 is achievable. These four points will be expanded on with more disaggregated data in the following discussions.

A Closer Look at Process Maturity Levels

Figure 4 presents the 34 processes included in the four domains. For the Plan and Organise (PO) domain, the median maturity levels range from approximately 1.7 to 2.7, with PO7 Manage IT human resources having the highest median and PO2 Define the information architecture and PO8 Manage quality having the lowest medians. An interesting aspect of the figure is that the low-end points of the ‘whiskers’ are lower than level 1 for nine of the 10 processes. This illustrates two points: the maturity of processes can be very low, even in large, mature enterprises; and the subjects were candid in their answers, as evidenced by their willingness to give the researchers such low numeric ratings.

While there are many processes that have some interaction with corporate processes and controls outside the IT function, there are three processes in CobiT that are particularly tied to generic enterprise processes. For example, the process PO5 Manage the IT investment links with finance and accounting processes within the enterprise, at least with respect to the budgeting and expenditure control components of the process. PO7 Manage IT human resources links with corporate human resources processes. Finally, AI5 Procure IT resources links with broader procurement processes. The tight couplings between these IT processes and processes outside of IT could explain their relatively high maturity.

Conversely, the processes in the PO domain showing the lowest maturity are particularly challenging for IT organisations. PO2 Define the information architecture and PO8 Manage quality each require systemic change and significant human resources and monetary investment to achieve higher levels of maturity.


1 Box plots depict numerical data through the box plot’s five-number summaries: The lowest observation, lower quartile (Q1), median, upper quartile (Q3) and highest observation. The interquartile range (IQR) is calculated by subtracting Q1 from Q3. The end points of the ‘whiskers’ are 1.5 IQR lower than Q1 and 1.5 IQR higher than Q3. Outliers are shown as dots on the plots. A relatively long box indicates low consensus and a relatively short box indicates high consensus.
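For readers who want to reproduce such summaries from their own ratings, here is a small Python sketch that computes the five-number summary and the whisker end points exactly as defined in the footnote (note that plotting libraries usually clip whiskers to the most extreme observation within these bounds):

    import statistics

    def box_plot_summary(values):
        """Five-number summary plus whisker end points per the footnote:
        whiskers sit 1.5 * IQR below Q1 and 1.5 * IQR above Q3."""
        q1, med, q3 = statistics.quantiles(values, n=4)  # quartile cut points
        iqr = q3 - q1
        return {"lowest": min(values), "q1": q1, "median": med, "q3": q3,
                "highest": max(values),
                "whisker_low": q1 - 1.5 * iqr, "whisker_high": q3 + 1.5 * iqr}

    # Hypothetical 'as is' ratings for one process across eight enterprises:
    print(box_plot_summary([1, 2, 2, 3, 3, 3, 4, 5]))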


[Figure 3—Overall Process Maturity by Domain: paired ‘as is’ and ‘to be’ box plots of process maturity (scale 0 to 5) for each of the four domains (PO, AI, DS and ME).]

[Figure 4—Box Plots for the 34 IT Processes: ‘as is’ process maturity box plots (scale 0 to 5) for each individual process in the PO, AI, DS and ME domains.]


Finally, in this discussion of the PO domain, the results for PO10 Manage projects are notably high—with one important caveat. PO10 is the average of PO10PG Manage projects—Programme management and PO10PJ Manage projects—Project management. PO10PG has a very different and much lower level of maturity than PO10PJ. This shows that there is still much to be done in the management of IT investment programmes.

For the Acquire and Implement (AI) domain, the median maturity levels range from approximately 2.4 to 3.0, with AI5 Procure IT resources having the highest median level of process capability maturity and AI4 Enable operation and use the lowest. The maturity level for AI5 is not surprising, given the maturity of enterprise acquisition and procurement processes with which the IT function interacts.

Conversely, the result for AI4 is particularly interesting. The whisker extends all the way to the zero level, indicating that some interviewees believed that their IT organisation had not even achieved the most basic level of maturity of 1. As indicated previously, this was more than a little surprising considering the size (and general maturity) of the IT organisations included in this study.

For the Deliver and Support (DS) domain, the median maturity levels range from approximately 2.0 to 3.2, with DS12 Manage the physical environment having the highest median and DS1 Define and manage service levels having the lowest. Both DS2 Manage third-party services and DS4 Ensure continuous service have whiskers extending all the way to zero.

On the other hand, DS6 Identify and allocate costs, DS7 Educate and train users, and DS12 Manage the physical environment have whiskers that extend to almost 5, indicating that they have attained the highest level of maturity at some enterprises. This raises an interesting question: What does it mean that nobody reported a level 5 for some processes? Does that mean that a level 5 cannot be achieved for those processes? Or does it really mean that it is achievable, but nobody has pushed to that level because the perceived benefits of achieving the equivalent of a 5 do not justify the costs? These important questions should be addressed in future research.

It seems clear that many IT organisations have worked hard over the years to manage numerous day-to-day operational activities. The confidence with which managers addressed the management and controls of processes such as DS11 Manage data, DS12 Manage the physical environment and DS13 Manage operations is impressive. These seem to be processes that have received a great deal of attention over many years. Similarly, there was considerable evidence that much investment has gone into each of the aspects of security. As can be seen in figure 4, while divergence was relatively low, there were outliers at either end of the distribution.

For the Monitor and Evaluate (ME) domain, the median maturity levels were relatively low and close. The perceived maturity levels range from approximately 2.0 to 2.2. ME1 Monitor and evaluate IT performance had the highest median and the other processes were clustered near 2.0. As can be seen in figure 4, the processes in the ME domain had the widest distribution (lowest consensus). For ME2 Monitor and evaluate internal control and ME3 Ensure compliance with external requirements, the whiskers extend from 0 to nearly 5. As will be discussed in more detail in chapter 3, many enterprises had considerable difficulty in setting up systematic and formal monitoring processes.

A Closer Look at Processes and Attributes

As mentioned previously, CobiT divides process maturity into six different attributes.

Figure 5 illustrates the findings in the form of a heat map based on the responses from the 51 enterprises. The red cells represent the top third of maturity responses, the pink cells represent the middle third, and the white cells represent the bottom third.
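A minimal Python sketch of that binning logic follows, assuming the thirds are taken over the ranked cell means (the report does not publish its exact procedure):

    def heat_map_colours(cell_means):
        """Map each (process, attribute) key to 'red', 'pink' or 'white'
        according to whether its mean maturity falls in the top, middle or
        bottom third of all cells (the colouring rule described above)."""
        ranked = sorted(cell_means, key=cell_means.get, reverse=True)
        third = len(ranked) / 3
        return {key: ("red" if rank < third else
                      "pink" if rank < 2 * third else "white")
                for rank, key in enumerate(ranked)}

    # Hypothetical cell means:
    cells = {("PO5B", "Goals"): 3.4, ("DS11", "Skills"): 3.1,
             ("PO2D", "Goals"): 1.6}
    print(heat_map_colours(cells))  # PO5B red, DS11 pink, PO2D white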

Processes
Comparing the rows (processes) first, PO5B Manage the IT investment—Budgeting is red (in the highest third) across all six attributes, indicating the highest level of maturity. Since most large enterprises, like those in this study, have a long history of a formal budgeting process for the entire organisation, it is not a surprise that the budget process had this high level of maturity. The enterprise-wide process is probably highly structured, with routine processes and budget templates to be completed each year.

PO5B was the only process that was red for all attributes, but several processes were at the highest level for five attributes and medium for the goals attribute, including:
• DS5NF Ensure systems security—Network and firewall
• DS5V Ensure systems security—Virus
• DS8 Manage service desk and incidents
• DS11 Manage data
• DS12 Manage the physical environment


[Figure 5—Summary Heat Map: a grid with one row for each of the 41 CobiT processes (as listed in figure 1) and one column for each of the six maturity attributes (Awareness, Policies, Technology, Skills, Responsibility, Goals); each cell is coloured red, pink or white according to whether the responses fall in the top, middle or bottom third of maturity.]


It is interesting to note that all of these processes are in the Deliver and Support (DS) domain. In many ways, these are daily operational issues. An enterprise’s networks are constantly under attack from the outside, so firewall (DS5NF) and virus (DS5V) processes must be mature or the enterprise will quickly find its networks infected and shut down. Service desks (DS8) are inundated with questions from users—sometimes 24/7. If these desks are not operating at a high level of efficiency, the productivity of the entire enterprise will suffer. In most large enterprises, databases (DS11) are highly integrated and shared by many applications, so any problems with data integrity and access will quickly pervade and cripple the entire organisation. Finally, Manage the physical environment (DS12) has a long history, going back to major environmental control issues associated with mainframe computers; these types of processes and associated problems will be highly visible.

At the other end of the spectrum, PO2D Define the information architecture—Data classification and PO5V Manage the IT investment—Value management are white (in the lowest third) for all six attributes. A reason that PO2D is at the low end may be that the process of classifying data requires integration of different technology and application platforms, with roles and responsibilities covering both IT and the business. PO5V is the process of determining ‘How does IT add value to the enterprise?’ Management of value generation by IT is challenging and multidimensional. The formal results, reinforced by discussions with IT managers during data collection, indicate that there is little emphasis on answering these value questions in most enterprises.

Moving up only slightly, two processes were in the lowest third for five attributes: PO8 Manage quality and DS1 Define and manage service levels. The low level of these two processes was unexpected considering the emphasis given to quality control in IT literature. It indicates that these two processes have not moved much past the ad hoc level.

Attributes
A review of the columns in figure 5 indicates the general maturity levels for the six attributes, and the extremes are quite dramatic. The awareness attribute was in the top third for 28 (68 percent) of the 41 processes. It was in the lowest third for only two (5 percent) of the 41 processes. On the other hand, the goals attribute was essentially the mirror image of the awareness attribute; it was in the lowest third for 28 (68 percent) of the processes and in the top third for only one (2 percent) of the processes.

The other relatively high-level attribute was responsibility, which was in the top third for 51 percent of the processes. The other relatively low-level attribute was technology, which was in the bottom third for 56 percent of the processes. The technology attribute essentially addresses the level to which tools and automation are used to manage the enterprise’s technology. Similar to the cliché that the shoemaker’s children have no shoes, it is interesting that the use of technology is not more mature in the technology domain.

Since it is probably easier (less expensive) to increase one level at a lower level (e.g., going from level 1 to level 2) than at higher levels (e.g., going from level 4 to level 5), focusing on improving the low-maturity attributes first could be the easiest way to improve overall process maturity.

Associating IT Governance With Process Maturity

Up to this point in the report, the research results were presented for the 51 enterprises taken as one group. This section reports on a statistical analysis of the data to determine whether the results varied by the different characteristics of the enterprises, including:
• Geographic location
• Industry
• Size of the IT organisation
• IT spending as a percentage of revenue
• Alignment of business and IT goals
• Level of outsourcing
• IT governance structure

Figure 6 summarises the numbers of processes and attributes that were statistically different with regard to the demographic variables listed previously. As the following paragraphs will explain, some of the findings were surprising and intriguing, and provide some interesting questions for future research.


Geographic Location
The 51 enterprises that participated in this study were located in the following countries or regions:
• Austria/Germany/Switzerland
• Canada
• Mexico
• Philippines
• Singapore
• USA

The means (averages) were statistically different for 11 of the 41 processes, with Canadian and Singaporean enterprises having the highest means and Mexican and Philippine enterprises having the lowest. The widest difference was for DS6 Identify and allocate costs; Canadian enterprises averaged a maturity level of 3.9 and Mexican enterprises averaged 1.2.

The finding that only approximately 25 percent of the process means were statistically different between countries was a little surprising considering the cultural, business and regulatory environmental differences amongst these locations. There are several possible reasons for the general lack of differences, including:
• IT organisations in more developed countries have reached a plateau and, in the intervening years, IT operations in emerging countries have caught up and are reaching a similar plateau.
• Many hardware and software brands (e.g., IBM, SAP, Microsoft, Intel, Dell) know no borders and are universal, de facto standards, so newer IT organisations have less of a learning curve to implement technology. In addition, large IT organisations have access to the same professional organisations, IT literature and consultants to accelerate the implementation of best (or, at least, better) practices.
• Because the number of enterprises from each country was relatively small and not taken from random samples, the sample from each country may not be a true representation of process maturity levels for that country’s IT organisations in general.

Industry
The 51 enterprises were grouped into the following broad industry classifications:
• Capital-intensive industries (other than utilities)
• Utilities
• Service industries
• Financial institutions
• Government and non-profits

As figure 6 indicates, only two processes had statistically different means: ME3 Ensure compliance with external requirements and ME4 Provide IT governance. This result was surprising, because certain processes are more critical to some industries and would therefore be expected to be more mature. The different level and nature of government regulations in some industries would also be expected to impact IT processes. However, the statistical results imply that differences in process maturity amongst enterprises in this study are not correlated with differences in industry characteristics. Said another way, any systematic differences amongst enterprises are more strongly due to variables other than industry classifications.


Figure 6—Summary of Statistical Analysis

Variables                                  Processes (out of 41)   Attributes (out of 6)
Geographic location                                 11                       5
Industry                                             2                       3
Size of IT organisations                            23                       6
IT spending as a percentage of revenue               4                       0
Alignment of business and IT goals                  37                       6
Level of outsourcing                                 3                       0
IT governance structure                              4                       6


The means for ME3 and ME4 were highest for financial institutions and lowest for capital-intensive enterprises.

Regarding the attributes, awareness, policies and skills were significantly different amongst industries. Utilities had the highest numbers for those attributes and capital-intensive enterprises had the lowest numbers.

Size of IT Organisations
The 51 enterprises in the study were classified as low or high based on whether their size was below or above the median size measure. For size, the means for 23 of the 41 processes were statistically different. The larger enterprises had, as might be expected, higher maturity levels. None of the means for the smaller enterprises rose above 3, whereas for larger enterprises, 12 of the 23 means were 3 or above. In general, larger enterprises probably have longer histories than smaller enterprises, and larger enterprises probably also have more capital to invest in fine-tuning their IT processes. All of the attributes were statistically different, and the means for larger enterprises were 20 to 30 percent higher than the means for smaller enterprises.
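The report does not name the statistical test it applied; as an illustration of this kind of median-split comparison, the Python sketch below uses a two-sided Mann-Whitney U test, a common non-parametric choice for comparing two groups of ordinal ratings:

    from statistics import median
    from scipy.stats import mannwhitneyu  # assumed choice of test

    def median_split_test(sizes, ratings, alpha=0.05):
        """Split enterprises into low/high groups at the median size, then
        test whether one process's maturity ratings differ between groups."""
        cutoff = median(sizes)
        low = [r for s, r in zip(sizes, ratings) if s <= cutoff]
        high = [r for s, r in zip(sizes, ratings) if s > cutoff]
        stat, p_value = mannwhitneyu(low, high, alternative="two-sided")
        return p_value < alpha, p_value

    # Hypothetical IT staff counts and maturity ratings for one process:
    significant, p = median_split_test(
        sizes=[40, 65, 80, 120, 150, 200, 320, 400, 520, 690],
        ratings=[1, 2, 2, 2, 3, 3, 3, 4, 4, 4])
    print(significant, round(p, 3))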

IT Spending as a Percentage of Revenue
For IT spending, the values for only four processes were statistically different, with the means always being higher for those enterprises with the lower percentages. At first this may seem counter-intuitive, but the key is that IT spending is measured relative to revenue—not in absolute terms. That is, a low percentage does not necessarily indicate a smaller enterprise. In fact, the opposite might be true. Because IT expenditures typically have a high fixed-cost component, a smaller enterprise would have higher percentages allocated to cover the relatively higher fixed costs. None of the attributes was statistically different for different levels of IT spending as a percentage of revenue.

Alignment of Business and IT Goals
One of the major themes in CobiT is the importance of aligning IT goals with business goals. In this study, through a series of questions, CIOs were asked to develop an alignment metric, which was used to classify enterprises as being in ‘low’ and ‘high’ alignment. For alignment, the values for 37 processes were statistically different and were always higher for the higher-alignment enterprises. For five of the processes, the differences between the means were one full level or more.

Level of Outsourcing
The enterprises were evenly divided based on their relative levels of outsourcing. The values for only three processes were statistically different. The means for those three processes were mixed; PO5B and DS6 (both of which relate to IT budgeting and costs) were more mature for low-level outsourcers and AI3 (relating to technology infrastructure) was higher for high-level outsourcers. None of the attributes was statistically different for low-level and high-level outsourcers.

IT Governance Structure
IT governance structures were classified as centralised, decentralised or federal. For this variable, only four processes were statistically different. For three of the processes, the means were higher for decentralised enterprises. Enterprises with federal structures had the lowest means for the four processes. All of the attributes were statistically different for the different structures, and the means were always highest for enterprises with decentralised structures.

Concluding Comments on Statistical Analysis
As mentioned previously, there were expectations before performing the statistical analysis as to where differences were likely to be found. For example, because cultures, management practices and regulatory environments vary so much amongst countries, it was expected that process maturities would also vary significantly amongst countries. Instead, it was found that only about 25 percent of the processes were statistically different. One of the biggest surprises was the industry classification results, where only two processes were statistically different. Another surprise was the discriminating power of business/IT alignment—37 processes and all attributes were statistically different.

The reader is cautioned that all of these findings should be considered preliminary. Taken as a whole, the 51 enterprises provided a diversified sample of enterprises; however, the enterprises were all volunteers—not a random sample from the population of interest—so care must be taken when generalising the results from this study. Subdividing the 51 enterprises into smaller groups to conduct between-group statistical analysis also puts some limitations on the ability to generalise these results. Additional research will be needed to confirm or disconfirm these preliminary results.


The Self-assessment Process

One of the most common questions regarding IT is ‘How do we compare to our peers?’ This question permeates the entire enterprise, inside and outside of the IT function. Two objectives of this study were to collect benchmark information and determine whether the information varies by demographic variables so that enterprises have a foundation on which to perform their own process maturity self-assessment. Those objectives were accomplished. The next step is for enterprises to conduct their own self-assessments to determine how they compare to their peers.

The key to self-assessment is the quality of the collected data. To help ensure the accuracy of the collected data, the people providing the maturity levels must have intimate knowledge of their processes so they can ascertain their current process capability maturity levels. It is equally important that they be candid in their assessments. They must be assured that any low numbers will not reflect badly on them. It helps to ask challenge questions (e.g., ‘Can you give me an example of…?’) to help ensure that the assessments are accurate.

Basic self-assessment begins with collecting maturity levels for the six attributes of the 41 processes to create a table similar to figure 90 in appendix 2. For maturity levels that are below the averages in this report, the next step is to drill down into the data to help pinpoint specific processes and attributes that are potential problem areas. The list of process/attribute combinations should then be prioritised based on a risk assessment. Finally, target levels should be established for the process/attribute combinations that most need improvement and actions needed to achieve the targeted process capability maturity levels should be determined.
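As a concrete illustration of the comparison and drill-down steps, the following hypothetical Python helper (an assumption for illustration, not an ITGI tool) ranks the process/attribute combinations that fall furthest below benchmark; the resulting list would still need to be prioritised by a risk assessment, as noted above:

    def maturity_gaps(self_assessed, benchmark, top_n=5):
        """Both inputs map (process, attribute) -> maturity level.
        Return the combinations below benchmark, largest gap first."""
        gaps = {key: benchmark[key] - level
                for key, level in self_assessed.items()
                if key in benchmark and level < benchmark[key]}
        return sorted(gaps.items(), key=lambda kv: kv[1], reverse=True)[:top_n]

    # Hypothetical self-assessed levels vs. benchmark averages:
    own = {("PO8", "Goals"): 1, ("DS1", "Policies"): 2, ("DS12", "Skills"): 4}
    bench = {("PO8", "Goals"): 2.1, ("DS1", "Policies"): 2.3,
             ("DS12", "Skills"): 3.4}
    print(maturity_gaps(own, bench))  # PO8/Goals first, then DS1/Policies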

Concluding Comments

The primary objective of this project was to collect process maturity data in a controlled fashion to provide benchmarks for enterprises to subsequently perform their own self-assessment and gap analysis to determine the specific processes most in need of enhancement. With the help of ISACA members, two major accounting professional services firms and other contacts, information was compiled from 51 enterprises, representing a wide variety of nationalities and industries. This breadth of process maturity data collection was unprecedented.

At first blush, looking at the average maturity levels for the processes, the difference between the lowest (1.9 for PO5V Manage the IT investment—Value management) and the highest (3.1 for DS5V Ensure systems security—Virus) does not seem very significant—essentially the difference of one level. However, drilling down into the data shows some surprising and intriguing results.

Attributes
Drilling down to the individual attribute levels, the differences are quite dramatic. The awareness attribute was in the top third of levels for 68 percent of the 41 processes and in the lowest third for only 5 percent of the processes. On the other hand, the goals attribute was essentially the mirror image of the awareness attribute, being in the lowest third of levels for 68 percent of the processes and in the top third for only 2 percent of the processes. The other relatively high-level attribute was responsibility, which was in the top third for 51 percent of the processes. The other relatively low-level attribute was technology, which was in the bottom third for 56 percent of the processes.

Since it is probably easier (less expensive) to increase one level at a lower level (e.g., going from level 1 to level 2) than at higher levels (e.g., going from level 4 to level 5), focusing on improving the low-maturity attributes first could be the easiest way to improve overall IT process maturity.

Statistical Analysis
The statistical analysis presented previously shows some areas where drilling down may have the highest payoff. The degree of alignment between business and IT goals had the greatest discriminating power, with 37 processes being statistically different for different levels of alignment. The importance of business and IT alignment is one of the critical themes of CobiT, so it is gratifying to see that idea reinforced in the research. Industry classification, level of outsourcing and IT spending as a percentage of revenue had the least discriminating power; alignment of business and IT goals, size of IT organisation and geographic location had the greatest.


Behind the Averages
In the self-assessment discussion, one of the steps listed was to compare the self-assessment results to the averages in this report; however, averages never tell the whole story. Drilling down to the numbers behind those averages is required to get the whole story. The wide distribution of responses was intriguing, considering that all of the 51 enterprises were mature (because they have a long history). Figure 7 shows the distribution of maturity levels for the six attributes. We see that 3.0 percent and 2.8 percent of enterprises indicated a maturity level of 0 for the technology and goals attributes, respectively. Combining levels 0 and 1 shows that 28.5 percent and 28.0 percent of enterprises selected levels 0 and 1 for the technology and goals attributes, respectively. These were followed by 14.5 percent for the policies attribute and 14.3 percent for the skills attribute. On the other hand, many enterprises were operating at levels 4 and 5. At the high end, 36.5 percent of enterprises selected 4 or 5 for the awareness attribute and 30.1 percent selected 4 or 5 for the responsibilities attribute. Even for the attributes with high frequencies of levels 0 and 1 (technology and goals), the frequencies of 4 and 5 were respectable: 21.9 percent for the technology attribute and 17.7 percent for the goals attribute.

While figure 7 summarises the levels at the attribute level, looking at the levels from a process perspective gives similar mixed results. For 33 processes, at least one person stated a level 0 for at least one attribute. The highest frequency of level 0 was five (out of 51 enterprises), for the technology attribute for PO4P, followed by a frequency of four for the technology attribute for PO3, PO5V and ME3 and for the goals attribute for PO2D and ME2.

What does this all mean? For one thing, it indicates that maturity levels 4 and 5 are achievable. On the other hand, because of the low levels for some processes and specific attributes in some processes, the first reaction might be to say that enterprises should focus more resources on those processes and attributes to increase their maturity levels. However, one could argue that the levels of any of these processes evolved over time to a sufficient level. This is called a satisficing strategy, where the goal is to achieve an adequate level as opposed to an optimum level. This strategy is not being promoted; however, satisficing does appear to be the dominant strategy for many enterprises and should not be rejected out of hand. In the extreme, this strategy is pejoratively called the fire-fighting strategy. Only with a self-assessment balanced with a careful risk assessment can enterprises determine what their target levels should be—and whether they are adequate (satisficing) levels or optimal levels.

This study does not claim that every enterprise has achieved the appropriate level for every process, whatever its current levels. Instead, it revisits the point made at the beginning of the report that, at an intuitive level, enterprises cannot justify the costs of pushing everything to level 5. However, since levels 4 and 5 are achievable by a wide cross-section of enterprises, that still leaves the question, ‘At what levels should we be?’

This project achieved the research objective of developing robust benchmark information and providing a means for enterprises to answer the question, ‘How do we compare with our peers?’ Future research can build on this research to answer the more normative question, ‘At what levels should we be?’ By conducting focus groups, case studies and surveys, deeper and wider data can be collected to move from a satisficing strategy to an optimising strategy.


Figure 7—Distribution of Maturity Levels

Attribute           Maturity Level
                     0      1      2      3      4      5
Awareness          0.4%   8.3%  22.3%  32.6%  29.0%   7.5%
Policies           0.8%  13.7%  30.6%  27.5%  21.0%   6.3%
Technology         3.0%  25.5%  26.6%  23.0%  17.4%   4.5%
Skills             0.3%  14.0%  32.7%  32.2%  16.7%   4.1%
Responsibilities   0.4%  10.9%  26.0%  32.7%  23.2%   6.9%
Goals              2.8%  25.2%  29.9%  24.4%  14.9%   2.8%
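To make the arithmetic behind these observations concrete, the short Python sketch below (illustrative only; the dictionary simply transcribes the figure 7 percentages) computes the combined low-end (levels 0 and 1) and high-end (levels 4 and 5) frequencies quoted above:

# Figure 7 percentages: share of responses at each maturity level
# (0 through 5) for each attribute, transcribed from the table above.
DISTRIBUTION = {
    "Awareness":        [0.4,  8.3, 22.3, 32.6, 29.0, 7.5],
    "Policies":         [0.8, 13.7, 30.6, 27.5, 21.0, 6.3],
    "Technology":       [3.0, 25.5, 26.6, 23.0, 17.4, 4.5],
    "Skills":           [0.3, 14.0, 32.7, 32.2, 16.7, 4.1],
    "Responsibilities": [0.4, 10.9, 26.0, 32.7, 23.2, 6.9],
    "Goals":            [2.8, 25.2, 29.9, 24.4, 14.9, 2.8],
}

for attribute, levels in DISTRIBUTION.items():
    low = levels[0] + levels[1]   # combined levels 0 and 1
    high = levels[4] + levels[5]  # combined levels 4 and 5
    print(f"{attribute:16s} levels 0-1: {low:4.1f}%   levels 4-5: {high:4.1f}%")

Running this reproduces the figures cited in the text, e.g., 28.5 percent at the low end for technology and 36.5 percent at the high end for awareness.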


1. Introduction

Developing and maintaining the ability to perform key IT processes are important aspects of achieving IT governance. The IT function, working with the rest of the enterprise, must build a variety of capabilities to meet enterprise strategic objectives. These capabilities bring together internal and external human resources, software applications, hardware, and other resources in a systematic fashion to achieve a desired outcome. These outcomes may be strategic in nature, such as the determination of future direction for the IT function; tactical, such as providing customer service from a help desk; or operational, such as installing a systematic process for the backup and storage of data.

One comprehensive approach to IT governance is the CobiT framework, which groups IT activities into 34 processes within four logical domains.2 These processes encompass the complete life cycle of IT investment, from strategic planning to the day-to-day operations of the IT function. The relative importance of these processes differs amongst enterprises, depending on the objectives and the risk and reward environment for the enterprise. CobiT3 provides a methodology that links business goals to IT goals and then to IT processes. This allows enterprises to determine the more significant IT processes. ITGI’s complementary Val IT framework (Enterprise Value: Governance of IT Investments, The Val IT Framework 2.0) also includes managerial tools that assist enterprises in managing the generation of value from their investment in information technologies.

A key concept in CobiT is the determination and systematic enhancement of process maturity. CobiT recognises that fulfilling the objectives of the enterprise requires development of systematic capabilities to deliver results on each of the IT processes. These capabilities require a combination of human, software and hardware resources bound together in a policy and procedure structure. Each of these resources requires careful monitoring through a collection of metrics and review to ensure that any given process is continuing to meet ongoing demands. The level of enterprises’ capabilities to undertake the various IT processes within the CobiT framework differs enormously. Within a particular IT function, there can also be a considerable difference in its capabilities across the range of IT processes.

The concept of process maturity in CobiT draws directly from the Software Engineering Institute’s (SEI’s) Capability Maturity Model (CMM).4, 5 CMM has six levels (including 0) to measure the maturity of IT processes:
• 0 Non-existent
• 1 Initial
• 2 Repeatable
• 3 Defined
• 4 Managed
• 5 Optimised

CobiT builds and expands on CMM to accommodate the diversity of processes within the CobiT framework.

A process that is at maturity6 level 1 will be managed in a largely ad hoc fashion. There may be an unstructured allocation of resources, little or no defined policies and procedures, and no performance measurement. Conversely, a process that is at level 5 will be characterised by significant managerial attention and resources, robust and thoroughly communicated policies and procedures, use of industry best practices, and self-improvements as a result of a systematic performance measurement and analysis feedback loop.

CIOs and other executives know that it does not make economic sense to be at level 5 maturity for every IT process, because the benefits would not justify the costs of achieving and maintaining that level of maturity. In addition, target maturity levels would be expected to vary across individual IT processes, IT infrastructures and industry characteristics. For example, level 2 may be adequate for one IT process but highly inappropriate for a more critical one.


2 See also www.isaca.org/cobit, the Board Briefing on IT Governance, 2nd Edition (ITGI 2003), and the IT Governance Implementation Guide (ITGI 2007). An overview of governance over IT is beyond the scope of this report. See, for example, Bloem et al. (2005); Van Grembergen and De Haes (2008) and Weill and Ross (2004).

3 Data collection for the study was undertaken during the migration from CobiT 4.0 to CobiT 4.1. CobiT 4.0 was used during data collection so as not to bias the results; references have been updated here to CobiT 4.1.

4 While the CobiT framework encompasses the complete life cycle of IT investment, the SEI’s CMM is firmly rooted in the limited domain of improving the quality of software development. An important historical influence in the development of CMM is the concept of quality conformance and embedding quality into all of the stages of software development and deployment. This was strongly influenced by the quality assurance movement in manufacturing, particularly associated with Deming and Juran. CMM and CMMI (CMM Integration) build directly upon the work of Humphrey (Humphrey 1997, 1989).

5 CMM has now been retired by the SEI and replaced by CMMI. The operational details of CMM and CMMI are beyond the scope of this report. Interested readers may refer to www.sei.cmu.edu/cmmi/general/ and to the extensive literature on CMM and CMMI (e.g., Caputo 1998; Dymond 1995; Raynus 1999; Ahern et al., 2004; Chrissis et al., 2007; Dymond 2007; Garcia and Turner 2007; Kasse 2004). ITGI has provided mappings of CobiT 4.0 to both CMM and CMMI (ITGI 2006, 2007).

6 In CMMI, ‘capability level’ is used to designate process-level maturity. Further, ‘maturity level’ is used to designate enterprise-wide maturity. In line with CobiT, the term ‘maturity’ refers to process maturity.


For enterprises to set their own target levels, they must determine answers to questions such as the following:
• Where are we now? What is our current process maturity level for each of our IT processes?
• At what levels are our peers? What is the process maturity level for similar enterprises?
• Where should we be? At what maturity level should we be? Overall? For given processes?
• What are our critical gaps? What are the gaps between our current process maturity and where we should be? How quickly should we resolve these gaps? How should we prioritise resolving those gaps?
• What will it take? What resources will it take to increase process maturity levels? How do we obtain those resources (e.g., reallocate resources in the current IT budget and/or request additional resources for the IT budget)?
• What is the return on investment? What will be the return on investment from increasing maturity levels?

Unfortunately, there is very little information to help answer each of these questions.7 The current study is a response to this need. The objectives of the study are to:
• Collect process maturity data from a wide variety of enterprises to develop preliminary benchmarks for each maturity attribute/IT process combination
• Collect IT demographics to perform an initial analysis of process maturity measures vs. IT demographics as a starting point for benchmarking profiles for different demographic combinations
• Provide guidance for enterprises to conduct their own self-assessment to compare their process capability maturity measures with the benchmarks

Process Maturity Model in CobiT

When adopted into CobiT, CMM required adjustment. As mentioned previously, CMM has its roots in software development. The process of software engineering and development is a challenging but relatively well-bounded task compared with the complete range of tasks over the entire life cycle of IT investment. In CMM, maturity advances logically from one discrete level to the next (e.g., from defined to managed).

While this works well in the relatively confined domain of software development, it does not fit quite so neatly in the CobiT environment, which is designed to manage activities as disparate as strategic planning, service delivery and the management of physical infrastructure. As discussed in the following paragraphs, CobiT provides generic, attribute-specific and process-specific definitions of each of the levels of process maturity.

Figure 8 shows the generic definitions for the maturity levels.


7 ITGI’s CobiT® Online provides benchmarking data that serve as a useful first cut at maturity levels. Unfortunately, it provides limited ability to group or analyse the maturity data. Given that the data are from anonymous contributors, they are difficult to validate.

Figure 8—Generic Maturity Model (CobiT 4.1)

0 Non-existent

There is a complete lack of any recognisable processes. The enterprise has not even recognised that there is an issue to be addressed.

1 Initial/ad hoc

There is evidence that the enterprise has recognised that the issues exist and need to be addressed. There are, however, no standardised processes; instead there are ad hoc approaches that tend to be applied on an individual or case-by-case basis. The overall approach to management is disorganised.

2 Repeatable but intuitive

Processes have developed to the stage where similar procedures are followed by different people undertaking the same task. There is no formal training or communication of standard procedures, and responsibility is left to the individual. There is a high degree of reliance on the knowledge of individuals and, therefore, errors are likely.


CobiT recognises that the level of maturity may differ widely amongst processes within a given enterprise. The path to higher levels of enterprise maturity may also differ across domains and processes.

To define a process’s maturity more finely, CobiT subdivides process maturity into the six attributes shown in figure 2, rather than assigning just one maturity level to each process.

Each attribute of process maturity is necessary for the development of maturity measures. For each maturity level, CobiT provides carefully developed language that gives a qualitative indication of the corresponding level of maturity for each attribute. Figure 9 shows the generic textual description of the levels of maturity for the ‘awareness and communication’ attribute. The complete set of generic textual descriptions for five of the six attributes of process maturity is reproduced in figure 89 in appendix 1.

CobiT provides a second set of process maturity levels. In addition to the set of generic descriptions, CobiT also has a set of statements that are specific for each process and aligned with the levels of the process maturity model. Figure 10 reproduces the maturity model specifically for CobiT process AI6 Manage changes.

Figure 8—Generic Maturity Model (CobiT 4.1) (cont.)

3 Defined process

Procedures have been standardised and documented, and communicated through training. It is mandated that these processes should be followed; however, it is unlikely that deviations will be detected. The procedures themselves are not sophisticated but are the formalisation of existing practices.

4 Managed and measurable

Management monitors and measures compliance with procedures and takes action where processes appear not to be working effectively. Processes are under constant improvement and provide good practice. Automation and tools are used in a limited or fragmented way.

5 Optimised

Processes have been refined to a level of best practice, based on the results of continuous improvement and maturity modelling with other enterprises. IT is used in an integrated way to automate the workflow, providing tools to improve quality and effectiveness, making the enterprise quick to adapt.

Figure 9—Maturity Attributes: Awareness and Communication

Maturity Level Textual Description

1 Initial/ad hoc Recognition of the need for the process is emerging. There is sporadic communication of the issues.

2 Repeatable but intuitive There is awareness of the need to act. Management communicates the overall issues.

3 Defined There is understanding of the need to act. Management is more formal and structured in its communication.

4 Managed and measurable There is understanding of the full requirements. Mature communication techniques are applied and standard communication tools are in use.

5 Optimised There is advanced, forward-looking understanding of requirements.

Figure 10—AI6 Process Maturity Model

Management of the process Manage changes that satisfies the business requirement for IT of responding to business requirements in alignment with the business strategy, whilst reducing solution and service delivery defects and rework is:

0 Non-existent when

There is no defined change management process and changes can be made with virtually no control. There is no awareness that change can be disruptive for IT and business operations, and no awareness of the benefits of good change management.

1 Initial/ad hoc when

It is recognised that changes should be managed and controlled. Practices vary, and it is likely that unauthorised changes take place. There is poor or non-existent documentation of change, and configuration documentation is incomplete and unreliable. Errors are likely to occur together with interruptions to the production environment caused by poor change management.

2 Repeatable but intuitive when

There is an informal change management process in place and most changes follow this approach; however, it is unstructured, rudimentary and prone to error. Configuration documentation accuracy is inconsistent, and only limited planning and impact assessment take place prior to a change.


Elements of each of the six generic maturity attributes are embedded in the various process-level maturity models in CobiT. For example, at level 4 of the process-level maturity model for AI6, the statement ‘An approval process for changes is in place. Change management documentation is current and correct, with changes formally tracked.’ correlates closely with the policies, standards and procedures maturity attribute. However, not all generic maturity attributes are represented at each level of process maturity. Taking level 4 as an example: awareness and communication; policies, standards and procedures; and goal setting and measurement are represented, but tools and automation, skills and expertise, and responsibility and accountability are not.

Organisation of the Report

This report is divided into seven chapters plus two appendices. Chapter 2 describes the study method and includes the demographics of the 51 enterprises that took part in the study. Chapter 3 presents the primary research findings. The first part presents the maturity levels for each of the 34 IT processes within the four domains. It then disaggregates the data along the attribute dimensions to show the maturity levels for each of the six attributes for each of the four domains. It includes two levels for each attribute: ‘as is’ and ‘to be’ (the level expected in 12 months). Chapter 4 provides the most disaggregated data, in that it provides the maturity level for each attribute for each process. Up to that point, all of the analysis is based on the 51 enterprises taken as a whole. Chapter 5 divides the 51 enterprises along several different demographic variables (e.g., geographic location, industry classification) to determine whether the maturity levels vary across these dimensions. Chapter 6 presents a general set of instructions for enterprises to perform a self-assessment against the benchmark data in this report. Chapter 7 includes concluding comments plus recommendations for future research to expand on the findings in this study.

Appendix 1 includes the maturity attribute table and appendix 2 includes a series of tables. The first table includes the average maturity levels by attribute within each process for the 51 enterprises as a group. The other tables include data divided by demographic variables.

Figure 10—AI6 Process Maturity Model (cont.)

3 Defined when

There is a defined formal change management process in place, including categorisation, prioritisation, emergency procedures, change authorisation and release management, and compliance is emerging. Workarounds take place and processes are often bypassed. Errors may still occur and unauthorised changes occasionally occur. The analysis of the impact of IT changes on business operations is becoming formalised, to support planned rollouts of new applications and technologies.

4 Managed and measurable when

The change management process is well developed and consistently followed for all changes, and management is confident that there are minimal exceptions. The process is efficient and effective, but relies on considerable manual procedures and controls to ensure that quality is achieved. All changes are subject to thorough planning and impact assessment to minimise the likelihood of post-production problems. An approval process for changes is in place. Change management documentation is current and correct, with changes formally tracked. Configuration documentation is generally accurate. IT change management planning and implementation are becoming more integrated with changes in the business processes, to ensure that training, organisational changes and business continuity issues are addressed. There is increased co-ordination between IT change management and business process redesign. There is a consistent process for monitoring the quality and performance of the change management process.

5 Optimised when

The change management process is regularly reviewed and updated to stay in line with good practices. The review process reflects the outcome of monitoring. Configuration information is computer-based and provides version control. Tracking of changes is sophisticated and includes tools to detect unauthorised and unlicensed software. IT change management is integrated with business change management to ensure that IT is an enabler in increasing productivity and creating new business opportunities for the enterprise.


2. Study Method and Data Collection

The basic concepts underlying the process maturity model are relatively easy to understand. However, it is not straightforward to apply those concepts, in the ‘real world’, to the complete range of processes that map to the life cycle of IT investment. A principal objective of this study is to obtain benchmark maturity levels for each of the 34 IT processes in CobiT. It was recognised from the initial stages of research design that research techniques such as surveys or focus groups were inappropriate for collecting detailed, quantifiable process maturity benchmark data. Rather, a research method was required that collected data at the level of individual entities. A field study was determined to be the appropriate technique.

The CobiT framework provides a comprehensive set of processes that encompasses the complete IT life cycle. While adoption of CobiT was not a prerequisite for participation in the study, the goal was to ensure that the study traversed the complete range of activities that a typical IT organisation would likely perform. The CobiT framework provides that complete coverage in its four domains and 34 IT processes. Based on prior evaluations of process performance, four CobiT processes were divided into subprocesses because of the complexity and importance of the process (e.g., DS5 Ensure systems security) or because of markedly different concepts embedded within the process (e.g., the data classification and enterprise architecture concepts within PO2 Define the information architecture). As a result, a total of 41 processes was used for the project (figure 1).

Given the use of CobiT as the underpinning for the study, the next important question was how to measure process maturity. It was determined that the appropriate method was to collect perceptions of maturity levels from practising managers within the enterprises during the field study. A principal advantage of this technique was the ability to cost-effectively visit and collect data at a significant number of enterprises. The managers who were interviewed took the process very seriously. At the enterprises where data were collected from more than one person for a given process, the between-person variation was typically within one level of maturity. This information is, of course, self-reported and subject to bias, and it proved difficult to independently validate the responses by inspection of policies and procedures or through other techniques. However, as will be evident in the following chapters, the number of level 0 and level 1 responses suggests that the respondents were candid in the information they provided.

As discussed in the introduction to this report, CobiT provides two techniques for measuring maturity levels: generic and process-specific. Collecting maturity levels with the generic process maturity attribute table allows for viewing maturity as a matrix: by process (rows) and by attribute (columns). Previous research showed that collecting data using the process-specific maturity descriptions would be significantly more time-consuming than using the generic maturity descriptions. The additional time for data collection might jeopardise the willingness of enterprises to commit to the study. The use of the process-level maturity descriptions is discussed later in this chapter. As detailed later in this report, managers in this study worked well with the generic process maturity attribute table.
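As an illustration of that matrix view, the hedged Python sketch below stores one ‘as is’ level per process/attribute cell and reads the data out by row (process) and by column (attribute). The cell values are invented for illustration; only the layout reflects the approach described here:

ATTRIBUTES = ["Awareness", "Policies", "Technology",
              "Skills", "Responsibility", "Goals"]

# Processes are rows, the six generic attributes are columns.
# All maturity values below are illustrative, not study data.
maturity = {
    "PO1": [3.0, 2.0, 1.5, 2.5, 2.5, 2.0],
    "AI6": [3.0, 3.0, 2.0, 2.5, 3.0, 2.0],
}

# Row view: the six attribute levels for one process
print(dict(zip(ATTRIBUTES, maturity["AI6"])))

# Column view: one attribute across all processes
col = ATTRIBUTES.index("Technology")
print({process: levels[col] for process, levels in maturity.items()})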

The next task was to find the field study sites, which required co-operation from many enterprises around the world. Data were collected in a number of countries and across a range of enterprises. The primary criterion for selection was the size of the IT function, and data were collected on all 34 processes in CobiT. This includes processes as disparate as application development, security, facilities management, and strategy and risk management. Only IT organisations of a reasonable size are likely to include the complete range of processes. With only a few exceptions, the minimum number of employees in the IT function in the study was 25. There were a couple of very large IT organisations in the study, each with more than 500 staff members within the IT function. Typically, however, the enterprises in the study had 40 to more than 100 staff members in IT. In the final analysis, a total of 51 enterprises spread across eight countries in Europe, North America and Asia were visited.8 Participation in the study allowed the enterprises to view themselves against peers in their industries and countries and in other industries and countries. The study was pitched to CIOs as a ‘win-win’ proposition.


8 For a dozen sites, data were collected by a senior IT governance professional using the same data collection procedures as those utilised for the rest of the study.


Method

The following provides an overview of the general method employed for the investigation. Given the tremendous variations amongst enterprises, time pressures and the differing needs of enterprises, there were some variations in data collection.

The research team:
1. Identified CIOs who were willing to participate in the project, using many resources to obtain introductions to CIOs.9 The most common source was local ISACA chapters. Ernst & Young provided several leads in the Los Angeles area. Personal contacts from the CobiT Steering Committee also proved useful.
2. Sent an introductory e-mail message to each CIO, explaining the project. The e-mail typically had three attachments: a letter from the director of research at ISACA that provided a one-page introduction to the project; a two-page set of project frequently asked questions (FAQs); and a spreadsheet that listed and described the 41 IT processes.10 It was stressed to the CIOs that data for their enterprise would be treated confidentially and no specific enterprise name would be associated with any specific information in any report. When requested, the researchers signed non-disclosure agreements. Whilst CobiT provides the foundation of IT processes for the study, active use of CobiT was not a requirement for the study. As compensation for helping with the project, the CIOs received:
– An e-mail message, sent within two days, containing a spreadsheet with all the data collected from their enterprise11
– A second spreadsheet, via e-mail after all the data collection was completed, that showed their process maturity levels next to the aggregated process maturity levels from all the enterprises in the study
– A copy of this report
Using the spreadsheet, the CIO listed the managers responsible for each process. Since any one manager might be responsible for three to ten processes, the list of 41 processes would generally include three to five different names—frequently including the CIO as one of the people responsible. Conversely, some processes had multiple owners.
3. Scheduled a convenient day(s) when the researchers could interview the managers and the CIO. Depending on the number of processes associated with each manager, each interview usually required 45 to 60 minutes (with some exceptions that required up to a couple of hours). In a limited number of cases, interviews were conducted in a group setting with several managers.

Collecting Process Maturity Levels

At the start of each interview, the researcher provided a brief description of the project and the protocol for the interview. The interviewee was given a copy of the generic process maturity attribute table. The columns were the six attributes and the rows were the scale from 1 to 5. Each cell included a brief description of the characteristics for each level for a specific attribute (see discussion above and figure 89 in appendix 1).

Taking one process at a time, the interviewer introduced the process control objectives and summarised the detailed control objectives (from the CobiT manual) for the interviewee. The interviewee read the attribute descriptions from the process maturity attribute table and matched the state of their enterprise’s maturity for that process to a maturity level for each of the six attributes. They then stated, out loud, the maturity level for each of the six attributes, and the researcher recorded the responses.

The interviewee stated two numbers for each attribute. The first number represented the current level of maturity (‘as is’) and the second number was the expected level of maturity one year in the future (‘to be’). Respondents were allowed to select a level in between the given discrete levels of process maturity (e.g., 1.6, 2.4) to reflect that they had not yet achieved every aspect of the next level. The survey found that respondents could normally see that they had reached a particular level, but had not necessarily achieved the next discrete level. Zero was also allowed as a response, but with a slightly different definition than in the SEI’s CMM.


9 Throughout this report, ‘CIO’ is used as a generic term to refer to the primary, top-level contact at each enterprise. In some enterprises, the top-level person did not have the CIO title. Instead, they might have a title similar to senior vice president of IT. Also, in very large enterprises, there might be a hierarchy of CIOs, with a corporate CIO at the top and individual division CIOs (or equivalents) reporting to the corporate CIO. In those situations, the focus was on the division CIOs because the corporate CIOs were typically too far removed from the day-to-day IT processes to provide the detailed process maturity data needed.

10 Based on feedback early in discussions with IT professionals, some of the 34 IT processes in CobiT were disaggregated, resulting in a total of 41 processes. This is discussed further in chapter 4.

11 Several CIOs saw that data as a starting point to conduct their own annual assessments of their process maturities.


In CMM, level 0 means non-existent—the process does not exist at all. Some of the subjects in this study wanted to use 0 to indicate that the maturity of a particular process was still significantly lower than 1—but that the process did exist. As such, throughout this report, 0 is the indicated maturity level measure, not an indicator that the process did not exist at all.

Regarding the ‘to be’ data, it was stressed to the interviewee not to assign a higher maturity level just because it was hoped that maturity would improve in the future. Instead, the maturity level should be expected to change only if the IT function had a specific initiative under way and funding had been obtained.
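A minimal sketch of how one attribute response could be recorded under this protocol (hypothetical Python helper, not the study’s actual instrument; it only enforces the 0–5 scale with the fractional levels described above):

def record_response(as_is, to_be):
    """Validate one attribute response. Levels run 0-5 and may be
    fractional (e.g., 1.6) to show the next level is only partly
    achieved; 'to be' is the level expected in 12 months."""
    for label, value in (("as is", as_is), ("to be", to_be)):
        if not 0 <= value <= 5:
            raise ValueError(f"'{label}' level {value} is outside the 0-5 scale")
    return as_is, to_be

# Example: current maturity 1.6, expected to reach 2.4 in a year
print(record_response(1.6, 2.4))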

The interviewers also sought additional information and challenged the manager on a subset of the processes. When a manager was being interviewed on several processes, it was particularly important to ensure that the maturity levels were measured correctly early in the data collection cycle. Examples of validation questions included, ‘How is management’s awareness of this process communicated to the IT organisation?’ and ‘What are some of the tools and technologies supporting this process?’ The pattern established early on was then extended to the other processes for the particular interviewee. This validation process had to be handled judiciously, because data collection could take an unacceptable amount of time if every data point were challenged.

With some exceptions, respondents understood the attributes and the descriptions of the attribute levels. For some respondents, it took a little time to recognise that the process maturity statements were to be applied only to the particular process under discussion. For example, while metrics, tools or techniques might exist for one process, they might not necessarily apply to another. Further, some managers could not initially recognise the concept of goals that related to a particular process. There were more difficulties with the ‘goal setting and measurement’ attribute than with the others. Managers often grasped at broad enterprise goals rather than process-specific goals. Similarly, managers would cite generic performance measures or metrics that related to other processes rather than those that measured performance for the process under discussion. Each of these issues was resolved through discussion, use of appropriate analogies and additional questioning of the respondent.

Collecting IT Governance and Demographic Data

A separate questionnaire was used to interview the CIO to collect IT governance and demographic information for the enterprise. A wide variety of issues were investigated that have previously been identified as relevant to the study of IT governance. These included:
• The nature and extent of strategic and tactical alignment between IT and the rest of the enterprise (business/IT alignment)
• The structure of the IT function, for example, centralised, decentralised and so-called federal modes
• The alignment of IT governance with enterprise-wide governance
• The characteristics and role of monitoring over IT
• The adoption of IT governance processes and frameworks
• The breadth and depth of outsourcing

Demographic data were also collected on aspects such as size, industry classification and spending.

Study Sample

As figure 11 shows, 51 enterprises were included in the study, with representation from North America, Asia and Europe.

Figure 11—Study Sites by Geographic Location

Location                       Frequency   Percent
Austria/Germany/Switzerland        14        27%
Canada                              3         6%
Mexico                              4         8%
Philippines                         8        16%
Singapore                           4         8%
USA                                18        35%
Total                              51       100%


A prerequisite for participating in the study was that the enterprise performed most of the 41 processes encompassed in the study. As a result, the enterprises in the study were relatively large. Figure 12 shows three scope (size) measures for the participating enterprises. The largest enterprise had the equivalent of 690 full-time IT employees, and the average number of IT staff was 172.

Even though the 51 enterprises had significant IT operations, the complete set of the 41 IT processes did not exist at all of these enterprises. Figure 13 shows the number of processes captured from the 51 enterprises. As the table shows, 81 percent had 38 or more of the processes. For the few enterprises that had fewer than 38 processes, the reasons were either that those processes did not apply to them or they had no controls or management practices in place for those processes.

As figure 14 illustrates, most enterprises in the study have mixed IT environments, with 98 percent using Wintel servers and 94 percent using UNIX. Mainframes were also used to varying degrees by a large minority (34 percent) of the sample enterprises.

Figure 12—Scope of IT Operations

Number                      Mean    Median   Maximum
IT staff members             172      120       690
Application systems           82       50       280
Clients (or workstations)  3,132    2,050    15,000

Figure 13—Number of Processes Captured (n=51)

Number of      Percent of   Number of     Percent of
Processes*     Maximum      Enterprises   Population
    14            34%            1             2%
    23            56%            1             2%
    26            63%            1             2%
    32            78%            1             2%
    34            83%            1             2%
    36            88%            2             4%
    37            90%            3             6%
    38            93%            5            10%
    39            95%            6            12%
    40            98%           10            20%
    41           100%           20            39%

*38 = Average number of processes

Figure 14—Hardware Used

Hardware         Not Used   Some Usage   Important But Not Central   Core Technology
Mainframe           66%          2%                  8%                    24%
UNIX/Linux           6%         16%                 16%                    62%
AS/400              80%          8%                  2%                    10%
Wintel servers       2%         12%                 30%                    56%
Other               92%          0%                  4%                     4%


IT Governance

As shown in figure 15, centralised IT governance was, by far, the most common structure for the enterprises in this study (74 percent). Only 6 percent indicated a decentralised structure. Twenty percent of respondents use a federal structure, which combines centralised operations activities and decentralised application activities.

The survey also sought to determine which IT governance and IT management frameworks were used by the enterprises. Figure 16 shows that both CobiT and ITIL were frequently used in the enterprises studied. Of course, there is likely to have been a self-selection bias, as enterprises that use CobiT (or CobiT and ITIL) would be more likely to volunteer to participate in the study. Interestingly, very few of the enterprises in the study thoroughly followed either CobiT (16 percent) or ITIL (10 percent). Only 6 percent (one mid-sized manufacturer and two large financial-sector institutions) said that they thoroughly follow both CobiT and ITIL.

Level of Outsourcing

The majority of the enterprises in the study outsourced some aspects of their IT activities. Specifically, 76 percent of respondents outsourced some aspects of their software functions and 65 percent some aspects of their hardware functions. Respondents outsourced an average of 3.7 of the nine areas of software development and services listed in figure 17 and 2.4 of the eight hardware areas (the maximum number of areas reported by any enterprise was eight in each category).

Figure 15—Organisation of the IT Function

Governance Over IT                                   Percent
Centralised
  • With corporate management primarily in control     20%
  • With IT management primarily in control            54%
  Total                                                74%
Decentralised
  • With corporate management primarily in control      2%
  • With IT management primarily in control             4%
  Total                                                 6%
Federal                                                20%
TOTAL                                                 100%

Figure 16—IT Governance Frameworks

Framework                 Not Used   Influences Own Standards   Partially Followed   Thoroughly Followed
CobiT                         8%               32%                     44%                   16%
ITIL®                        14%               30%                     46%                   10%
ISO 17799 and ISO 27001      36%               24%                     32%                    8%
CMM and CMMI                 52%               32%                     14%                    2%
PRINCE2                      84%               14%                      2%                    0%
PMBOKTM                      48%               22%                     22%                    8%


Figure 17—Software, Services and Hardware Outsourced

Category                        N/A   Some Usage   Important But Not Central   Central to the Mission
Software and Services
  Software development          32%       22%                 18%                     28%
  Program/user documentation    44%       22%                 18%                     16%
  Quality assurance             62%       20%                  8%                     10%
  Other application development 92%        0%                  0%                      8%
  Operations                    52%       18%                  8%                     22%
  Help desk                     50%       12%                 16%                     22%
  Security                      52%       26%                  8%                     14%
  Telecommunications            52%       20%                 16%                     12%
  Other service/delivery        86%        0%                  4%                     10%
Hardware
  Mainframe                     92%        2%                  0%                      6%
  Servers                       61%       16%                  6%                     16%
  Desktops                      72%        6%                 10%                     12%
  Backup                        48%       10%                 14%                     28%
  Network                       60%       18%                  4%                     18%
  Network management            64%       20%                  4%                     12%
  Telecommunications            60%       16%                 10%                     14%
  Other hardware                96%        2%                  0%                      2%


3. Research Findings

Overall Results

For each enterprise, a total of 492 data points was collected, representing both ‘as is’ and ‘to be’ points in time for six attributes for each of 41 processes (2 × 6 × 41). When calculating the overall maturity level for one process, a simple average was taken of the six attributes associated with that process. If more than one manager was interviewed for a given process, a simple average was taken of their responses. For a small number of enterprises, only overall process maturity level data were collected.
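A hedged sketch of this aggregation (assumed Python helper, not the study’s actual tooling): responses from multiple managers are averaged attribute by attribute, then the six attribute means are averaged into one process-level figure.

from statistics import mean

def process_level(responses):
    """Overall 'as is' maturity for one process.

    responses: one list of six attribute levels per interviewed
    manager. Managers are averaged attribute by attribute; the six
    attribute means are then averaged into a single process level."""
    attribute_means = [mean(levels) for levels in zip(*responses)]
    return mean(attribute_means)

# Two managers rated the same process; the values are illustrative.
manager_a = [3.0, 2.5, 2.0, 2.5, 3.0, 2.0]
manager_b = [3.5, 2.5, 2.0, 3.0, 3.0, 2.5]
print(round(process_level([manager_a, manager_b]), 2))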

Figures 18 through 26 illustrate the process maturity levels in a series of box plots. These box plots depict the numeric data through five-number summaries: the lowest observation, lower quartile (Q1), median, upper quartile (Q3) and highest observation. The interquartile range (IQR) is calculated by subtracting the first quartile (Q1) from the third quartile (Q3). The end points of the whiskers are 1.5 IQR below Q1 and 1.5 IQR above Q3. Any responses outside those limits are outliers, shown as dots on the plots. The IQR (the length of the box) can be considered a measure of the consensus amongst the respondents: a relatively long box indicates low consensus (a wider distribution of responses) and a relatively short box indicates high consensus (a narrower distribution of responses).
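The five-number summary and whisker rule just described can be expressed in a few lines of Python (a sketch only; quartile conventions differ slightly between statistical packages, so this is one plausible reading):

from statistics import median, quantiles

def box_plot_summary(data):
    """Five-number summary plus the 1.5*IQR whisker end points used
    in the box plots; responses beyond the whiskers are outliers."""
    q1, _, q3 = quantiles(data, n=4)
    iqr = q3 - q1
    lower, upper = q1 - 1.5 * iqr, q3 + 1.5 * iqr
    return {
        "min": min(data), "Q1": q1, "median": median(data),
        "Q3": q3, "max": max(data),
        "whiskers": (lower, upper),
        "outliers": [x for x in data if x < lower or x > upper],
    }

# Illustrative 'as is' maturity levels reported for one process
levels = [0.5, 1.5, 2.0, 2.2, 2.5, 2.7, 3.0, 3.1, 3.5, 4.8]
print(box_plot_summary(levels))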

The Big Picture

Figure 18 (identical to figure 3, but duplicated here for convenience) illustrates highly aggregated data for the four process domains. Four points are evident in the figure. First, the subjects indicated that the maturity levels were expected to increase by approximately 0.5 in the next 12 months. Second, the Monitor and Evaluate (ME) domain had the lowest maturity levels. In addition, because its box is the longest (Q3 – Q1), the ME domain also had the lowest consensus of all the domains. The ‘as is’ responses for Deliver and Support (DS) domain had the widest range of responses, extending from 0 to almost 5, which illustrates the third and fourth points: even large, mature enterprises can be at level 0, and level 5 is achievable. These four points will be expanded on with more disaggregated data in the following discussions.

Figure 18—Overall Process Maturity by Domain
[Box plot: process maturity (0–5), ‘as is’ and ‘to be’ responses, for each of the four domains (PO, AI, DS, ME)]


Process-level Maturity Within Each Domain

Figures 19 through 22 (also shown in figure 4, but duplicated here for convenience) present the data disaggregated by the 34 processes included in the four domains.12 Figure 19 includes the 10 processes in the Plan and Organise (PO) domain. As shown in the figure, the median maturity levels ranged from approximately 1.7 to 2.7, with PO7 Manage IT human resources having the highest median, and PO2 Define the information architecture and PO8 Manage quality the lowest. An interesting aspect of the figure is that the low-end points of the whiskers fall below level 1 for nine of the 10 processes. This illustrates two points: the maturity of processes can be very low, even in large, mature IT organisations; and the subjects were candid in their answers, given that they were willing to give the researchers such low numeric responses.

While there are many processes that have some interaction with corporate processes and controls outside the IT function, there are several processes in CobiT that are particularly closely tied to generic corporate processes. For example, the process PO5 Manage the IT investment links with finance and accounting processes within the enterprise, at least with respect to the budgeting and expenditure control components of the process. PO7 Manage IT human resources links with corporate human resources processes. The tight couplings between these IT processes and processes outside of IT could explain their relatively high maturity.

Conversely, the processes in this domain showing the lowest maturity are particularly challenging for IT organisations. PO2 Define the information architecture and PO8 Manage quality both require systemic change and significant human resources and monetary investment to achieve higher levels of maturity. These barriers may help to explain the relatively low levels of maturity. These processes are discussed further in chapter 4.

Finally, in this discussion of the PO domain, the results for PO10 Manage projects are notably high—with one important caveat. It was clear that significant investment has gone into the process of project management in many IT organisations, resulting in higher levels of maturity. However, PO10 is the average of Manage projects—Programme management (PO10PG) and Manage projects—Project management (PO10PJ). PO10PG has a very different and much lower level of maturity than PO10PJ. This shows that there is still much to be done in the management of IT investment programmes.

12 As mentioned earlier, data were collected at the attribute level. The process-level results in this section are the averages of the applicable attribute levels. Also, 41 processes were used by subdividing some of the 34 processes in CobiT; in this section, those 41 processes are averaged back into the appropriate 34 processes.

Figure 19—Process Maturity for the Plan and Organise (PO) Domain
[Box plot: ‘as is’ process maturity (0–5) for PO1 through PO10]


Figure 20 shows the seven processes included in the Acquire and Implement (AI) domain. As shown in the figure, the median maturity levels ranged from approximately 2.4 to 3.0, with AI5 Procure IT resources having the highest median level of process maturity and AI4 Enable operation and use the lowest. The maturity level for AI5 is not surprising, given the maturity of corporate acquisition and procurement processes with which the IT function interacts.

Conversely, the result for AI4 is particularly interesting. The whisker extends all the way to the zero level, indicating that some interviewees believed that their IT organisation had not even achieved the most basic level 1 maturity. As indicated previously, this was more than a little surprising considering the size (and general maturity) of the IT organisations included in this study.

Figure 21 includes the 13 processes included in the Deliver and Support (DS) domain. As shown in the figure, the median maturity levels ranged from approximately 2.0 to 3.2, with DS12 Manage the physical environment having the highest median and DS1 Define and manage service levels having the lowest. Both DS2 Manage third-party services and DS4 Ensure continuous service have whiskers extending all the way to zero.

On the other hand, DS6 Identify and allocate costs, DS7 Educate and train users, and DS12 Manage the physical environment have whiskers that extend almost to 5, indicating that they have attained the highest level of maturity at some organisations. This raises an interesting side question: What does it mean that nobody reported a level 5 for some processes? Does that mean that a 5 cannot be achieved for those processes? Or does it really mean that it is achievable, but nobody has pushed to that level because the perceived benefits of achieving level 5 do not justify the costs? These important questions should be addressed in future research.

It seems clear that many IT organisations have worked hard over the years to manage numerous day-to-day operational activities. The confidence with which managers addressed the management and controls in some processes such as DS11 Manage data, DS12 Manage the physical environment and DS13 Manage operations is impressive. These seem to be processes that have received a great deal of attention over many years. Similarly, there was considerable evidence that much investment has gone into each of the aspects of security. As can be seen in figure 21, while divergence was relatively low, there were outliers at either end of the distribution.


Figure 20—Process Maturity for the Acquire and Implement (AI) Domain
[Box plot: ‘as is’ process maturity (0–5) for AI1 through AI7]


Figure 22 includes the four processes in the Monitor and Evaluate (ME) domain. As shown in the figure, the median maturity levels were relatively low and close together, ranging from approximately 2.0 to 2.2. ME1 Monitor and evaluate IT performance had the highest median, and the other processes were clustered near 2.0. As can be seen in the figure, the processes in the ME domain had the widest distribution (lowest consensus). For ME2 Monitor and evaluate internal control and ME3 Ensure compliance with external requirements, the whiskers extend from 0 to nearly 5. As discussed in more detail in chapter 4, many enterprises had considerable difficulty in setting up systematic and formal monitoring processes.


Figure 21—Process Maturity for the Deliver and Support (DS) Domain
[Box plot: ‘as is’ process maturity (0–5) for DS1 through DS13]

Figure 22—Process Maturity for the Monitor and Evaluate (ME) Domain
[Box plot: ‘as is’ process maturity (0–5) for ME1 through ME4]


Disaggregated Attribute Data by Process Maturity Domain

Whereas figures 19 through 22 in the previous section disaggregated the data by specific processes within each domain, figures 23 through 26 disaggregate the domain data by the six maturity attributes.

Figure 23 depicts maturity levels for the Plan and Organise (PO) domain. The medians ranged from approximately 2 to 3, with the highest being awareness and the lowest being technology and goals. For three attributes, the whiskers extended from 0 to 5, illustrating the wide distribution of responses for those attributes. Keeping in mind that the length of the box indicates the level of consensus, skills and responsibility have the highest consensus and technology and goals have the lowest consensus.

In terms of the ‘to be’ responses vs. ‘as is’ responses, the ‘to be’ medians were a little higher and the consensus levels were also greater, which was primarily due to those who gave low ‘as is’ values and then increased their ‘to be’ values.

Figure 24 features the Acquire and Implement (AI) domain. The medians ranged from approximately 2 to 3, with the highest being awareness, policies, and responsibilities, and the lowest being technology. For three attributes, the whiskers extended from 0 to 5, illustrating the wide distribution of responses. Skills had the highest consensus and policies, technology, and goals had the lowest consensus. In terms of the ‘to be’ responses vs. ‘as is’ responses, the ‘to be’ medians and consensus were slightly higher.

Figure 25 shows the Deliver and Support (DS) domain. The medians ranged from approximately 2 to 3, with the highest being awareness and responsibilities, and the lowest being goals. For four attributes, the whiskers extended from 0 to 5. Skills had the highest consensus and technology had the lowest. The ‘to be’ medians and consensus were a little higher than the ‘as is’ responses.


Figure 23—Process Attributes for the Plan and Organise (PO) Domain
[Box plot: ‘as is’ and ‘to be’ maturity (0–5) for the six attributes: Awareness, Policies, Technology, Skills, Responsibility, Goals]

Figure 24—Process Attributes for the Acquire and Implement (AI) Domain
[Box plot: ‘as is’ and ‘to be’ maturity (0–5) for the six attributes]

Figure 26 features the Monitor and Evaluate (ME) domain. The medians ranged from approximately 1.5 to 2.8, with the highest being awareness and the lowest being technology. For four attributes, the whiskers extended from 0 to 5, illustrating the wide distribution of responses. Responsibility had the highest consensus and goals had the lowest consensus. In terms of the ‘to be’ responses vs. the ‘as is’ responses, the ‘to be’ medians and consensus were a little higher.

Figure 27 illustrates the relative maturity values for the four domains and six attributes. Red indicates the highest level of maturity, pink indicates a medium level and white indicates the lowest level. As the figure demonstrates, awareness was the highest-ranked (or tied for highest) attribute for every domain. Responsibility came in a close second by being tied for the highest attribute for three domains and medium for one domain. At the other end, goals was the lowest-ranked (or tied for lowest) attribute for all domains, and the technology and tools attribute was the lowest-ranked (or tied for lowest) attribute for three domains. The policies and procedures attribute varied widely, coming in high for one domain, medium for two domains and lowest for one domain.
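The red/pink/white coding in figure 27 amounts to ranking the six attribute levels within each domain. A hedged sketch follows (the domain averages below are invented for illustration; the report encodes the real ranking only as colours):

def classify(domain_levels):
    """Within one domain, mark the highest attribute value(s) 'high',
    the lowest 'low' and everything in between 'medium'."""
    top = max(domain_levels.values())
    bottom = min(domain_levels.values())
    return {attr: ("high" if v == top else "low" if v == bottom else "medium")
            for attr, v in domain_levels.items()}

# Invented averages for one domain, for illustration only
po = {"Awareness": 2.9, "Policies": 2.3, "Technology": 2.0,
      "Skills": 2.4, "Responsibility": 2.7, "Goals": 2.0}
print(classify(po))

Note that ties are handled naturally: two attributes sharing the lowest value (as with technology and goals above) are both marked ‘low’, matching the ‘tied for lowest’ language in the text.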

The next chapter takes the most detailed look at the data by disaggregating the processes within the domains and the six attributes within each process.


Figure 25—Process Attributes for the Deliver and Support (DS) Domain
[Box plot: ‘as is’ and ‘to be’ maturity (0–5) for the six attributes]

Figure 26—Process Attributes for the Monitor and Evaluate (ME) Domain
[Box plot: ‘as is’ and ‘to be’ maturity (0–5) for the six attributes]

Figure 27—Relative Maturity Level for Each Domain and Attribute
[Colour-coded matrix: domains (PO, AI, DS, ME) as rows and attributes (Awareness, Policies, Technology, Skills, Responsibility, Goals) as columns; red = highest maturity, pink = medium, white = lowest]


4. Process Maturity by Domain and Process

This section presents the most detailed (disaggregated) ‘as is’ data: the six attributes within each of the 41 processes. The discussion in this chapter is organised by the 41 processes. Each process discussion starts with the control objective for the process and lists the detailed control objectives extracted from CobiT. Then there is a box plot, accompanied by commentary, that shows the overall results and the ‘as is’ results for the six attributes.

Plan and Organise Domain

PO1 Define a Strategic IT Plan

IT strategic planning is required to manage and direct all IT resources in line with the business strategy and priorities. The IT function and business stakeholders are responsible for ensuring that optimal value is realised from project and service portfolios. The strategic plan improves key stakeholders’ understanding of IT opportunities and limitations, assesses current performance, identifies capacity and human resource requirements, and clarifies the level of investment required. The business strategy and priorities are to be reflected in portfolios and executed by the IT tactical plan(s), which specifies concise objectives, action plans and tasks that are understood and accepted by both business and IT.

The control objectives within the process are:
• 1.1 IT value management
• 1.2 Business-IT alignment
• 1.3 Assessment of current capability and performance
• 1.4 IT strategic plan
• 1.5 IT tactical plans
• 1.6 IT portfolio management

As figure 28 illustrates, the awareness attribute had the highest median (approximately 3) and the top of its whisker nearly reached 5. Technology had the lowest median (approximately 1.5). Because its box is the shortest, technology also had the narrowest distribution (highest consensus) of responses, with many respondents stating numbers between 1 and 2. Based on the bottom of its whisker, technology was the only attribute for which some respondents indicated a level of 0 in their organisation. There was a great deal of variation in process maturity on several of the dimensions in the process. In many cases, respondents reported considerable progress with the systematic development of the IT strategic plan (PO1.4). The level of maturity in business/IT alignment (PO1.2) ranged from high to low. Some respondents had systematic approaches to engaging the organisation in the development of medium-term and short-term plans. Others found it very difficult to engage the enterprise in any aspect of strategic or tactical planning. A consistent theme in the discussions was that IT value management (PO1.1) and IT portfolio management (PO1.6) were at low levels of process maturity.


Figure 28—Process Maturity for PO1 Define a Strategic IT Plan
[Box plot: ‘as is’ maturity (0–5) overall and for each of the six attributes]


While managerial awareness and communication are perceived as relatively high, the other attributes are markedly lower in relation to other processes in the domain. There is a full maturity level of difference (level 3 vs. level 2) between awareness and communication and policies, standards and procedures. Several enterprises reported relatively low levels of formal policy setting over the managerial processes embedded in the various detailed control objectives. As might be expected for this process, the level of maturity for tools and automation (technology) is relatively low. Many enterprises rely on office productivity tools for their strategic planning. There were a couple of outliers that made very sophisticated use of tools, including intranet and database technologies, to manage aspects of the planning process.

PO2 Define the Information Architecture

The information systems function creates and regularly updates a business information model and defines the appropriate systems to optimise the use of this information. This encompasses the development of a corporate data dictionary with the organisation’s data syntax rules, data classification scheme and security levels. This process improves the quality of management decision making by making sure that reliable and secure information is provided, and it enables rationalising information systems resources to appropriately match business strategies. This IT process is also needed to increase accountability for the integrity and security of data and to enhance the effectiveness and control of sharing information across applications and entities.

The control objectives within the process are:
• 2.1 Enterprise information architecture model
• 2.2 Enterprise data dictionary and data syntax rules
• 2.3 Data classification scheme
• 2.4 Integrity management

There are two distinct aspects of PO2, which were used to divide this process into two separate subprocesses: information architecture (PO2A) and data classification (PO2D). The first of these subprocesses incorporated the detailed control objectives PO2.1 and PO2.2. The second incorporated PO2.3 and aspects of the final detailed control objective, PO2.4. As discussed in the previous chapter, the overall maturity on PO2 is relatively low.

Figure 29 illustrates the information architecture (PO2A) results. The awareness attribute had the highest median (approximately 2.7) and goals had the lowest (approximately 1). For goals, the median coincides with the bottom of the box, which indicates that 50 percent of the respondents gave 1 as their answer. Three of the attributes have whiskers extending to 5, three extend to 0, and responsibility extends from 0 to 5.

As with PO1, wide and surprising variation in the self-perceived maturity on this process was observed. One or two enterprises live and breathe their information architecture. They have been working systematically on the architecture over many years. These enterprises use well-established methodologies for the construction, mapping, tracking and maintenance of the information architecture. Their strategic and tactical development plans are built around the architecture. However, these enterprises are very much the exception to a broader rule. The other enterprises took a wide variety of approaches. In some, information architectures are well established and managed across technology lines (e.g., mainframe vs. client-server Windows and UNIX environments) or application lines. Many enterprises were working on information architectures, had staffed information architecture units and had acquired appropriate tools. In many cases, however, these efforts were at a relatively early stage in technology and application coverage. They were at an even earlier stage in embedding these tools into day-to-day decision making within the enterprise.

Compared to figure 29, the medians in figure 30 for the six attributes for data classification (PO2D) are closer together, ranging from 1.5 to 2. Again, these are relatively low levels of maturity and, surprisingly, are even lower than for information architecture. As can be seen, with the exception of skills, the whiskers on each of the attributes extend all the way down to 0 and 0.5. This indicates that, for many of the enterprises surveyed, this process was yet to be even marginally managed. It was clear that while many of the enterprises had managed access considerations in a mature fashion (discussed under DS5 Ensure systems security), data classification was much less mature. The results also did not indicate the same level of investment that characterises the management of enterprise architecture.
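
To make the data classification discussion concrete, the sketch below illustrates one possible shape for a scheme of the kind PO2.3 envisages: each classification level carries handling rules that downstream processes (such as the DS5 security controls) could enforce. The levels, rule fields and figures are illustrative assumptions, not CobiT prescriptions.

    from dataclasses import dataclass
    from enum import Enum

    class Classification(Enum):
        PUBLIC = 1
        INTERNAL = 2
        CONFIDENTIAL = 3
        RESTRICTED = 4

    @dataclass(frozen=True)
    class HandlingRule:
        encrypt_at_rest: bool
        access_review_days: int  # how often access rights are re-certified
        retention_years: int

    # Hypothetical policy table mapping each class to its handling rules.
    POLICY = {
        Classification.PUBLIC:       HandlingRule(False, 365, 1),
        Classification.INTERNAL:     HandlingRule(False, 180, 3),
        Classification.CONFIDENTIAL: HandlingRule(True, 90, 7),
        Classification.RESTRICTED:   HandlingRule(True, 30, 10),
    }

    print(POLICY[Classification.CONFIDENTIAL])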

Figure 31 illustrates the overall process maturity for PO2.


[Figure 29—Process Maturity for PO2A Define the Information Architecture—Architecture: box plot of overall maturity and the six attributes on a 0-5 scale]

[Figure 30—Process Maturity for PO2D Define the Information Architecture—Data Classification: box plot of overall maturity and the six attributes on a 0-5 scale]

[Figure 31—Process Maturity for PO2 Define the Information Architecture Overall: box plot of overall maturity and the six attributes on a 0-5 scale]

PO3 Determine Technological Direction

The information services function determines the technology direction to support the business. This requires the creation of a technological infrastructure plan and an architecture board that sets and manages clear and realistic expectations of what technology can offer in terms of products, services and delivery mechanisms. The plan is regularly updated and encompasses aspects such as systems architecture, technological direction, acquisitions plans, standards, migration strategies and contingency. This enables timely responses to changes in the competitive environment, economies of scale for information systems staffing and investments, as well as improved interoperability of platforms and applications.

The control objectives within the process are:
• 3.1 Technological direction planning
• 3.2 Technological infrastructure plan
• 3.3 Monitor future trends and regulations
• 3.4 Technology standards
• 3.5 IT architecture board


As illustrated in figure 32, awareness had the highest median (approximately 3) and technology had the lowest median (approximately 1). For technology, the median and the bottom of the box coincide at 1, indicating that 50 percent of the respondents selected 1 as their response. As seen in figure 32, there is tremendous diversity in the maturity of this process. Some enterprises can tick many of the ‘boxes’ in this process, including PO3.2 Technological infrastructure plan, PO3.4 Technology standards and PO3.5 IT architecture board, but by no means all of them. As discussed in more detail in the next chapter, an important element of this variation relates to size. Larger enterprises tend to have more sophisticated approaches to technology forecasting and planning. At the same time, many of the enterprises reported difficulties in engaging management in technology planning and forecasting and in building agreed-upon, sustainable approaches to technology migration.

PO4 Define the IT Processes, Organisation and Relationships

An IT organisation is defined by considering requirements for staff, skills, functions, accountability, authority, roles and responsibilities, and supervision. This organisation is embedded into an IT process framework that ensures transparency and control as well as the involvement of senior executives and business management. A strategy committee ensures board oversight of IT, and one or more steering committees in which business and IT participate should determine the prioritisation of IT resources in line with business needs. Processes, administrative policies and procedures are in place for all functions, with specific attention to control, quality assurance, risk management, information security, data and systems ownership, and segregation of duties. To ensure timely support of business requirements, IT is to be involved in relevant decision processes.

The control objectives within the process are:
• 4.1 IT process framework
• 4.2 IT strategy committee
• 4.3 IT steering committee
• 4.4 Organisational placement of the IT function
• 4.5 IT organisational structure
• 4.6 Establishment of roles and responsibilities
• 4.7 Responsibility for IT quality assurance
• 4.8 Responsibility for risk, security and compliance
• 4.9 Data and system ownership
• 4.10 Supervision
• 4.11 Segregation of duties
• 4.12 IT staffing
• 4.13 Key IT personnel
• 4.14 Contracted staff policies and procedures
• 4.15 Relationships

PO4 is an important process for setting the policy and process framework for the complete IT function. Given the wide range of management tasks embedded in this process, it was divided into two components: organisation (designated as PO4O) and processes (designated as PO4P). The first of these subprocesses covered PO4.4, PO4.5, PO4.6 and PO4.12. The second subprocess covered control objectives PO4.1, PO4.2, PO4.3, PO4.6 and PO4.7.

[Figure 32—Process Maturity for PO3 Determine Technological Direction: box plot of overall maturity and the six attributes on a 0-5 scale]


Figure 33 shows the level of process maturity for PO4O Define the IT processes, organisation and relationships—Organisation. It indicates a relatively high level of maturity for managerial awareness and roles and responsibility (approximately 3). This is in contrast to the relatively low level of maturity for each of the other attributes of process maturity.

Many of the enterprises in the study either were not using IT management frameworks, such as CobiT or ITIL, or were at early stages of adopting them. Each of these frameworks requires systematically identifying process ownership (albeit with narrower coverage in the case of ITIL), setting up more formal enterprise structures and identifying roles and responsibilities. Once again, the level of maturity for goals and metrics was low. Interestingly, however, the whiskers show a great deal of diversity, ranging from 0 almost up to level 5 for some enterprises.

Figure 34 shows the level of process maturity for PO4P Define the IT processes, organisation and relationships—Processes. It indicates a relatively high level of maturity for managerial awareness (approximately 3) and roles and responsibility (approximately 2.5). However, translating this into formalised policies and procedures (approximately 2) or designing and rolling out tools and supporting technologies (approximately 1) was a rather different situation. Several enterprises reported that their support for PO4.1 IT process framework was at a relatively low level. Many of the aspects of PO4 are related to other processes, in that they are enablers. For example, taking the managerial actions envisaged as part of PO4.1 will give rise to overall increases in maturity across most other processes.

Some outliers in the management of this process have constructed a systematic environment for rolling out and communicating policies and procedures. In other words, higher levels of maturity for tools and technologies are often coupled with higher maturity of policies and procedures. Those with higher levels of maturity were often using well-designed and thoroughly communicated intranet solutions. Some of these solutions used the intranet-based policy and procedure environment as a front end for approvals against particular procedures.

PO5 Manage the IT Investment

A framework is established and maintained to manage IT-enabled investment programmes that encompasses cost, benefits, prioritisation within budget, a formal budgeting process and management against the budget. Stakeholders are consulted to identify and control the total costs and benefits within the context of the IT strategic and tactical plans, and to initiate corrective action where needed. The process fosters partnership between IT and business stakeholders; enables the effective and efficient use of IT resources; and provides transparency and accountability into the total cost of ownership, the realisation of business benefits and the return on investment of IT-enabled investments.

[Figure 33—Process Maturity for PO4O Define the IT Processes, Organisation and Relationships—Organisation: box plot of overall maturity and the six attributes on a 0-5 scale]

[Figure 34—Process Maturity for PO4P Define the IT Processes, Organisation and Relationships—Processes: box plot of overall maturity and the six attributes on a 0-5 scale]


The control objectives within the process are:
• 5.1 Financial management framework
• 5.2 Prioritisation within IT budget
• 5.3 IT budgeting
• 5.4 Cost management
• 5.5 Benefit management

There are two quite distinct aspects of PO5, and data on each of these elements were collected. The first covers PO5.1 through PO5.4 and has been designated PO5B Manage the IT investment—Budgeting. It encapsulates the task of setting up management processes to build an appropriate budget, have it approved, and then track and evaluate expenditure against that budget. The second covers only PO5.5 Benefit management and has been designated PO5V Manage the IT investment—Value management. As figures 35 and 36 illustrate, breaking PO5 into these two processes is appropriate: the overall process maturity of PO5B is more than one maturity level higher than that of PO5V.

Many enterprises report that the key elements of PO5B are under thorough control. This is by no means a universal picture, however. As can be seen from figure 35, each of the maturity attributes, with the exception of management’s awareness and communication, ranges from maturity level 1 to level 5. The overall maturity level is high, but so is the dispersion. Maturity with this process is driven in part by the maturity of enterprise-wide budgeting and reporting processes and in part by the way budgeting and monitoring are managed within the IT function. Higher levels of maturity seem to require solid support from within the IT function for mature processes at the organisational level.
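
A minimal sketch of the budgeting-and-monitoring loop at the core of PO5B: expenditure is tracked against the approved budget and variances are flagged for corrective action. The categories and figures are hypothetical.

    # Hypothetical approved budget and actual spend, by category.
    budget = {"staff": 500_000, "hardware": 120_000, "software": 80_000}
    actual = {"staff": 480_000, "hardware": 150_000, "software": 75_000}

    for category, planned in budget.items():
        spent = actual.get(category, 0)
        variance = spent - planned  # positive means overspend
        pct = 100 * variance / planned
        flag = "OVER BUDGET" if variance > 0 else "ok"
        print(f"{category:10s} planned {planned:>9,} spent {spent:>9,} "
              f"({pct:+.1f}%) {flag}")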

The picture is very different when reviewing PO5V Manage the IT investment—Value management. The overall maturity is significantly lower, and a great deal of variation in the various process attribute maturities can be seen. There are a couple of interesting patterns in the data. Unusually, the maturity level of the responsibility attribute is generally higher than it is for policies and procedures for the process. The average level of maturity for skills is low and there is little variation on this particular dimension, although there are outliers that have what they perceived as appropriate staffing support and skill levels. These results reflect themes that came across in the interactions with managers. A number of enterprises have very little support for this process. Managers recognise the need for action on value management, but are finding it hard to make forward progress. Managers reported difficulty in finding staff members with the right mix of skills, including technology, business knowledge and analytical skills. Managers also noted challenges in interacting with organisation management in fulfilling this aspect of PO5.

[Figure 35—Process Maturity for PO5B Manage the IT Investment—Budgeting: box plot of overall maturity and the six attributes on a 0-5 scale]

[Figure 36—Process Maturity for PO5V Manage the IT Investment—Value Management: box plot of overall maturity and the six attributes on a 0-5 scale]


PO6 Communicate Management Aims and Direction

Management develops an enterprise IT control framework and defines and communicates policies. An ongoing communication programme is implemented to articulate the mission, service objectives, policies and procedures, etc., approved and supported by management. The communication supports achievement of IT objectives and ensures awareness and understanding of business and IT risks, objectives and direction. The process ensures compliance with relevant laws and regulations.

The control objectives within the process are:
• 6.1 IT policy and control environment
• 6.2 Enterprise IT risk and internal control framework
• 6.3 IT policies management
• 6.4 Policy, standard and procedures rollout
• 6.5 Communication of IT objectives and direction

As figure 37 illustrates, this process seems reasonably well under control. As with PO4, several enterprises have systematic approaches to the management and communication of policies and procedures. For others, however, management of policies, internal control frameworks and other aspects of this process is more haphazard. Of the various processes in the PO domain, PO6 is one that does not seem to fit neatly within IT organisations, nor does it necessarily rest easily within the enterprise. It fits more naturally in IT organisations with a formal IT governance function, where maturity levels are correspondingly high.

PO7 Manage IT Human Resources

A competent workforce is acquired and maintained for the creation and delivery of IT services to the business. This is achieved by following defined and agreed-upon practices supporting recruiting, training, evaluating performance, promoting and terminating. This process is critical, as people are important assets, and governance and the internal control environment are heavily dependent on the motivation and competence of personnel.

The control objectives within the process are:
• 7.1 Personnel recruitment and retention
• 7.2 Personnel competencies
• 7.3 Staffing of roles
• 7.4 Personnel training
• 7.5 Dependence upon individuals
• 7.6 Personnel clearance procedures
• 7.7 Employee job performance evaluation
• 7.8 Job change and termination

As figure 38 illustrates, this process is seen to be relatively well under control. Each of the attributes shows average maturities in the high 2s or low 3s. There is relatively little dispersion, although there are outliers, particularly at the higher reaches of maturity. As previously mentioned, this process has a high correlation with enterprise-wide human resources policies and programmes.

[Figure 37—Process Maturity for PO6 Communicate Management Aims and Direction: box plot of overall maturity and the six attributes on a 0-5 scale]


PO8 Manage Quality

A quality management system is developed and maintained that includes proven development and acquisition processes and standards. This is enabled by planning, implementing and maintaining the quality management system by providing clear quality requirements, procedures and policies. Quality requirements should be stated and communicated in quantifiable and achievable indicators. Continuous improvement is achieved by ongoing monitoring, analysis and acting upon deviations, and communicating results to stakeholders. Quality management is essential to ensure that IT is delivering value to the business, continuous improvement and transparency for stakeholders.

The control objectives within the process are:
• 8.1 Quality management system
• 8.2 IT standards and quality practices
• 8.3 Development and acquisition standards
• 8.4 Customer focus
• 8.5 Continuous improvement
• 8.6 Quality measurement, monitoring and review

There are a number of elements within PO8. Some have ‘harder’, more tangible aspects, such as the development of quality practices and of development and acquisition standards. Others have ‘softer’ aspects, such as the promulgation of continuous improvement and the development of a customer focus. As a result, some managers found it somewhat problematic to report process maturity levels for this process. Given this caveat, managers generally reported relatively low levels of maturity, as shown in figure 39. Again, there is relatively low dispersion, with few outliers. Very few enterprises reported systematic quality management approaches, such as adoption of ISO 9000. This is confirmed by the information gleaned from CIOs when they were asked about ISO 9000 adoption: it barely appears on their radar screens. More investigation will be needed to better understand the interaction of this process with other cognate processes, particularly in the AI domain.

[Figure 38—Process Maturity for PO7 Manage IT Human Resources: box plot of overall maturity and the six attributes on a 0-5 scale]

[Figure 39—Process Maturity for PO8 Manage Quality: box plot of overall maturity and the six attributes on a 0-5 scale]


PO9 Assess and Manage IT Risks

A risk management framework is created and maintained. The framework documents a common and agreed-upon level of IT risks, mitigation strategies and residual risks. Any potential impact on the goals of the organisation caused by an unplanned event is identified, analysed and assessed. Risk mitigation strategies are adopted to minimise residual risk to an accepted level. The result of the assessment is understandable to the stakeholders and expressed in financial terms, to enable stakeholders to align risk to an acceptable level of tolerance.

The control objectives within the process are:
• 9.1 IT risk management framework
• 9.2 Establishment of risk context
• 9.3 Event identification
• 9.4 Risk assessment
• 9.5 Risk response
• 9.6 Maintenance and monitoring of a risk action plan

The process of managing risk is a key aspect of CobiT and general governance. There were some enterprises that had systematically built risk identification, appraisal and management solutions, and some use CobiT as the framework around which IT and associated business risks are managed. Most enterprises that participated, however, have widely varying and relatively immature processes within this area. For some, this is related in part to immature and developing processes at the enterprise level.

However, there was considerable recognition of the need to improve performance on this process. Many enterprises had active programmes underway to build risk identification, assessment and management maturity. Amongst managers who have specific IT governance responsibility, there was a common theme that this was high on their ‘to do’ list.
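
The process description above calls for risks to be identified, assessed and expressed in financial terms. As an illustration only, the sketch below models a small risk register in which exposure is computed as annual likelihood multiplied by financial impact; the entries and figures are invented for the example.

    from dataclasses import dataclass

    @dataclass
    class Risk:
        name: str
        annual_likelihood: float  # probability of occurrence per year (0-1)
        impact: float             # estimated financial impact if it occurs

        @property
        def exposure(self) -> float:
            # Expected annual loss, so stakeholders can compare risks
            # against an acceptable level of tolerance in money terms.
            return self.annual_likelihood * self.impact

    register = [
        Risk("Data centre outage", 0.05, 2_000_000),
        Risk("Key supplier failure", 0.10, 400_000),
        Risk("Unauthorised data disclosure", 0.02, 5_000_000),
    ]

    for r in sorted(register, key=lambda r: r.exposure, reverse=True):
        print(f"{r.name:30s} exposure = {r.exposure:>10,.0f}")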

PO10 Manage Projects

A programme and project management framework for the management of all IT projects is established. The framework should ensure the correct prioritisation and co-ordination of all projects. The framework includes a master plan, assignment of resources, definition of deliverables, approval by users, a phased approach to delivery, quality assurance, a formal test plan, and testing and post-implementation review after installation to ensure project risk management and value delivery to the business. This approach reduces the risk of unexpected costs and project cancellations, improves communications to and involvement of business and end users, ensures the value and quality of project deliverables, and maximises their contribution to IT-enabled investment programmes.

The control objectives within the process are:
• 10.1 Programme management framework
• 10.2 Project management framework
• 10.3 Project management approach
• 10.4 Stakeholder commitment
• 10.5 Project scope statement
• 10.6 Project phase initiation
• 10.7 Integrated project plan
• 10.8 Project resources
• 10.9 Project risk management
• 10.10 Project quality plan
• 10.11 Project change control

[Figure 40—Process Maturity for PO9 Assess and Manage IT Risks: box plot of overall maturity and the six attributes on a 0-5 scale]


• 10.12 Project planning of assurance methods
• 10.13 Project performance measurement, reporting and monitoring
• 10.14 Project closure

Once again, this large and important process was broken into two subprocesses. The first, which is designated PO10PG Manage projects—Programme, covers those elements of PO10 that relate to the management of the investment programme, as distinct from the management of individual projects. This includes PO10.1, which states:

Maintain the programme of projects, related to the portfolio of IT-enabled investment programmes, by identifying, defining, evaluating, prioritising, selecting, initiating, managing and controlling projects. Ensure that the projects support the programme’s objectives. Co-ordinate the activities and interdependencies of multiple projects, manage the contribution of all the projects within the programme to expected outcomes, and resolve resource requirements and conflicts.

In addition, there are elements of PO10.2 through PO10.4. There is also a strong correlation between the PO10PG subprocess and ITGI’s Val IT framework. The remainder of PO10 was incorporated into a subprocess designated PO10PJ Manage projects—Projects.

A great deal of variation was observed within the PO10PG Manage projects—Programme subprocess. As shown in figure 41, the overall maturity level was only moderate, with the tail trending to lower levels of maturity. A variety of factors led to this result. In some IT organisations—particularly smaller ones—there was little perceived necessity to build sophisticated programme management processes, given the relatively small number of projects underway at any one time or within a year. For others, there was a clear recognition that much needed to be done. Some managers were very straightforward, openly acknowledging that there were few formal elements of a programme management process and that programme management was needed and on the priority list.

Responses were very different for PO10PJ Manage projects—Projects. Figure 42 shows higher levels of maturity than for the sibling subprocess and less diversity in the responses. Impressive clarity and responsiveness characterised the responses of many of the managers to the assessment of PO10PJ. Clearly, this is an area that has received a great deal of attention and investment in IT organisations in recent years. This positive approach strongly correlates with the interviews with the CIOs. They were asked, for example, whether they undertook post-project implementation reviews with the business, and 60 percent either agreed or strongly agreed. More than 60 percent also either agreed or strongly agreed with the statement ’We have staff (in IT) with business-facing responsibilities’.

[Figure 41—Process Maturity for PO10PG Manage Projects—Programme: box plot of overall maturity and the six attributes on a 0-5 scale]

[Figure 42—Process Maturity for PO10PJ Manage Projects—Projects: box plot of overall maturity and the six attributes on a 0-5 scale]


Acquire and Implement Domain

The AI domain covers all aspects of the development and acquisition cycle for hardware, software and services. This was a domain that managers had some difficulty understanding and for which they had difficulty clearly identifying process owners. It is also an area in which more interviews were conducted per process than was typical in other domains, indicating a lack of a clear match between CobiT processes and staff functions within the enterprise.

AI1 Identify Automated Solutions

The need for a new application or function requires analysis before acquisition or creation to ensure that business requirements are satisfied in an effective and efficient approach. This process covers the definition of the needs, consideration of alternative sources, review of technological and economic feasibility, execution of a risk analysis and cost-benefit analysis, and conclusion of a final decision to ‘make’ or ‘buy’. All these steps enable organisations to minimise the cost to acquire and implement solutions while ensuring that they enable the business to achieve its objectives.

The control objectives within the process are:
• 1.1 Definition and maintenance of business functional and technical requirements
• 1.2 Risk analysis report
• 1.3 Feasibility study and formulation of alternative courses of action
• 1.4 Requirements and feasibility decision and approval

The AI1 process is a key process at the commencement of the development and acquisition cycle. Figure 43 shows that there is a relatively high level of maturity across the process maturity attributes. Areas that give rise to lower levels of maturity include the systematic development of alternative courses of action, which seems to be a challenge for many IT organisations, and formalised development of risk analyses. The latter area has become more important in recent years.

AI2 Acquire and Maintain Application SoftwareApplications are made available in line with business requirements. This process covers the design of the applications, the proper inclusion of application controls and security requirements, and the actual development and configuration in line with standards. This allows organisations to properly support business operations with the correct automated applications.

The control objectives within the process are:
• 2.1 High-level design
• 2.2 Detailed design
• 2.3 Application control and auditability
• 2.4 Application security and availability
• 2.5 Configuration and implementation of acquired application software
• 2.6 Major upgrades to existing systems
• 2.7 Development of application software
• 2.8 Software quality assurance
• 2.9 Applications requirements management
• 2.10 Application software maintenance

[Figure 43—Process Maturity for AI1 Identify Automated Solutions: box plot of overall maturity and the six attributes on a 0-5 scale]


This process is large and complex. As seen in figure 44, maturity is relatively high, but there is significant variance across the process maturity attributes. Some enterprises have thoroughly developed methodologies and protocols that are embedded in systematic policies and procedures. This can be seen in the high level of maturity for the policies and procedures attribute, where the median is at 3 and the upper whisker is at 4. A similar level of maturity is seen for management’s awareness and communication. This is an interesting result: the typical pattern in this study is for respondents to score the policies attribute lower than the awareness attribute, and AI2 runs counter to that norm. In good measure, this is a result of the enterprises’ established methodologies and protocols.

There are also a number of enterprises that have relatively low levels of maturity on this process. These enterprises are typically smaller and do not have a strong history in development activities.

AI3 Acquire and Maintain Technology Infrastructure

Organisations should have processes for the acquisition, implementation and upgrade of the technology infrastructure. This requires a planned approach to the acquisition, maintenance and protection of infrastructure in line with agreed-upon technology strategies, and the provision of development and test environments. This ensures that there is ongoing technological support for business applications.

The control objectives within the process are:
• 3.1 Technological infrastructure acquisition plan
• 3.2 Infrastructure resource protection and availability
• 3.3 Infrastructure maintenance
• 3.4 Feasibility test environment

In comparison with AI2, AI3 is a relatively straightforward process. Figure 45 shows a process that is reasonably well under control but not necessarily managed at the highest levels of maturity. Most enterprises reported high maturity levels on some aspects of AI3, but few were able to demonstrate high levels of maturity on each of the detailed control objectives within the process.

[Figure 44—Process Maturity for AI2 Acquire and Maintain Application Software: box plot of overall maturity and the six attributes on a 0-5 scale]

[Figure 45—Process Maturity for AI3 Acquire and Maintain Technology Infrastructure: box plot of overall maturity and the six attributes on a 0-5 scale]


AI4 Enable Operation and Use

Knowledge about new systems is made available. This process requires the production of documentation and manuals for users and IT, and provides training to ensure the proper use and operation of applications and infrastructure.

The control objectives within the process are:
• 4.1 Planning for operational solutions
• 4.2 Knowledge transfer to business management
• 4.3 Knowledge transfer to end users
• 4.4 Knowledge transfer to operations and support staff

AI4 covers all the aspects of bringing new applications and technologies to the end-user community. This is clearly a key aspect of the acquisition, development and rollout of new solutions. The results for this process were surprising, and figure 46 shows a very interesting pattern indicating that the respondents were ‘all over the map’. Some organisations had an ongoing change programme with a flow of new or upgraded applications. These organisations, as might be expected, showed higher levels of maturity. The most mature organisation in this group had clearly built mature and reusable processes to handle the rollout and training process; however, some respondents stated that they had few formalised processes. A number of managers noted their organisation’s low level of maturity and recognised the need for improvement. Interestingly, some of the organisations that had well-established project management and development methodologies did not have matching processes to roll out the solutions.

AI5 Procure IT Resources

IT resources, including people, hardware, software and services, need to be procured. This requires the definition and enforcement of procurement procedures, the selection of vendors, the setup of contractual arrangements and the actual acquisition itself. Doing so ensures that the organisation has all required IT resources in a timely and cost-effective manner.

The control objectives within the process are:
• 5.1 Procurement control
• 5.2 Supplier contract management
• 5.3 Supplier selection
• 5.4 Resources acquisition

The AI5 process brings together most aspects of the procurement of software, hardware and services. Along with processes such as PO5 and PO7, AI5 is closely aligned with enterprise-wide management processes. Well-managed procurement processes at the enterprise level increase maturity of procurement processes within the IT function.

Figure 47 shows evidence of a process that respondents felt was reasonably well under control.

[Figure 46—Process Maturity for AI4 Enable Operation and Use: box plot of overall maturity and the six attributes on a 0-5 scale]


AI6 Manage Changes

All changes, including emergency maintenance and patches, relating to infrastructure and applications within the production environment are formally managed in a controlled manner. Changes (including those to procedures, processes, system and service parameters) must be logged, assessed and authorised prior to implementation and reviewed against planned outcomes following implementation. This assures mitigation of the risks of negatively impacting the stability or integrity of the production environment.

The control objectives within the process are:
• 6.1 Change standards and procedures
• 6.2 Impact assessment, prioritisation and authorisation
• 6.3 Emergency changes
• 6.4 Change status tracking and reporting
• 6.5 Change closure and documentation

AI6 is a key process in the management of internal controls, and it features strongly in responses to US Sarbanes-Oxley Act compliance requirements. Several of the enterprises in the study were affected by the Sarbanes-Oxley Act, whether directly, indirectly (because regulators in their industry or country imposed Sarbanes-Oxley-like compliance requirements) or through voluntary adoption of its requirements. Several of the respondents either had adopted or were in the process of adopting ITIL, or ITIL had influenced their own policy and procedure formulation. While there are many aspects to ITIL, change management has been one of the first elements adopted by enterprises.

Figure 48 also shows an interesting pattern. Some outliers have established good management practices over policies and procedures and tools and techniques. As the lower whiskers on these attributes indicate, others showed very little maturity on either of these dimensions. Regardless of the state of actual process improvement, there was a strong and consistent belief expressed by managers that this was a key process that was high on their radar. For those subject to the Sarbanes-Oxley Act or similar compliance requirements, AI6 was almost always strongly featured as a process that needed to be thoroughly managed. This was true whether or not the enterprise was formally adopting CobiT.

Automated tools can play an important role in the AI6 process. Several managers indicated that they had already acquired or would be acquiring automated change management solutions. Conversely, most of the enterprises in the Philippines and Mexico said they could not afford automated solutions. The better managed of these enterprises were using considerable levels of human resources to manage and control changes.
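
The AI6 narrative (changes logged, assessed and authorised prior to implementation, then reviewed) maps naturally onto a small state machine, which is broadly how automated change management tools model it. The sketch below is one illustrative reading of that flow, not any particular tool’s workflow.

    # Allowed transitions for a change record; "rejected" is terminal.
    ALLOWED = {
        "logged":      {"assessed", "rejected"},
        "assessed":    {"authorised", "rejected"},
        "authorised":  {"implemented"},
        "implemented": {"reviewed"},
    }

    class Change:
        def __init__(self, summary: str):
            self.summary = summary
            self.state = "logged"
            self.history = ["logged"]  # audit trail of every transition

        def advance(self, new_state: str) -> None:
            if new_state not in ALLOWED.get(self.state, set()):
                raise ValueError(f"cannot move from {self.state} to {new_state}")
            self.state = new_state
            self.history.append(new_state)

    c = Change("Apply security patch to billing server")
    for step in ("assessed", "authorised", "implemented", "reviewed"):
        c.advance(step)
    print(c.history)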

[Figure 47—Process Maturity for AI5 Procure IT Resources: box plot of overall maturity and the six attributes on a 0-5 scale]

[Figure 48—Process Maturity for AI6 Manage Changes: box plot of overall maturity and the six attributes on a 0-5 scale]


AI7 Install and Accredit Solutions and Changes

New systems need to be made operational once development is complete. This requires proper testing in a dedicated environment with relevant test data, definition of rollout and migration instructions, release planning and actual promotion to production, and a post-implementation review. This assures that operational systems are in line with the agreed-upon expectations and outcomes.

The control objectives within the process are:
• 7.1 Training
• 7.2 Test plan
• 7.3 Implementation plan
• 7.4 Test environment
• 7.5 System and data conversion
• 7.6 Testing of changes
• 7.7 Final acceptance test
• 7.8 Promotion to production
• 7.9 Post-implementation review

AI7 can be seen as closely aligned with AI6. Indeed, many managers repeated their AI6 responses for AI7, as they perceived the two to be aspects of the same IT process. Figure 49 shows this pattern, with a slightly lower maturity level for AI7 than for AI6. This reflects somewhat inconsistent approaches to the full set of requirements of the process. For example, some managers reported that their test environments were not all that they might wish them to be or that test plans differed widely across technology platforms.

Deliver and Support Domain

The DS domain is at the heart of the day-to-day work of the IT function. As might be expected, a full third of the CobiT processes are in this domain. In general, the level of maturity for processes within the DS domain was relatively high. There were, however, some key aspects of the domain that showed lower and very diverse levels.

DS1 Define and Manage Service Levels

Effective communication between IT management and business customers regarding services required is enabled by a documented definition of and agreement on IT services and service levels. This process also includes monitoring and timely reporting to stakeholders on the accomplishment of service levels. This process enables alignment between IT services and the related business requirements.

The control objectives within the process are:
• 1.1 Service level management framework
• 1.2 Definition of services
• 1.3 Service level agreements
• 1.4 Operating level agreements
• 1.5 Monitoring and reporting of service level achievements
• 1.6 Review of service level agreements and contracts

[Figure 49—Process Maturity for AI7 Install and Accredit Solutions and Changes: box plot of overall maturity and the six attributes on a 0-5 scale]


DS1 is another process where there is great diversity and only moderate levels of process maturity. Some enterprises, represented in the upper whiskers on the box plot in figure 50, have worked to build formalised service level agreements (SLAs) with the business, supported by operating level agreements (OLAs). Others have only embarked on the process, although there is much activity around it. Several respondents remarked that this is a difficult process that requires careful co-ordination with the business. The diversity of responses on the responsibility attribute shown in figure 50 is indicative of the issues involved in interacting with the consumers of IT services.
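
Detailed control objective DS1.5 concerns monitoring and reporting of service level achievements. As a minimal illustration of the arithmetic involved, the sketch below computes an attainment percentage against an assumed resolution-time target; both the target and the sample data are hypothetical.

    target_hours = 8.0  # assumed SLA target for incident resolution
    resolution_hours = [2.5, 7.0, 9.5, 4.0, 8.0, 12.0, 3.5]  # sample data

    within_target = sum(1 for h in resolution_hours if h <= target_hours)
    attainment = 100 * within_target / len(resolution_hours)
    print(f"SLA attainment: {attainment:.1f}% "
          f"({within_target}/{len(resolution_hours)} within {target_hours}h)")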

DS2 Manage Third-party Services

The need to assure that services provided by third parties (suppliers, vendors and partners) meet business requirements requires an effective third-party management process. This process is accomplished by clearly defining the roles, responsibilities and expectations in third-party agreements as well as by reviewing and monitoring such agreements for effectiveness and compliance. Effective management of third-party services minimises the business risk associated with non-performing suppliers.

The control objectives within the process are:
• 2.1 Identification of all supplier relationships
• 2.2 Supplier relationship management
• 2.3 Supplier risk management
• 2.4 Supplier performance monitoring

Although a few of the respondents had very high levels of outsourcing, all respondents had significant budgets for the supply of software, hardware and particularly services, so the management tasks set out in DS2 are relevant to all of the respondents in the study. In many ways, DS2 is the mirror image of DS1: it concerns managing the providers of services and solutions rather than the consumers. Figure 51 indicates that DS2 has a slightly higher overall level of maturity than DS1. As seen in the figure, and as noted in discussions with the respondents, there is a great deal of variation in process maturity in the areas covered by DS2. For some respondents, all aspects of DS2 were covered by systematic policies, procedures and monitoring methods; in some cases, in-house databases of supplier availability and performance had been developed. For others, management of suppliers and supplier relationships was handled in a largely ad hoc fashion. Some had coverage of some of the detailed control objectives but not of others. Commonly, DS2.3 Supplier risk management and DS2.4 Supplier performance monitoring were managed in a less sophisticated manner than the other control objectives.

[Figure 50—Process Maturity for DS1 Define and Manage Service Levels: box plot of overall maturity and the six attributes on a 0-5 scale]

[Figure 51—Process Maturity for DS2 Manage Third-party Services: box plot of overall maturity and the six attributes on a 0-5 scale]


DS3 Manage Performance and Capacity

The need to manage performance and capacity of IT resources requires a process to periodically review current performance and capacity of IT resources. This process includes forecasting future needs based on workload, storage and contingency requirements. This process provides assurance that information resources supporting business requirements are continually available.

The control objectives within the process are:
• 3.1 Performance and capacity planning
• 3.2 Current capacity and performance
• 3.3 Future capacity and performance
• 3.4 IT resources availability
• 3.5 Monitoring and reporting

The DS3 process is important to the enterprise’s ability to maintain performance and capacity into the future. Figure 52 shows that maturity for this process is not particularly high compared to other processes. The way the enterprises handle this process is interesting. For hardware and networking, several of the participating enterprises purchased sufficient capacity to meet foreseeable needs, which was feasible given the relatively low cost of computing and networking resources. A couple of the public-sector enterprises had outsourced the provision of hardware and networking to obviate the need to enter into complex procurement processes. Conversely, there was little evidence of systems in place to manage the forecasting of other IT resources.
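
Forecasting future needs based on workload, as DS3.3 requires, can be as simple as projecting a utilisation trend forward. The sketch below fits a least-squares line to 12 months of hypothetical utilisation figures and estimates when an assumed capacity ceiling would be crossed.

    months = list(range(1, 13))
    utilisation = [41, 43, 44, 47, 48, 50, 53, 54, 57, 58, 61, 63]  # percent

    # Ordinary least-squares fit of utilisation against month number.
    n = len(months)
    mean_x = sum(months) / n
    mean_y = sum(utilisation) / n
    slope = (sum((x - mean_x) * (y - mean_y)
                 for x, y in zip(months, utilisation))
             / sum((x - mean_x) ** 2 for x in months))
    intercept = mean_y - slope * mean_x

    threshold = 85.0  # assumed ceiling before more capacity is needed
    month_at_threshold = (threshold - intercept) / slope
    print(f"trend {slope:.2f}%/month; ~{threshold:.0f}% reached around "
          f"month {month_at_threshold:.0f}")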

DS4 Ensure Continuous Service

The need for providing continuous IT services requires developing, maintaining and testing IT continuity plans, utilising offsite backup storage and providing periodic continuity plan training. An effective continuous service process minimises the probability and impact of a major IT service interruption on key business functions and processes.

The control objectives within the process are:
• 4.1 IT continuity framework
• 4.2 IT continuity plans
• 4.3 Critical IT resources
• 4.4 Maintenance of the IT continuity plan
• 4.5 Testing of the IT continuity plan
• 4.6 IT continuity plan training
• 4.7 Distribution of the IT continuity plan
• 4.8 IT services recovery and resumption
• 4.9 Offsite backup storage
• 4.10 Post-resumption review

The results for DS4, shown in figure 53, were rather surprising and even worrying. The conventional expectation was that this process would be mature and well established. This is not borne out by the evidence in figure 53 or by the interactions with the respondents. There are 10 control objectives within the process. All respondents seemed to have a good handle on only a minority of these control objectives, and very few seemed to have achieved a high level of maturity on all of the elements of DS4. Attention was being given to improving performance of this process, but perhaps not at the same level of intensity as for other processes.

[Figure 52—Process Maturity for DS3 Manage Performance and Capacity: box plot of overall maturity and the six attributes on a 0-5 scale]


DS5 Ensure Systems Security

The need to maintain the integrity of information and protect IT assets requires a security management process. This process includes establishing and maintaining IT security roles and responsibilities, policies, standards, and procedures. Security management also includes performing security monitoring and periodic testing and implementing corrective actions for identified security weaknesses or incidents. Effective security management protects all IT assets to minimise the business impact of security vulnerabilities and incidents.

The control objectives within the process are:
• 5.1 Management of IT security
• 5.2 IT security plan
• 5.3 Identity management
• 5.4 User account management
• 5.5 Security testing, surveillance and monitoring
• 5.6 Security incident definition
• 5.7 Protection of security technology
• 5.8 Cryptographic key management
• 5.9 Malicious software prevention, detection and correction
• 5.10 Network security
• 5.11 Exchange of sensitive data

DS5 Ensure systems security is one of the most important processes within CobiT. It features prominently in any view of the Sarbanes-Oxley Act or other compliance. Similarly, most traversals from organisation (or business) goals to IT goals to IT processes will see DS5 ranked at the high or moderate level.13 As a result of this importance and due to the broad range of elements within DS5, the process was broken into four separate processes:
• DS5P Ensure systems security—Policy primarily encompasses DS5.1 and DS5.2.
• DS5U Ensure systems security—User access relates primarily to DS5.3 and DS5.4.
• DS5NF Ensure systems security—Network and firewall covers several control objectives, including DS5.7, DS5.8 and DS5.10.
• DS5V Ensure systems security—Virus relates primarily to DS5.9.

Taking DS5 as a whole, it is clear that considerable investment has been made in enhancing security over the last few years. Many respondents have made much progress in hardening all aspects of their security framework, yet there is also much variation amongst respondents and within the processes that make up the DS5 family. Many enterprises had built maturity at high levels across each of the processes. Further, there was much discussion of forward progress with several aspects of DS5. This includes both ‘soft’ and ‘hard’ aspects of security.

Figure 54 shows high levels of maturity for DS5NF Ensure systems security—Network and firewall. Note that the whiskers reach level 5 across all attributes except, unsurprisingly, goals and metrics. Yet the whiskers also go down to level 1 and below across the board. Again, this is partly a size issue, with larger enterprises demonstrating higher levels of maturity. It is also related to the level of national development: the mean overall maturity for enterprises in developing countries was 2.4, whereas it was slightly higher than 3 for those in developed countries.

[Figure 53—Process Maturity for DS4 Ensure Continuous Service: box plot of overall maturity and the six attributes on a 0-5 scale]

13 See appendix I of CobiT 4.1.


Figure 55 shows the detailed process maturity for DS5P Ensure systems security—Policy, which has moderate levels of maturity. Many respondents had worked hard on strengthening policy making and security awareness communication mechanisms. Most firms reported high levels of enterprise-wide attention to security, although, as seen in figure 55, there is a great deal of diversity. Respondents paid particular attention to the need to harden planning processes and to improve security certification of, and security awareness amongst, employees.

The level of maturity for DS5U Ensure systems security—User access (figure 56) was lower than for DS5NF Ensure systems security—Network and firewall. Managers reported difficulties in hardening this particular area of security, given the need to bring the enterprise along with the security and operations teams within the IT function.

Figure 57 shows that the level of maturity for DS5V Ensure systems security—Virus was high—indeed, one of the highest of any process in the study. Many respondents believed that they could do little to improve the level of maturity. This was not a universal reaction, of course, as can be seen from the lower whiskers in the figure.

[Figure 54—Process Maturity for DS5NF Ensure Systems Security—Network and Firewall: box plot of overall maturity and the six attributes on a 0-5 scale]

[Figure 55—Process Maturity for DS5P Ensure Systems Security—Policy: box plot of overall maturity and the six attributes on a 0-5 scale]

[Figure 56—Process Maturity for DS5U Ensure Systems Security—User Access: box plot of overall maturity and the six attributes on a 0-5 scale]


DS6 Identify and Allocate Costs

The need for a fair and equitable system of allocating IT costs to the business requires accurate measurement of IT costs and agreement with business users on fair allocation. This process includes building and operating a system to capture, allocate and report IT costs to the users of services. A fair system of allocation enables the business to make more informed decisions regarding use of IT services.

The control objectives within the process are:
• 6.1 Definition of services
• 6.2 IT accounting
• 6.3 Cost modelling and charging
• 6.4 Cost model maintenance

The survey produced very different, essentially bifurcated, responses for the DS6 Identify and allocate costs process. As shown in figure 58, there was wide dispersion on each attribute. Forty percent of enterprises used chargeback mechanisms, and these firms had built processes to support the allocation mechanism. Where chargeback was not used, maturity was generally lower, although there were exceptions to this rule. Overall maturity on this process was below 2 for those not using chargeback, compared with above 3 for enterprises that did use it.
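
For enterprises that did use chargeback, the underlying mechanics are often a straightforward usage-based allocation of a shared cost pool. The sketch below illustrates the arithmetic with a hypothetical cost pool, business units and usage driver (user counts); real cost models (DS6.3) are usually more elaborate.

    cost_pool = 1_200_000  # total shared IT cost to allocate
    usage = {"Sales": 350, "Operations": 500, "Finance": 150}  # user counts

    total_usage = sum(usage.values())
    for unit, driver in usage.items():
        share = cost_pool * driver / total_usage  # pro rata allocation
        print(f"{unit:10s} {driver:>4} users -> charged {share:>12,.2f}")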

DS7 Educate and Train Users

Effective education of all users of IT systems, including those within IT, requires identifying the training needs of each user group. In addition to identifying needs, this process includes defining and executing a strategy for effective training and measuring the results. An effective training programme increases the effective use of technology by reducing user errors, increasing productivity and increasing compliance with key controls, such as user security measures.

The control objectives within the process are:
• 7.1 Identification of education and training needs
• 7.2 Delivery of training and education
• 7.3 Evaluation of training received

[Figure 57—Process Maturity for DS5V Ensure Systems Security—Virus: box plot of overall maturity and the six attributes on a 0-5 scale]

[Figure 58—Process Maturity for DS6 Identify and Allocate Costs: box plot of overall maturity and the six attributes on a 0-5 scale]


DS7 Educate and train users is a straightforward process but one that is vital for the success of the IT function. Figure 59 shows moderate levels of maturity. In some cases, staff members from human resources and IT were interviewed to collect data on DS7. Several respondents noted that there were probably higher levels of success and maturity with the training and education of end users than for staff members within IT. Again, this was a process with tremendous disparity amongst respondents, and there were no obvious size or other demographic differences.

DS8 Manage Service Desk and Incidents

A timely and effective response to IT user queries and problems requires a well-designed and well-executed service desk and incident management process. This process includes setting up a service desk function with registration, incident escalation, trend and root cause analysis, and resolution. The business benefits include increased productivity through quick resolution of user queries. In addition, the business can address root causes (such as poor user training) through effective reporting.

The control objectives within the process are:
• 8.1 Service desk
• 8.2 Registration of customer queries
• 8.3 Incident escalation
• 8.4 Incident closure
• 8.5 Trend analysis

DS8 Manage service desk and incidents is an important component of service delivery to end users. The process is also a key element of ITIL and featured in the ITIL rollout amongst respondents whose enterprises were adopting that framework. Nonetheless, figure 60 shows a moderate level of process maturity for DS8. Many respondents reported having rolled out software applications to manage the service desk, as can be seen in the technology attribute in figure 60. The level of maturity for goals and metrics for this process is relatively high, a byproduct of service desk software adoption.
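
Incident escalation (DS8.3) is typically driven by rules that compare how long an incident has been open against a response window for its severity. The sketch below shows one such rule; the severity levels and windows are assumptions for illustration, not values from the study.

    # Assumed escalation windows, in hours, keyed by severity (1 = worst).
    ESCALATION_WINDOW_HOURS = {1: 1, 2: 4, 3: 24, 4: 72}

    def needs_escalation(severity: int, hours_open: float) -> bool:
        return hours_open > ESCALATION_WINDOW_HOURS[severity]

    print(needs_escalation(1, 2.0))   # True: sev-1 open beyond 1 hour
    print(needs_escalation(3, 10.0))  # False: within the 24-hour window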

[Figure 59—Process Maturity for DS7 Educate and Train Users: maturity levels (0–5) by attribute.]

[Figure 60—Process Maturity for DS8 Manage Service Desk and Incidents: maturity levels (0–5) by attribute.]


DS9 Manage the Configuration
Ensuring the integrity of hardware and software configurations requires the establishment and maintenance of an accurate and complete configuration repository. This process includes collecting initial configuration information, establishing baselines, verifying and auditing configuration information, and updating the configuration repository as needed. Effective configuration management facilitates greater system availability, minimises production issues and resolves issues more quickly.

The control objectives within the process are:
• 9.1 Configuration repository and baseline
• 9.2 Identification and maintenance of configuration items
• 9.3 Configuration integrity review

Given the importance of configuration management in ITIL and as part of security hardening, the results shown in figure 61 for DS9 Manage the configuration were particularly surprising. A number of enterprises reported ad hoc approaches to tracking configuration. For some, there was variation across technology lines. For example, configuration management for mainframe-based technologies was well developed, but configuration for UNIX and other client-server environments was not nearly as well developed. In general, configuration management of software was not as well developed as it was for hardware. Interestingly, there was little evidence of significant investment being made or forward movement being planned.

DS10 Manage Problems
Effective problem management requires the identification and classification of problems, root cause analysis and resolution of problems. The problem management process also includes identification of recommendations for improvement, maintenance of problem records and review of the status of corrective actions. An effective problem management process maximises system availability, improves service levels, reduces costs, and improves customer convenience and satisfaction.

The control objectives within the process are:
• 10.1 Identification and classification of problems
• 10.2 Problem tracking and resolution
• 10.3 Problem closure
• 10.4 Integration of configuration, incident and problem management

DS10 Manage problems is closely related to DS9 as well as to service delivery and quality management in general. Figure 62 shows similar levels of maturity as for DS9. Clearly, this process is also benefiting from investment in supporting technology for service desks and the like. However, respondents reported a great deal more variation than for DS9. Some respondents discussed the difficulty in maintaining consistent approaches to problem management amongst different project management and development teams or across technology lines. Investment in DS10 also seems to be following improvements in DS9.

DS11 Manage Data
Effective data management requires identifying data requirements. The data management process also includes the establishment of effective procedures to manage the media library, backup and recovery of data, and proper disposal of media. Effective data management helps ensure the quality, timeliness and availability of business data.

[Figure 61—Process Maturity for DS9 Manage the Configuration: maturity levels (0–5) by attribute.]


The control objectives within the process are:
• 11.1 Business requirements for data management
• 11.2 Storage and retention arrangements
• 11.3 Media library management system
• 11.4 Disposal
• 11.5 Backup and restoration
• 11.6 Security requirements for data management

The last three processes within the DS domain are relatively mature and well managed. Figure 63 shows the process maturity levels for DS11 Manage data. It shows high levels of maturity, with strong support from established policies and procedures and appropriate technology foundations.

DS12 Manage the Physical Environment
Protection for computer equipment and personnel requires well-designed and well-managed physical facilities. The process of managing the physical environment includes defining the physical site requirements, selecting appropriate facilities, and designing effective processes for monitoring environmental factors and managing physical access. Effective management of the physical environment reduces business interruptions from damage to computer equipment and personnel.

The control objectives within the process are:
• 12.1 Site selection and layout
• 12.2 Physical security measures
• 12.3 Physical access
• 12.4 Protection against environmental factors
• 12.5 Physical facilities management

Figure 64 shows the maturity levels for DS12 Manage the physical environment. As might be expected, this is at an even higher level of maturity than DS11. This is one of the most developed processes, with relatively little dispersion. Clearly, a great deal of effort has gone into this process over many years.

[Figure 62—Process Maturity for DS10 Manage Problems: maturity levels (0–5) by attribute.]

[Figure 63—Process Maturity for DS11 Manage Data: maturity levels (0–5) by attribute.]


DS13 Manage Operations
Complete and accurate processing of data requires effective management of data processing procedures and diligent maintenance of hardware. This process includes defining operating policies and procedures for effective management of scheduled processing, protecting sensitive output, monitoring infrastructure and ensuring preventive maintenance of hardware. Effective operations management helps maintain data integrity and reduces business delays and IT operating costs.

The control objectives within the process are:
• 13.1 Operations procedures and instructions
• 13.2 Job scheduling
• 13.3 IT infrastructure monitoring
• 13.4 Sensitive documents and output devices
• 13.5 Preventive maintenance for hardware

DS13 Manage operations was not at quite the same level of maturity as DS11 and DS12, although, as figure 65 demonstrates, the maturity level is still reasonable. The main variation found was between technologies. Mainframe operations were well established and highly mature, but the day-to-day management of operations for client-server and newer technologies was not at nearly the same level.

Monitor and Evaluate Domain

As reported in chapter 2, the overall maturity for processes in this domain is relatively low. There was a general acceptance by managers of the need to make significant improvements in this domain.

ME1 Monitor and Evaluate IT Performance
Effective IT performance management requires a monitoring process. This process includes defining relevant performance indicators, systematic and timely reporting of performance, and acting promptly upon deviations. Monitoring is needed to make sure that the right things are done and are in line with the set directions and policies.

[Figure 64—Process Maturity for DS12 Manage the Physical Environment: maturity levels (0–5) by attribute.]

[Figure 65—Process Maturity for DS13 Manage Operations: maturity levels (0–5) by attribute.]


The control objectives within the process are:
• 1.1 Monitoring approach
• 1.2 Definition and collection of monitoring data
• 1.3 Monitoring method
• 1.4 Performance assessment
• 1.5 Board and executive reporting
• 1.6 Remedial actions

Figure 66 shows that the level of maturity for ME1 Monitor and evaluate IT performance was rather low. Moderately high levels of managerial awareness were not matched by support in appropriate policies and procedures, tools or staffing. There were exceptions: a limited number of enterprises were using techniques such as the balanced scorecard as a foundation for ME1. These, however, were exceptions amid otherwise seemingly ad hoc approaches to developing monitoring capabilities.

ME2 Monitor and Evaluate Internal Control
Establishing an effective internal control programme for IT requires a well-defined monitoring process. This process includes the monitoring and reporting of control exceptions, results of self-assessments and third-party reviews. A key benefit of internal control monitoring is to provide assurance regarding effective and efficient operations and compliance with applicable laws and regulations.

The control objectives within the process are:
• 2.1 Monitoring of internal control framework
• 2.2 Supervisory review
• 2.3 Control exceptions
• 2.4 Control self-assessment
• 2.5 Assurance of internal control
• 2.6 Internal control at third parties
• 2.7 Remedial actions

Figure 67 shows fairly low levels of maturity for ME2 Monitor and evaluate internal control. Several of the enterprises reported much more active IT audit activities by internal auditors than there had been in previous years and, in some enterprises, data were collected from internal audit as well as from the IT function. Yet, there seemed to be generally low levels of maturity for each of the aspects of ME2.

ME3 Ensure Compliance With External Requirements
Effective oversight of compliance requires the establishment of a review process to ensure compliance with laws, regulations and contractual requirements. This process includes identifying compliance requirements, optimising and evaluating the response, obtaining assurance that the requirements have been complied with and, finally, integrating IT’s compliance reporting with the rest of the business.

[Figure 66—Process Maturity for ME1 Monitor and Evaluate IT Performance: maturity levels (0–5) by attribute.]


The control objectives within the process are:
• 3.1 Identification of external legal, regulatory and contractual compliance requirements
• 3.2 Optimisation of response to external requirements
• 3.3 Evaluation of compliance with external requirements
• 3.4 Positive assurance of compliance
• 3.5 Integrated reporting

The picture for ME3 Ensure compliance with external requirements is a little different than it is for ME1 and ME2. Figure 68 shows similar overall maturity, but a great deal more variation. Larger enterprises in highly regulated industries, such as utilities, healthcare and financial services, tended to be more organised than other respondents. Some relied on compliance officers or internal audit for environmental scanning and updates. In a very few cases, there was a separate function within IT to monitor IT-specific compliance issues. In contrast, several other respondents reported rather ad hoc and highly limited approaches to monitoring and reporting compliance requirements.

ME4 Provide IT Governance
Establishing an effective governance framework includes defining organisational structures, processes, leadership, roles and responsibilities to ensure that enterprise IT investments are aligned and delivered in accordance with enterprise strategies and objectives.

The control objectives within the process are:
• 4.1 Establishment of an IT governance framework
• 4.2 Strategic alignment
• 4.3 Value delivery
• 4.4 Resource management
• 4.5 Risk management
• 4.6 Performance measurement
• 4.7 Independent assurance

[Figure 67—Process Maturity for ME2 Monitor and Evaluate Internal Control: maturity levels (0–5) by attribute.]

[Figure 68—Process Maturity for ME3 Ensure Compliance With External Requirements: maturity levels (0–5) by attribute.]


[Figure 70—Overall Performance: scatter of average rank (0 to 50) against variation in ranks, measured by standard deviation (0 to 20), with four labelled groups: A (lower performers), B (inconsistent moderate performers), C (moderate performers) and D (higher performers).]

ME4 Provide IT governance can be seen as a wraparound for the whole process of ensuring governance over IT. Figure 69 shows that, once again, the overall maturity was low. The lower whiskers on each of the attributes are all at a level lower than 1 and the upper whiskers are at a level greater than 4. This indicates that, at least for this group of respondents, managers see the level of IT governance as being immature and in need of significant improvement. Yet, there is much improvement on the horizon. Many enterprises report appointment of managers in IT governance positions and several such managers participated in this study. Systematic development of IT governance is clearly on the agenda for many of the enterprises studied, but managers reported that much needs to be done to improve this process.

Overall Enterprise Performance

The preceding discussion provided a process-by-process analysis of the various internal and external influences on maturity levels. Following is an analysis of the overall maturity level of the individual enterprises within the overall pool. The analysis focuses on two dimensions of overall performance: relative maturity against peers and consistency in maturity. First, each enterprise’s maturity for each process was ranked from 1 to 51. This analysis ignores the relative maturity of the various processes. Second, the average ranks across all of the processes were calculated. At the same time, the variation14 in ranks was calculated: did an enterprise have a consistent position within the ranking, or did it rank higher on some processes and lower on others? The results of this analysis are shown in figure 70.
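Because the ranking procedure just described is purely mechanical, it can be sketched in a few lines. The Python fragment below is illustrative only: the three-enterprise data are invented, not the study’s, and pandas is assumed. It ranks enterprises within each process and then computes the two quantities plotted in figure 70, the average rank and the standard deviation of ranks.

```python
import pandas as pd

# Invented maturity scores: one row per enterprise, one column per process.
# The study itself used 51 enterprises and 41 processes on a 0-5 scale.
maturity = pd.DataFrame(
    {"PO1": [2.0, 3.5, 1.0], "DS5P": [3.0, 4.0, 2.5], "ME4": [1.5, 3.0, 1.0]},
    index=["ent_A", "ent_B", "ent_C"],
)

# Rank enterprises within each process (1 = lowest maturity). This ignores
# the absolute maturity of the process itself, as described in the text.
ranks = maturity.rank(axis=0)

summary = pd.DataFrame({
    "avg_rank": ranks.mean(axis=1),  # relative maturity against peers
    "rank_sd": ranks.std(axis=1),    # consistency in maturity (footnote 14)
})
print(summary)  # plotting avg_rank vs. rank_sd gives a figure-70-style view
```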

[Figure 69—Process Maturity for ME4 Provide IT Governance: maturity levels (0–5) by attribute.]

14 Variation is measured by standard deviation.


Figure 70 demonstrates an interesting outcome. There are four distinct groups. Groups A and D are mirror images of each other. The eight enterprises in group D are the high-maturity enterprises. The outlier at the very top of the ranking has an average ranking of 45. Recall that there are 51 enterprises, so this enterprise, a large service provider, is the highest-ranked enterprise for 9 out of 10 processes. This is not solely a function of size. The next-highest-ranked enterprise is a financial services enterprise (mean ranking of 42.3) that was a medium-sized enterprise within the study. Conversely, the five enterprises in group A are the enterprises with consistently lower maturity. These enterprises are smaller (up to 80 personnel) and are also in the service sector. Each of these enterprises faces considerable challenges in improving maturity and prioritising IT governance initiatives.

The bulk of the enterprises are in group C. These are ranked in the middle without a great deal of variation in their rankings. Group B is a particularly interesting group, with average rankings in maturity but, in contrast to group C, considerable variation in those rankings. The enterprise with the highest level of variation (enclosed by a dotted line in group B) has 180 staff members and is in the capital-intensive industry group. It is either the lowest-ranked or second-lowest-ranked enterprise for all of the processes in the ME domain. Probably not coincidentally, it is the third-lowest-ranked enterprise for PO8 Manage quality. Conversely, it is within the five highest-ranked enterprises for no fewer than six processes and ranked between sixth and tenth for another six processes. This is an enterprise that can demonstrate relatively high levels of maturity for a significant number of processes, but has very low relative maturity on an important group of processes.

Chapter Summary and Concluding Comments

In this chapter, maturity for each of the 41 processes investigated was reviewed with the managers in the study. Several of the outcomes were predictable. For example, the strong results for the various components of DS5 Ensure systems security, DS11 Manage data and DS12 Manage the physical environment were not a surprise. Similarly, the low level of maturity for each of the processes in the Monitor and Evaluate (ME) domain was in line with previous experience with implementation of IT governance. The levels of maturity for some other processes were a surprise. For example, the level of maturity for PO2 Define the information architecture was lower than expected. Similarly, maturity for DS4 Ensure continuous service was unexpectedly low given the centrality of this process to overall enterprise reliability in case of environmental disruptions.

The mean maturity levels for the enterprises in the study were also reviewed. It was found that some enterprises consistently excelled across the complete range of processes. Conversely, others were consistently lower in their relative maturity levels for most of the processes. These enterprises need to take broad IT governance initiatives to meet the maturity levels of their peers. Within the larger group of those with middling levels of overall maturity, there were some enterprises that demonstrated relatively high maturity levels for some processes but much lower levels for others.

Taken as a whole, the analysis in this chapter provides a snapshot of the state of play for the maturity of the processes that make up the bulk of activity within the scope of the IT organisation. For many of the processes, there is significant diversity amongst the enterprises in the sample. Even for processes that are relatively mature on average, there are enterprises that are not nearly so mature. Some processes encompass the complete spectrum, with enterprise maturity ranging from level 1 or even lower up to the highest levels.


5. Associating IT Governance With Process Maturity

Up to this point in the report, the research results have been presented generally for the 51 enterprises as one group. This chapter reports on a statistical analysis of the data to determine if the maturity results varied by the different characteristics of the enterprises:
• Geographic location
• Level of national development
• Industry
• Size of IT organisations
• IT spending as a percentage of revenue
• Alignment of business and IT goals
• Level of outsourcing
• IT governance structure

Figure 71 is a heat map that indicates which processes had statistically different maturity levels (as indicated by the red boxes) for each of the variables listed above. Figure 72 is a heat map that indicates which attributes had statistically different levels (as indicated by the red boxes) for each of the variables.15 An initial look at the heat maps indicates some interesting findings that will be discussed in more detail in this section:
• The degree of alignment between enterprise and IT goals had the greatest discriminating power, with 37 processes being statistically different for different levels of alignment.
• Industry, outsourcing and IT spending had the least discriminating power.
• For attributes, alignment, size and location had the greatest discriminating power.
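To make the testing behind the heat maps concrete, the sketch below runs a one-way ANOVA (one of the two methods named in footnote 15) on invented maturity scores for a single process, split by one demographic variable, and flags the process when p < .10. SciPy is assumed; none of the numbers are from the study.

```python
from scipy import stats

# Invented maturity scores for one process, split by a demographic variable
# (here, low vs. high business/IT alignment).
low_alignment = [1.5, 2.0, 1.0, 2.5, 1.8]
high_alignment = [2.8, 3.2, 2.5, 3.5, 3.0]

# One-way ANOVA; the study also used ordered probit analysis for some tests.
f_stat, p_value = stats.f_oneway(low_alignment, high_alignment)

# The report's cutoff for statistical significance was a p-value of .10.
if p_value < 0.10:
    print(f"statistically different (p = {p_value:.3f}): red cell in heat map")
else:
    print(f"not statistically different (p = {p_value:.3f})")
```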

[Figure 71—Heat Map of Statistical Significance for 41 Processes: rows are the 41 processes (PO1 through ME4); columns are Location, Development, Industry, Size, Spending, Alignment, Outsourcing and Structure; red cells mark processes whose maturity differed statistically for that variable.]

15 For the statistical analysis, ordered probit analysis or ANOVA was used. The cutoff for designating differences as statistically significant was a p-value of .10.


Geographic Location

The 51 enterprises that participated in this study were located in the following countries or regions:
• Austria/Germany/Switzerland (AGS)
• Canada (CN)
• Mexico (MX)
• Philippines (PH)
• Singapore (SG)
• USA

As the location column in figure 71 indicates, the maturity levels were statistically different for 11 of the 41 processes. Figure 73 lists those processes and the average process maturity levels for the six countries or regions. In general, the averages were highest (bold) for enterprises in Canada and Singapore and lowest (italics) for enterprises in Mexico and the Philippines. The widest difference was for DS6 Identify and allocate costs; Canadian enterprises averaged 3.9 and Mexican enterprises averaged 1.2.


[Figure 72—Heat Map of Statistical Significance for Six Attributes: rows are the attributes (Awareness, Policies, Tools, Skills, Responsibility, Goals and Overall); columns are the same eight variables as in figure 71; red cells mark attributes whose levels differed statistically.]


The finding that only approximately 25 percent of the process levels were statistically different amongst countries was a little surprising considering the cultural, business and regulatory environmental differences amongst these locations. There are several possible reasons for the general lack of differences, including:
• IT organisations in more developed countries have reached a plateau. In the intervening years, IT operations in emerging countries have caught up and are reaching a similar plateau, so there are fewer differences amongst the enterprises due to country-specific characteristics.
• Many hardware and software brands (e.g., IBM, SAP, Microsoft, Intel, Dell) know no borders and are universal, de facto standards, so newer IT organisations have less of a learning curve to implement technology. In addition, large IT organisations have access to the same professional organisations (e.g., ISACA), IT literature and consultants to accelerate the implementation of good practices.
• Because the number of enterprises from each country was relatively small and not taken from random samples, the sample from each country may not be a true representation of process maturity levels for that country’s IT organisations.

As figure 72 indicates, all of the attributes except skills were statistically different when analysed by location. Figure 74 lists the statistically significant attributes and the average process maturity levels for the six locations. In general, as expected based on the prior discussions, the averages were highest (bold) for Canadian and Singaporean enterprises and lowest (italics) for Mexican and Philippine enterprises. The biggest difference was for the technology attribute; Canadian enterprises averaged 2.9 and Mexican enterprises averaged 1.5.

To take a different look at the location variable, the countries were classified into two broad categories: emerging (Mexico and Philippines) and developed (the other countries). As figure 71 shows, for this dichotomy, the values for 13 processes were statistically different. Figure 75 lists the statistically significant processes and the average process maturity levels for emerging and developed countries. The averages were always higher for developed countries. The biggest difference was for DS6 Identify and allocate costs where enterprises in developed countries averaged 2.8 and the average for emerging countries was 1.8.

Figure 73—Statistically Significant Processes by Geographic Location
Processes (maturity means by location): AGS CN MX PH SG USA

AI1 Identify automated solutions. 2.4 3.7 1.6 2.2 2.9 2.5

AI5 Procure IT resources. 2.8 3.5 2.3 3.1 3.7 2.6

AI6 Manage changes. 2.3 3.1 2.1 2.5 3.5 2.7

DS5NF Ensure systems security—Network and firewall. 2.4 3.8 2.2 2.4 3.5 3.2

DS5P Ensure systems security—Policy. 2.5 3.5 2.3 2.0 3.1 2.9

DS5U Ensure systems security—User access. 2.7 3.5 1.9 2.3 3.3 3.0

DS6 Identify and allocate costs. 3.2 3.9 1.2 2.3 2.7 2.4

DS10 Manage problems. 2.0 2.3 1.9 2.6 3.7 2.7

ME2 Monitor and evaluate internal control. 1.3 3.0 1.5 1.9 2.7 2.6

ME3 Ensure compliance with external requirements. 1.4 3.4 1.1 2.3 2.7 2.4

ME4 Provide IT governance. 1.7 3.6 1.5 2.1 2.6 2.4

Figure 74—Statistically Significant Attributes by Geographic Location
Attributes (maturity means by location): AGS CN MX PH SG USA

Awareness 3.1 3.3 2.2 2.7 3.4 2.9

Policies 2.4 3.2 2.0 2.4 3.2 2.7

Technology 2.3 2.9 1.5 1.8 2.7 2.4

Responsibility 2.7 3.3 2.2 2.6 3.4 2.8

Goals 1.9 2.9 1.6 2.0 2.4 2.4


Figure 72 shows that all of the attributes were statistically different when comparing different levels of national development. Figure 76 lists the statistically significant attributes and the average process maturity levels for emerging and developed countries. As expected based on the prior discussions, the averages were higher for enterprises in developed countries. The biggest difference was for the technology attribute; enterprises in developed countries averaged a maturity level of 2.5 and enterprises in emerging countries averaged a level of 1.7.

Industry Classification

The survey respondents were eager to know how they compared to other enterprises. To compare enterprises by industry, the enterprises were grouped into the following broad industry classifications:
• Capital-intensive industries, other than utilities (Cap)
• Utilities (Util)
• Service industries (Srv)
• Financial institutions (Fin)
• Government and non-profits (Govt)

As the industry column in figure 71 indicates, only ME3 Ensure compliance with external requirements and ME4 Provide IT governance were significantly different. In other words, even though the means may have been different for the other 39 processes amongst industry groups, they were not different enough to be statistically significant. This result was surprising because certain processes are more critical in some industries, given the characteristics of those industries. The different level and nature of government regulations in some industries would also be expected to impact IT processes. However, the statistical results imply that differences in process maturity amongst enterprises in this study are not correlated with differences in industry characteristics. In other words, any systematic differences amongst enterprises are more strongly due to variables other than industry classification.

Figure 75—Development Level of Statistically Significant Processes

Processes Emerging Developed

PO4O Define the IT processes, organisation and relationships—Organisation. 1.7 2.5

PO6 Communicate management aims and direction. 1.8 2.6

PO7 Manage IT human resources. 2.4 2.9

PO10PJ Manage projects—Projects. 2.3 3.1

AI1 Identify automated solutions. 2.0 2.7

AI2 Acquire and maintain application software. 2.5 3.0

DS5NF Ensure systems security—Network and firewall. 2.3 3.0

DS5P Ensure systems security—Policy. 2.1 2.8

DS5U Ensure systems security—User access. 2.2 3.0

DS5V Ensure systems security—Virus. 2.6 3.2

DS6 Identify and allocate costs. 1.8 2.8

DS8 Manage service desk and incidents. 2.5 3.1

DS12 Manage the physical environment. 2.6 3.2

Figure 76—Development Level of Statistically Significant Attributes

Attributes Emerging Developed

Awareness 2.5 3.0

Policies 2.3 2.7

Technology 1.7 2.5

Skills 2.2 2.6

Responsibility 2.4 2.9

Goals 1.9 2.3

Figure 77 shows the numbers for ME3 Ensure compliance with external requirements and ME4 Provide IT governance. Financial institutions had the highest maturity levels and capital-intensive enterprises had the lowest.

When looking at the specific attributes in figure 72, awareness, policies and skills were significantly different amongst industries. Figure 78 shows the numbers for those attributes. Generally, utilities had the highest maturity levels for those attributes and capital-intensive enterprises had the lowest, but the gaps between the highest and lowest numbers were not as extreme as some of the other comparisons in this section of the report.

Size of IT Organisations

As shown earlier in this report, three characteristics were collected that reflect the size of IT operations. Through factor analysis, those size variables were aggregated to create one size measure. Then, the 51 enterprises were classified as low or high based on whether their size was below or above the median size measure. As figure 71 shows, for this dichotomy, the values for 23 processes were statistically different. Figure 79 shows the means for those 23 processes. As would be expected, larger enterprises had higher process maturity levels. None of the means for the smaller enterprises broke above 3.0, whereas 12 of the means were 3.0 or higher for larger enterprises. Larger enterprises probably have longer histories than smaller enterprises, in general, and probably have more capital to invest in fine-tuning their IT processes.
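The report does not name the software used for the factor analysis, so the following is only a sketch of the aggregate-then-split logic: the three size indicators are assumptions (the report says only that three size characteristics were collected), and scikit-learn’s FactorAnalysis stands in for whatever tool was actually applied.

```python
import numpy as np
from sklearn.decomposition import FactorAnalysis

# Assumed size indicators per enterprise: IT head count, server count,
# supported users. The actual three characteristics are not specified here.
X = np.array([
    [ 12,   40,   300],
    [ 80,  250,  2000],
    [  5,   10,   100],
    [200, 1000, 12000],
], dtype=float)

# Standardise the indicators, then extract one factor as the size measure.
X_std = (X - X.mean(axis=0)) / X.std(axis=0)
fa = FactorAnalysis(n_components=1, random_state=0)
size_score = fa.fit_transform(X_std).ravel()

# Classify each enterprise as low or high via a median split, as in the text.
labels = np.where(size_score >= np.median(size_score), "high", "low")
print(list(zip(size_score.round(2), labels)))
```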


Figure 77—Statistically Significant Processes by Industry
Processes (maturity means by industry): Cap Util Srv Fin Govt

ME3 Ensure compliance with external requirements. 1.5 2.0 2.4 2.8 1.9

ME4 Provide IT governance. 1.4 2.5 2.2 2.6 2.3

Figure 78—Statistically Significant Attributes by Industry
Attributes (maturity means by industry): Cap Util Srv Fin Govt

Awareness 2.6 3.4 3.1 3.0 2.8

Policies 2.3 3.1 2.6 2.7 2.5

Skills 2.2 3.0 2.5 2.6 2.4

Figure 79—Statistically Significant Processes by IT Size
Processes (maturity means by IT size): Low High
PO1 Define a strategic IT plan. 2.1 2.6
PO2A Define the information architecture—Architecture. 1.9 2.4
PO2D Define the information architecture—Data classification. 1.7 2.3
PO3 Determine technological direction. 2.0 2.7
PO4O Define the IT processes, organisation and relationships—Organisation. 1.9 2.7
PO4P Define the IT processes, organisation and relationships—Processes. 2.3 2.8
PO5B Manage the IT investment—Budgeting. 2.7 3.3
PO6 Communicate management aims and direction. 2.1 2.7
PO7 Manage IT human resources. 2.5 3.1
PO10PG Manage projects—Programme. 2.1 2.8
AI2 Acquire and maintain application software. 2.6 3.1
AI3 Acquire and maintain technology infrastructure. 2.4 2.9
AI4 Enable operation and use. 2.1 2.7
DS3 Manage performance and capacity. 2.1 2.6
DS4 Ensure continuous service. 2.0 3.0
DS5NF Ensure systems security—Network and firewall. 2.5 3.2
DS5P Ensure systems security—Policy. 2.3 3.0
DS5U Ensure systems security—User access. 2.4 3.1
DS5V Ensure systems security—Virus. 2.6 3.5
DS8 Manage service desk and incidents. 2.6 3.3
DS11 Manage data. 2.6 3.2
DS12 Manage the physical environment. 2.7 3.4
DS13 Manage operations. 2.6 3.0


Figure 72 shows that all the attributes were statistically different, and figure 80 shows the means for those attributes. As figure 80 shows, the means for larger organisations were 20 to 30 percent higher than the means for smaller organisations.

IT Spending as a Percentage of Revenue

As a loose surrogate for the relative scope of IT operations in enterprises, data were collected on IT spending as a percentage of the enterprise’s revenue. Based on this number, enterprises were classified as low or high depending on whether the enterprise was below or above the median percentage. As figure 71 shows, the values for four processes were statistically different. As figure 81 shows, the means were always higher for enterprises with the lower percentages. At first this may seem counterintuitive, but IT spending is measured relative to revenue—not in absolute terms. That is, a low percentage does not necessarily indicate a smaller enterprise and a high percentage does not necessarily indicate a larger enterprise. In fact, the opposite (or at least a mix of relationships of percentages and size) might be true. Because IT expenditures typically have a high fixed-cost component, smaller enterprises would have higher percentages than larger enterprises to cover the relatively higher fixed costs.

As with the industry variable, these results were also surprising. Intuitively, one might expect a stronger relationship between IT spending and process maturity. Part of the issue may be that percentages, not absolute expenditures, were collected, so the study did not test whether there is a relationship between total monies spent and maturity. In addition, as discussed previously, for some enterprises percentages could be moving in the opposite direction of absolute spending. This would be an interesting question to address in future research. Figure 72 shows that none of the attributes was statistically different for different levels of IT spending as a percentage of revenue.

Alignment of Business and IT Goals

One of the major themes in CobiT is the importance of aligning IT goals with business goals and aligning the work of the IT function with the business. A series of questions was posed to the CIOs regarding various aspects of alignment. Answers were combined via factor analysis and, based on those results, enterprises were classified as having low alignment or high alignment. As figure 71 shows, the values for 37 processes were statistically different. Figure 82 shows that the means were always higher when the alignment between the business and IT was higher. For five of the processes (PO8, DS1, DS4, DS7 and ME4), the difference between the means was one level or more.

Figure 80—Statistically Significant Attributes by IT Size
Attributes (maturity means by IT size): Low High

Awareness 2.6 3.2

Policies 2.3 2.9

Technology 2.0 2.5

Skills 2.3 2.7

Responsibility 2.5 3.0

Goals 1.9 2.5

Figure 81—Statistically Significant Processes by IT Spending
Processes (maturity means by IT spending as a percentage of revenue): Low High

PO5V Manage the IT investment—Value management. 2.4 1.7

PO10PG Manage projects—Programme. 2.8 1.9

DS5NF Ensure systems security—Network and firewall. 3.5 2.8

DS5P Ensure systems security—Policy. 3.2 2.6


Figure 82—Statistically Significant Processes by Business and IT Alignment
Processes (maturity means by level of alignment): Low High

PO1 Define a strategic IT plan. 2.1 2.6

PO2A Define the information architecture—Architecture. 1.8 2.5

PO2D Define the information architecture—Data classification. 1.6 2.4

PO3 Determine technological direction. 1.9 2.7

PO4O Define the IT processes—Organisation. 1.9 2.7

PO4P Define the IT processes—Processes. 2.2 2.9

PO5V Manage the IT investment—Value management. 1.5 2.2

PO6 Communicate management aims and direction. 2.0 2.8

PO7 Manage IT human resources. 2.4 3.1

PO8 Manage quality. 1.6 2.7

PO10PJ Manage projects—Projects. 2.6 3.2

AI1 Identify automated solutions. 2.1 2.8

AI2 Acquire and maintain application software. 2.5 3.2

AI3 Acquire and maintain technology infrastructure. 2.3 3.0

AI4 Enable operation and use. 2.1 2.7

AI5 Procure IT resources. 2.5 3.2

AI6 Manage changes. 2.2 3.1

AI7 Install and accredit solutions and changes. 2.1 3.0

DS1 Define and manage service levels. 1.6 2.6

DS2 Manage third-party services. 2.0 2.8

DS3 Manage performance and capacity. 2.0 2.8

DS4 Ensure continuous service. 1.9 3.1

DS5NF Ensure systems security—Network and firewall. 2.5 3.2

DS5P Ensure systems security—Policy. 2.3 3.0

DS5U Ensure systems security—User access. 2.4 3.1

DS5V Ensure systems security—Virus. 2.7 3.4

DS7 Educate and train users. 2.0 3.0

DS8 Manage service desk and incidents. 2.7 3.2

DS9 Manage the configuration. 2.0 2.7

DS10 Manage problems. 2.0 2.9

DS11 Manage data. 2.4 3.3

DS12 Manage the physical environment. 2.8 3.3

DS13 Manage operations. 2.5 3.0

ME1 Monitor and evaluate IT performance. 1.7 2.5

ME2 Monitor and evaluate internal control. 1.7 2.5

ME3 Ensure compliance with external requirements. 1.7 2.5

ME4 Provide IT governance. 1.7 2.7


Figure 72 shows that all of the attributes were statistically different, and figure 83 shows that the means were always higher for the organisations with high business/IT alignment. In terms of percentage differences, the goals attribute for the high-alignment group was 46 percent higher than it was for the low-alignment group.

Level of Outsourcing

The organisations were evenly divided into two groups based on their relative level (low vs. high) of outsourcing. As figure 71 shows, only the values for three processes were statistically different. Figure 84 shows that the means for those three processes were mixed, with PO5B and DS6 (both of which relate to IT budgeting and costs) more mature for low-level outsourcers and AI3 more mature for high-level outsourcers.

As figure 72 indicates, none of the attributes was statistically different for low-level and high-level outsourcers.

IT Governance Structure

Based on the information provided by CIOs, IT governance structures were classified as centralised, decentralised and federal. As shown in figure 71, four processes were statistically different when considering the organisation of the IT function. As figure 85 shows, except for DS12, the means were higher for enterprises with decentralised IT. Enterprises with federal structures had the lowest means for the four processes.

Figure 83—Statistically Significant Attributes by Business and IT Alignment
Attributes (maturity means by level of alignment): Low High

Awareness 2.6 3.2

Policies 2.2 3.0

Technology 2.0 2.5

Skills 2.1 2.8

Responsibility 2.3 3.1

Goals 1.8 2.6

Figure 84—Statistically Significant Processes by Level of Outsourcing
Processes (maturity means by level of outsourcing): Low High

PO5B Manage the IT investment—Budgeting. 3.4 2.8

AI3 Acquire and maintain technology infrastructure. 2.4 2.8

DS6 Identify and allocate costs. 3.1 2.3

Figure 85—Statistically Significant Processes by IT Governance Structure
Processes (maturity means by structure): Centralised Decentralised Federal

AI6 Manage changes. 2.7 3.0 2.1

DS5U Ensure systems security—User access. 2.7 3.3 2.7

DS11 Manage data. 3.0 3.3 2.4

DS12 Manage the physical environment. 3.3 2.9 2.5


Figure 72 indicates that all of the attributes were statistically different for the different structures. As figure 86 shows, the means were always highest for organisations with decentralised structures.

Concluding Comments on Statistical Analysis

As mentioned earlier, there were some expectations before performing the statistical analysis as to where there would likely be differences. For example, because cultures, management practices and regulatory environments vary so much amongst countries, it was expected that process maturities would also vary widely. After doing the analysis, it was found that only about 25 percent of the processes were statistically different. One of the biggest surprises was the industry comparison, in which only two processes were statistically different. On the other side, an intriguing result was the discriminating power of alignment; 37 processes and all attributes were statistically different.

All of these findings should be considered preliminary. The fact that 51 CIOs were willing to take time out from their busy schedules to be interviewed and that their top managers also took time to participate was unprecedented. Taken as a whole, the 51 enterprises provided a diversified sample. However, they were all volunteers—not a random sample from the population of interest—so care must be taken when generalising the results of this study. Subdividing the 51 enterprises into smaller groups to conduct between-group statistical analysis also put some limitations on the ability to generalise these results. Additional research will be needed to build on these preliminary results.

Figure 86—Statistically Significant Attributes by IT Governance Structure
Attributes (maturity means by structure): Centralised Decentralised Federal

Awareness 3.0 3.2 2.7

Policies 2.7 3.1 2.3

Technology 2.3 2.8 2.1

Skills 2.6 2.7 2.3

Responsibility 2.8 2.9 2.6

Goals 2.2 2.5 2.0


6. Next Steps in the Self-assessment Process

As stated in the introduction to this report, one of the most common questions the participating IT managers had was how their organisation’s performance of IT operations compared to that of their peers. This question permeates the entire organisation. In addition to IT staff members and IT auditors, C-level executives want to know the answer because IT is an extensive investment for the organisation and is often a critical component in the enterprise’s value generation. This chapter provides broad guidance on how to conduct a process maturity self-assessment so readers of this report can compare their own process maturity numbers to their peers’. In addition, the chapter describes how to develop an action plan based on the results of the self-assessment. This methodology is broadly in line with key aspects of ITGI’s IT Governance Implementation Guide: Using CobiT and Val IT, 2nd Edition.

Collecting Data

Chapter 2 of this report explained how data were collected for this research study and how CobiT and the generic process maturity attribute table were used in assessing the levels of process maturity in IT organisations. These IT functions were made up of both CobiT adopters and enterprises that did not use CobiT. The same techniques can be used for self-assessment.

An early task is to decide the processes on which to collect data on maturity levels. Maturity level data were collected for all the processes in CobiT. This seems an appropriate approach for the early stages of rollout of an IT governance initiative. Such an assessment is part of a scoping exercise, as envisaged in phase 1 (identify needs) of the IT Governance Implementation Guide. By contrast, collection of detailed maturity data at the process level seems more appropriate later in the adoption process, such as at stage six of phase 2 (envision solution). At this stage, only a select group of processes are subject to much more in-depth analysis. In other words, the approach used, which applied the standard maturity attribute table to all processes, seems to work well early in the process of building IT governance. Later, in the rollout of IT governance, it is more appropriate to use the very detailed process-level maturity set for each of the processes in CobiT. This latter assessment should be done only for high-priority processes.

The key to self-assessment using the standard maturity attribute table is the quality of the collected data. To help ensure the accuracy of the collected data, the sources of the data (the people providing the maturity levels) must be knowledgeable so they can ascertain which descriptions in figure 89 (see appendix 1) best match their current process maturity levels. A first task is to match the CobiT processes to process owners. CobiT includes RACI (responsible, accountable, consulted and informed) charts for the key activities in each process. Matching these RACI charts to the current responsibilities of managers in IT and the business will assist in identifying process owners.

It is equally important for the sources of process maturity information to be candid in their responses. They must be assured that low numbers will not reflect badly on them. As explained in the methods discussion, it helps to ask challenge questions (e.g., Can you give me an example of…?) to ensure that the source’s numbers are accurate.

The exact self-assessment data collection process will vary by enterprise. Also, the person or group tasked to collect the data will vary. For example, IT staff or the IT internal auditor could be assigned the primary responsibility to collect the data, or it may be appropriate to hire a consultant. Whatever approach is employed, the basic data collection should include the following general steps:
1. Create a spreadsheet that lists the 41 IT processes as rows and the six attributes as columns (a short code sketch following step 4 illustrates this layout). For each process, list the possible sources of the process maturity numbers. For data collection in this study, the sources were CIOs and top managers. It is not mandatory that these people be the sources of process maturity numbers, but the sources must have a deep understanding of the processes for which they are providing data. Since any one source might have from three to 10 processes for which he/she is responsible, the list of processes would generally have a total of three to five different names. Similarly, some processes will have multiple owners.

2. Schedule interviews with the people identified in step 1. Depending on the number of processes associated with each person, each interview usually requires 45 to 60 minutes and, exceptionally, up to a couple of hours.

3. At the start of each interview, provide a brief description of the project and the protocol for the interview. Give the interviewee a copy of the generic process maturity attribute table and figure 89 in appendix 1.

4. Taking one process at a time, introduce the process control objectives and summarise the detailed control objective for the interviewee. Ask the interviewee to read the attribute descriptions in the process maturity attribute table and then correlate the process maturity level for each of the six attributes with the state of the particular enterprise’s maturity for that process. The interviewee should then state out loud the maturity level for each of the six attributes. The interviewer would record the interviewee’s responses. Allow the interviewee to select a level in between the given discrete levels of process maturity (e.g., 1.6, 2.4).
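A minimal sketch of the capture sheet from steps 1 and 4, assuming pandas is available; the process codes are abbreviated to three for brevity and the recorded levels are invented:

```python
import pandas as pd

ATTRIBUTES = ["Awareness", "Policies", "Technology",
              "Skills", "Responsibility", "Goals"]
# In practice, list all 41 CobiT processes; three are shown for brevity.
PROCESSES = ["PO1", "DS5P", "ME4"]

# Step 1: an empty sheet with processes as rows and attributes as columns.
sheet = pd.DataFrame(index=PROCESSES, columns=ATTRIBUTES, dtype=float)

# Step 4: record the levels the process owner states in the interview;
# values between the discrete levels (e.g., 1.6, 2.4) are allowed.
sheet.loc["PO1"] = [3.0, 2.4, 2.0, 2.5, 3.0, 1.6]

sheet.to_csv("maturity_self_assessment.csv")  # or to_excel(...) for a workbook
```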


There may be a suggestion to give the spreadsheet to the interviewees and let them complete the spreadsheet over a period of time and return it. This approach is not recommended for two reasons. First, they are likely to put the spreadsheet aside and take a long time to complete it. Second, and more important, they may overthink the answers. Their first responses are desired. If they think about it, they may raise some of the numbers because they worry about the political aspects of their answers. As said before, to help ensure the quality of the process maturity levels responses, seek additional information and challenge the interviewee on a subset of the processes. When the interviewee is being interviewed on several processes, it is particularly important to ensure that the maturity levels are measured correctly early in the data collection process. Examples of validation questions include ‘How is management’s awareness of this process communicated to the IT organisation?’ and ‘What are some of the tools and technologies supporting this process?’ An established pattern is then used for the other processes. This registration and validation process has to be handled judiciously; data collection could take an unacceptable amount of time if every data point is challenged.

Comparing Results

Steps 5 through 7 deal with comparing an organisation’s results to the benchmark data included in this report:
5. Load the numbers (at least the medians) from figure 90 in appendix 2 into the spreadsheet created previously. It is NOT recommended to load the numbers first because the interviewees may be tempted to match those numbers.
6. If all of the enterprise’s numbers comfortably exceed those in the spreadsheet, it can be a role model for other IT operations. If that is not the situation, the next step is to drill down into the data to help pinpoint specific areas in need of improvement. For example, compare the enterprise’s results with the industry, geographic location and other figures in appendix 2 to determine where the enterprise is below its peers for these different processes and specific attributes within processes.
7. Make a list of process/attribute combinations where the enterprise is below its peers. Prioritise the list of process/attribute combinations based on a risk assessment. The process/attribute combinations that are more critical to the enterprise should be at the top of the list and should be the first to receive additional attention. This is directly analogous to step 7.1 ‘Determine target capability maturity’ in phase 2 (envision solution) in ITGI’s IT Governance Implementation Guide.
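At their core, steps 5 through 7 are a subtraction and a sort. The sketch below uses placeholder numbers, not the actual medians from figure 90, to show how the below-peer process/attribute combinations can be listed with the largest gaps first; a real prioritisation would then reorder this list by the risk assessment that step 7 calls for.

```python
import pandas as pd

# Step 5: benchmark medians per process/attribute (figure 90 in appendix 2).
# All numbers below are placeholders, not the report's actual medians.
benchmark = pd.DataFrame({"Policies": [2.7, 2.9], "Goals": [2.1, 2.4]},
                         index=["PO1", "DS5P"])
own = pd.DataFrame({"Policies": [2.0, 3.1], "Goals": [1.5, 2.0]},
                   index=["PO1", "DS5P"])

# Steps 6-7: keep only process/attribute combinations below the peer median
# and sort by the size of the shortfall.
gaps = (benchmark - own).stack()
shortfalls = gaps[gaps > 0].sort_values(ascending=False)
print(shortfalls)
```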

Improving Process Maturity Levels

Step 8 deals with improving maturity levels:
8. Set target process/attribute levels for the process/attribute combinations that the enterprise wants to improve, read the characteristics of those targeted levels from the descriptions in figure 89, and determine what actions are needed to achieve the new process maturity levels. Guidance on these actions is given in phase 3 (plan solution), phase 4 (implement solution) and phase 5 (operationalise solution) in ITGI’s IT Governance Implementation Guide.


7. Conclusion

This project set out to achieve the following objectives:
• Collect process maturity data from a wide variety of organisations to develop preliminary benchmarks for each maturity attribute/IT process combination.
• Collect IT demographics to perform an initial analysis of process maturity measures vs. IT demographics as a starting point for benchmarking profiles for different demographic combinations.
• Provide guidance for enterprises to conduct their own self-assessment to compare their process maturity measures with the benchmarks.

Through the help of many people, the team was able to locate 51 CIOs and their top managers from around the world who were willing to take time from their busy schedules to meet and provide a robust data set to achieve the first two objectives. A concern going into this project was that these senior-ranking people would be reluctant to give the researchers their candid self-assessments. Another concern was that this would be particularly apparent when some of those maturity assessments were very low. However, the number of level 0 and level 1 responses collected supports the conclusion that the participants were candid in their responses.

In terms of the relative maturity levels for the 41 processes, figure 87 illustrates the findings in the form of a heat map based on the means (averages) of the responses from the 51 enterprises. The red cells represent the top third maturity levels, the pink cells represent the middle third, and the white cells represent the bottom third. Comparing the rows (processes) first, PO5B Manage the IT investment—Budgeting is red (in the highest third) across all six attributes, indicating the highest level of maturity. Most large enterprises, including those in this study, have a long history of having a formal budgeting process for the entire enterprise. Therefore, it is not a surprise that the budget process has this high level of maturity. The enterprise-wide process is probably highly structured with routine processes and budget templates to be completed each year. PO5B was the only process that was red for all six attributes, but several processes were at the highest level for five attributes and medium for the goals attribute, including:
• DS5NF Ensure systems security—Network and firewall
• DS5V Ensure systems security—Virus
• DS8 Manage service desk and incidents
• DS11 Manage data
• DS12 Manage the physical environment

It is interesting to note that these processes are all in the Deliver and Support (DS) domain. In many ways, these are daily, operational issues. An enterprise’s networks are constantly under attack from the outside, so firewall (DS5NF) and virus (DS5V) processes must be mature or the enterprise will quickly find its networks infected and shut down. Service desks (DS8) are being inundated with questions and incidents from users—sometimes 24/7. If this service function is not operating at a high level of efficiency, the productivity of the entire enterprise will suffer. This process has benefited from ITIL adoption by several of the enterprises in the study. The management of data and media (DS11) and the physical environment (DS12) have a long history, going back to major control issues associated with mainframe computers. These types of processes and associated problems are highly visible and well understood.

At the other end of the spectrum, PO2D Define the information architecture—Data classification and PO5V Manage the IT investment—Value management are white (in the lowest third) for all six attributes. A reason why PO2D is at the low end may be that the process of classifying data requires integration of different technology and application platforms, with roles and responsibilities covering both IT and the business. PO5V is the process of answering the question ‘How does IT add value to the enterprise?’ Management of value generation by IT is challenging and multidimensional. The formal results, reinforced by discussions with IT managers during data collection, indicate that there is little emphasis on answering these value questions in most enterprises.

Moving up only slightly, two processes were in the lowest third for five attributes: PO8 Manage quality and DS1 Define and manage service levels. The low level of these two processes was unexpected considering the emphasis given to quality control in the IT literature. It indicates that these two processes have not moved much past the ad hoc level.

A review of the columns in figure 87 (figure 5 repeated here for convenience) indicates the general maturity levels for the six attributes; the extremes are quite dramatic. The awareness attribute was in the top third for 28 (68 percent) of the 41 processes and was in the lowest third for only two (5 percent) of the 41 processes. On the other hand, the goals attribute was essentially the mirror image of the awareness attribute; it was in the lowest third for 28 (68 percent) of the processes and in the top third for only one (2 percent) of the processes. Clearly, the setting of goals and metrics is a challenging task for many IT organisations. The finding for the goals attribute matched the relatively low level of maturity for each of the processes in the Monitor and Evaluate (ME) domain.
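The colour coding in figure 87 is a simple tertile split of the mean maturity values. A sketch of that classification for a handful of invented means (one attribute’s column of the heat map):

```python
import pandas as pd

# Invented mean maturity values for one attribute across several processes.
means = pd.Series({"PO5B": 3.3, "PO2D": 1.7, "DS12": 3.1, "DS1": 1.9,
                   "DS8": 3.0, "ME1": 2.1, "AI2": 2.8, "PO8": 2.0, "DS11": 3.2})

# Bottom third -> white, middle third -> pink, top third -> red.
colours = pd.qcut(means, q=3, labels=["white", "pink", "red"])
print(colours.sort_values())
```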


[Figure 87—Summary Heat Map: rows are the 41 processes (PO1 Define a strategic IT plan through ME4 Provide IT governance); columns are the Awareness, Policies, Technology, Skills, Responsibility and Goals attributes; red cells mark the top third of maturity means, pink the middle third and white the bottom third.]

7. Conclusion

The other relatively high-level attribute was responsibility, which was in the top third for 51 percent of the processes. The other relatively low attribute was technology, which was in the bottom third for 56 percent of the processes.

The technology attribute essentially addresses the extent to which tools and automation are used to manage the enterprise’s technology. It is striking that the use of technology to manage technology is not more mature within the IT function itself.

From 0 to 5 and Everything in Between

The wide distribution of responses was intriguing considering that all of the 51 enterprises were mature (they generally have a long history). Figure 88 (identical to figure 7, but duplicated here for convenience) shows the distribution of maturity levels for the six attributes. For the technology and goals attributes, 3.0 and 2.8 percent, respectively, indicated level 0. Combining levels 0 and 1 shows that 28.5 and 28.0 percent of enterprises selected level 0 or 1 for the technology and goals attributes, respectively. These were followed by 14.5 percent for the policies attribute and 14.3 percent for the skills attribute. On the other hand, many enterprises were operating at levels 4 and 5. At the high end, 36.5 percent of enterprises selected 4 or 5 for the awareness attribute and 30.1 percent selected 4 or 5 for the responsibilities attribute. Even for the attributes with high frequencies of levels 0 and 1 (technology and goals), the frequencies of levels 4 and 5 were respectable: 21.9 percent for the technology attribute and 17.7 percent for the goals attribute.

While figure 88 summarises the responses at the attribute level, looking at them from a process perspective gives similarly mixed results. For 33 processes, at least one respondent reported level 0 for at least one attribute. The highest frequency of level 0 was five (of the 51 enterprises), for the technology attribute of PO4P; there were four level-0 responses each for the technology attribute of PO3, PO5V and ME3, and for the goals attribute of PO2D and ME2.

For the non-zero levels (1 through 5), every process received at least one response at each of those levels.

This information shows that maturity levels of 4 and 5 are achievable. Given the low levels for some processes, and for specific attributes within some processes, the first reaction might be to say that enterprises should focus more resources on those processes and attributes to raise their maturity levels. However, one could argue that each of these processes has evolved over time to a level the enterprise considers sufficient (or adequate, or satisfactory). This is called a satisficing strategy, where the goal is to reach an adequate level rather than an optimum level. It can be very difficult to define and measure what an optimum level would be, and a satisficing strategy can sometimes come close to an optimising one. The satisficing approach could be considered a corollary of the squeaky-wheel cliché: management stops applying oil when the squeaking stops and moves on to the next squeaky wheel. This is not to promote the strategy; it is simply to note that satisficing appears to be the dominant strategy in many enterprises. In the extreme, it is pejoratively called fire fighting.

This study does not claim that every enterprise has achieved the appropriate level for every process, whatever its current levels may be. Rather, it revisits the point made at the beginning of the report that, at an intuitive level, enterprises cannot justify the cost of pushing everything to level 5. Since levels 4 and 5 are achievable by a wide cross-section of enterprises, however, that still leaves the question, ‘At what levels should the industry/enterprise be?’

This project achieved the research objective of developing robust benchmark information and providing a means for enterprises to answer the question, ‘How do we compare with our peers?’ Future research can build on this research to answer the more normative question, ‘At what levels should we be?’ By conducting focus groups, case studies and surveys, deeper and wider data can be collected to move from a satisficing strategy to an optimising strategy.


Figure 88—Distribution of Maturity Levels

Attributes          Maturity Levels
                    0      1      2      3      4      5
Awareness           0.4%   8.3%  22.3%  32.6%  29.0%  7.5%
Policies            0.8%  13.7%  30.6%  27.5%  21.0%  6.3%
Technology          3.0%  25.5%  26.6%  23.0%  17.4%  4.5%
Skills              0.3%  14.0%  32.7%  32.2%  16.7%  4.1%
Responsibilities    0.4%  10.9%  26.0%  32.7%  23.2%  6.9%
Goals               2.8%  25.2%  29.9%  24.4%  14.9%  2.8%
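The combined low-end and high-end shares quoted in the text (e.g., 28.5 percent at levels 0 and 1 for technology) follow directly from the figure 88 rows; a minimal recomputation:

```python
# Sketch: recompute the combined low-end (levels 0 and 1) and high-end
# (levels 4 and 5) shares quoted in the text from the figure 88 rows.
dist = {  # attribute -> percentage of responses at levels 0..5
    "Awareness":        [0.4,  8.3, 22.3, 32.6, 29.0, 7.5],
    "Policies":         [0.8, 13.7, 30.6, 27.5, 21.0, 6.3],
    "Technology":       [3.0, 25.5, 26.6, 23.0, 17.4, 4.5],
    "Skills":           [0.3, 14.0, 32.7, 32.2, 16.7, 4.1],
    "Responsibilities": [0.4, 10.9, 26.0, 32.7, 23.2, 6.9],
    "Goals":            [2.8, 25.2, 29.9, 24.4, 14.9, 2.8],
}

for attr, pct in dist.items():
    low, high = pct[0] + pct[1], pct[4] + pct[5]
    print(f"{attr:<16} levels 0-1: {low:4.1f}%   levels 4-5: {high:4.1f}%")
# e.g., Technology -> 28.5% at levels 0-1 and 21.9% at levels 4-5
```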


Appendix 1—Maturity Attribute Table (Source: Figure 15, CobiT 4.1)

Figure 89—Maturity Attribute Table

Level 1
Awareness and Communication: Recognition of the need for the process is emerging. There is sporadic communication of the issues.
Policies, Plans and Procedures: There are ad hoc approaches to processes and practices. The process and policies are undefined.
Tools and Automation: Some tools may exist; usage is based on standard desktop tools. There is no planned approach to the tool usage.
Skills and Expertise: Skills required for the process are not identified. A training plan does not exist and no formal training occurs.
Responsibility and Accountability: There is no definition of accountability and responsibility. People take ownership of issues based on their own initiative on a reactive basis.
Goal Setting and Measurement: Goals are not clear and no measurement takes place.

Level 2
Awareness and Communication: There is awareness of the need to act. Management communicates the overall issues.
Policies, Plans and Procedures: Similar and common processes emerge, but are largely intuitive because of individual expertise. Some aspects of the process are repeatable because of individual expertise, and some documentation and informal understanding of policy and procedures may exist.
Tools and Automation: Common approaches to use of tools exist but are based on solutions developed by key individuals. Vendor tools may have been acquired, but are probably not applied correctly, and may even be shelfware.
Skills and Expertise: Minimum skill requirements are identified for critical areas. Training is provided in response to needs, rather than on the basis of an agreed plan, and informal training on the job occurs.
Responsibility and Accountability: An individual assumes his/her responsibility and is usually held accountable, even if this is not formally agreed. There is confusion about responsibility when problems occur, and a culture of blame tends to exist.
Goal Setting and Measurement: Some goal setting occurs; some financial measures are established but are known only by senior management. There is inconsistent monitoring in isolated areas.

Level 3
Awareness and Communication: There is understanding of the need to act. Management is more formal and structured in its communication.
Policies, Plans and Procedures: Usage of good practices emerges. The process, policies and procedures are defined and documented for all key activities.
Tools and Automation: A plan has been defined for use and standardisation of tools to automate the process. Tools are being used for their basic purposes, but may not all be in accordance with the agreed plan, and may not be integrated with one another.
Skills and Expertise: Skill requirements are defined and documented for all areas. A formal training plan has been developed, but formal training is still based on individual initiatives.
Responsibility and Accountability: Process responsibility and accountability are defined and process owners have been identified. The process owner is unlikely to have the full authority to exercise the responsibilities.
Goal Setting and Measurement: Some effectiveness goals and measures are set, but are not communicated, and there is a clear link to business goals. Measurement processes emerge, but are not consistently applied. IT balanced scorecard ideas are being adopted, as is occasional intuitive application of root cause analysis.

Level 4
Awareness and Communication: There is understanding of the full requirements. Mature communication techniques are applied and standard communication tools are in use.
Policies, Plans and Procedures: The process is sound and complete; internal best practices are applied. All aspects of the process are documented and repeatable. Policies have been approved and signed off on by management. Standards for developing and maintaining the processes and procedures are adopted and followed.
Tools and Automation: Tools are implemented according to a standardised plan, and some have been integrated with other related tools. Tools are being used in main areas to automate management of the process and monitor critical activities and controls.
Skills and Expertise: Skill requirements are routinely updated for all areas, proficiency is ensured for all critical areas, and certification is encouraged. Mature training techniques are applied according to the training plan, and knowledge sharing is encouraged. All internal domain experts are involved, and the effectiveness of the training plan is assessed.
Responsibility and Accountability: Process responsibility and accountability are accepted and working in a way that enables a process owner to fully discharge his/her responsibilities. A reward culture is in place that motivates positive action.
Goal Setting and Measurement: Efficiency and effectiveness are measured and communicated and linked to business goals and the IT strategic plan. The IT balanced scorecard is implemented in some areas, with exceptions noted by management, and root cause analysis is being standardised. Continuous improvement is emerging.

Level 5
Awareness and Communication: There is advanced, forward-looking understanding of requirements. Proactive communication of issues based on trends exists, mature communication techniques are applied, and integrated communication tools are in use.
Policies, Plans and Procedures: External best practices and standards are applied. Process documentation is evolved to automated workflows. Processes, policies and procedures are standardised and integrated to enable end-to-end management and improvement.
Tools and Automation: Standardised tool sets are used across the enterprise. Tools are fully integrated with other related tools to enable end-to-end support of the processes. Tools are being used to support improvement of the process and automatically detect control exceptions.
Skills and Expertise: The organisation formally encourages continuous improvement of skills, based on clearly defined personal and organisational goals. Training and education support external best practices and use of leading-edge concepts and techniques. Knowledge sharing is an enterprise culture, and knowledge-based systems are being deployed. External experts and industry leaders are used for guidance.
Responsibility and Accountability: Process owners are empowered to make decisions and take action. The acceptance of responsibility has been cascaded down throughout the organisation in a consistent fashion.
Goal Setting and Measurement: There is an integrated performance measurement system linking IT performance to business goals by global application of the IT balanced scorecard. Exceptions are globally and consistently noted by management and root cause analysis is applied. Continuous improvement is a way of life.

Appendix 2—Details of Maturity Levels



This appendix includes a series of tables. Figure 90 includes the average maturity levels for the 51 enterprises as a group. Figures 91 through 102 include data divided by various demographic variables.

Figure 90—Average Maturity Levels for the Complete Study Population

Processes

Attributes

Awareness Policies Technology Skills Responsibility Goals Averages

PO1 Define a strategic IT plan. 2.89 2.26 1.72 2.29 2.69 2.36 2.37

PO2A Define the information architecture—Architecture. 2.71 2.17 1.95 2.30 2.39 1.74 2.21

PO2D Define the information architecture—Data classification. 2.34 2.05 2.00 2.16 2.21 1.62 2.06

PO3 Determine technological direction. 2.88 2.39 1.68 2.51 2.53 2.04 2.34

PO4O Define the IT processes—Organisation. 2.95 2.55 1.93 2.48 2.81 2.17 2.48

PO4P Define the IT processes—Processes. 2.87 2.42 1.70 2.29 2.65 2.12 2.34

PO5B Manage the IT investment—Budgeting. 3.41 3.03 2.81 2.86 3.15 2.73 3.00

PO5V Manage the IT investment—Value management. 2.26 1.93 1.49 1.89 2.09 1.94 1.93

PO6 Communicate management aims and direction. 2.77 2.50 2.10 2.32 2.54 1.94 2.36

PO7 Manage IT human resources. 3.10 2.98 2.49 2.67 3.01 2.39 2.77

PO8 Manage quality. 2.40 2.26 1.87 2.25 2.35 1.88 2.17

PO9 Assess and manage IT risks. 2.62 2.18 1.75 2.21 2.38 1.91 2.18

PO10PG Manage projects—Programme. 2.71 2.35 2.10 2.28 2.66 2.39 2.42

PO10PJ Manage projects—Projects. 3.28 3.06 2.63 2.72 3.03 2.58 2.88

AI1 Identify automated solutions. 2.95 2.67 2.08 2.45 2.68 2.23 2.51

AI2 Acquire and maintain application software. 3.29 3.04 2.52 2.87 3.20 2.59 2.92

AI3 Acquire and maintain technology infrastructure. 3.15 2.79 2.25 2.60 2.98 2.37 2.69

AI4 Enable operation and use. 2.87 2.60 2.15 2.51 2.71 2.12 2.49

AI5 Procure IT resources. 3.34 3.10 2.53 2.64 3.07 2.24 2.82

AI6 Manage changes. 3.08 2.92 2.50 2.46 2.73 2.04 2.62

AI7 Install and accredit solutions and changes. 3.12 2.81 2.39 2.65 2.92 2.20 2.68

DS1 Define and manage service levels. 2.54 2.13 1.88 2.15 2.26 1.94 2.15

DS2 Manage third-party services. 2.97 2.65 2.12 2.56 2.88 2.24 2.57

DS3 Manage performance and capacity. 2.82 2.29 2.40 2.53 2.78 2.20 2.50

DS4 Ensure continuous service. 2.94 2.59 2.38 2.52 2.69 2.29 2.57

DS5NF Ensure systems security—Network and firewall. 3.27 2.93 3.00 2.97 3.06 2.48 2.95

DS5P Ensure systems security—Policy. 2.97 2.86 2.30 2.68 3.03 2.32 2.69

DS5U Ensure systems security—User access. 3.14 2.88 2.63 2.72 3.07 2.40 2.81

DS5V Ensure systems security—Virus. 3.37 3.15 3.40 2.98 3.20 2.59 3.12

DS6 Identify and allocate costs. 2.94 2.54 2.43 2.59 2.84 2.09 2.57

DS7 Educate and train users. 2.66 2.55 2.24 2.60 2.70 2.26 2.50

DS8 Manage service desk and incidents. 3.22 2.76 2.98 2.79 3.09 2.65 2.92

DS9 Manage the configuration. 2.60 2.34 2.25 2.43 2.64 1.88 2.35

DS10 Manage problems. 2.80 2.52 2.50 2.54 2.78 2.18 2.55

DS11 Manage data. 3.16 3.04 3.04 2.77 3.11 2.42 2.92

DS12 Manage the physical environment. 3.53 3.14 3.06 3.00 3.33 2.50 3.09


Figure 90—Average Maturity Levels for the Complete Study Population (cont.)

Processes

Attributes

Awareness Policies Technology Skills Responsibility Goals Averages

DS13 Manage operations. 3.19 2.91 2.72 2.67 3.16 2.46 2.85

ME1 Monitor and evaluate IT performance. 2.59 2.12 1.73 2.10 2.38 2.00 2.15

ME2 Monitor and evaluate internal control. 2.61 2.42 1.96 2.17 2.47 2.01 2.27

ME3 Ensure compliance with external requirements. 2.65 2.36 1.78 2.10 2.50 2.16 2.26

ME4 Provide IT governance. 2.61 2.42 1.77 2.27 2.63 2.04 2.29
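The ‘Averages’ column in figure 90 is simply the mean of the six attribute scores for each process; a one-line check for PO1:

```python
# Sketch: the 'Averages' column in figure 90 is the mean of the six
# attribute scores for each process (PO1 values copied from the table).
from statistics import mean

po1 = {"Awareness": 2.89, "Policies": 2.26, "Technology": 1.72,
       "Skills": 2.29, "Responsibility": 2.69, "Goals": 2.36}

print(round(mean(po1.values()), 2))  # 2.37, matching the reported average
```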


Figure 91—Awareness, Policies and Technology Attributes: Maturity by Geographic Location

(Mean maturity scores for the awareness, policies and technology attributes, shown for each of the 41 processes by geographic location: AGS, CN, MX, PH, SG and USA. The location differences were statistically significant for all three attributes.)

Figure 92—Skills, Responsibilities and Goals Attributes: Maturity by Geographic Location

(Mean maturity scores for the skills, responsibilities and goals attributes, shown for each of the 41 processes by geographic location: AGS, CN, MX, PH, SG and USA. The location differences were statistically significant for the responsibilities and goals attributes but not for skills.)


Figure 93—Overall Maturity by Geographic Location

Processes

Location

AGS CN MX PH SG USA   Statistically Significant

PO1 Define a strategic IT plan. 2.3 3.2 2.3 2.0 2.4 2.5

PO2A Define the information architecture—Architecture. 1.9 2.3 1.5 2.1 2.5 2.2

PO2D Define the information architecture—Data classification. 1.5 2.1 1.5 1.7 2.5 2.3

PO3 Determine technological direction. 2.2 2.5 1.9 2.0 2.7 2.5

PO4O Define the IT processes—Organisation. 2.7 3.5 1.8 1.9 2.9 2.6

PO4P Define the IT processes—Processes. 2.2 3.1 1.7 1.7 2.5 2.6

PO5B Manage the IT investment—Budgeting. 3.3 3.4 2.1 3.1 2.9 3.0

PO5V Manage the IT investment—Value management. 2.0 2.8 1.1 1.7 2.7 1.7

PO6 Communicate management aims and direction. 2.5 3.0 1.8 1.8 2.8 2.5

PO7 Manage IT human resources. 2.8 3.5 2.5 2.3 3.5 2.7

PO8 Manage quality. 1.8 2.4 1.7 1.9 3.2 2.3

PO9 Assess and manage IT risks. 2.2 3.1 1.7 1.9 2.7 2.1

PO10PG Manage projects—Programme. 3.0 3.3 2.3 1.9 2.1 2.5

PO10PJ Manage projects—Projects. 3.2 3.3 2.2 2.4 3.3 2.9

AI1 Identify automated solutions. 2.4 3.7 1.6 2.2 2.9 2.5 Yes

AI2 Acquire and maintain application software. 2.7 3.3 2.3 2.5 3.5 3.0

AI3 Acquire and maintain technology infrastructure. 2.3 3.2 2.1 2.4 3.0 2.9

AI4 Enable operation and use. 2.4 3.0 2.4 2.0 3.2 2.4

AI5 Procure IT resources. 2.8 3.5 2.3 3.1 3.7 2.6 Yes

AI6 Manage changes. 2.3 3.1 2.1 2.5 3.5 2.7 Yes

AI7 Install and accredit solutions and changes. 2.3 2.8 2.0 2.4 3.4 2.8

DS1 Define and manage service levels. 1.9 2.5 1.8 2.4 2.5 2.2

DS2 Manage third-party services. 2.2 3.0 2.1 2.7 3.6 2.3

DS3 Manage performance and capacity. 2.1 2.3 2.0 2.5 3.2 2.4

DS4 Ensure continuous service. 2.0 3.1 1.9 2.6 3.1 2.7

DS5NF Ensure systems security—Network and firewall. 2.4 3.8 2.2 2.4 3.5 3.2 Yes

DS5P Ensure systems security—Policy. 2.5 3.5 2.3 2.0 3.1 2.9 Yes

DS5U Ensure systems security—User access. 2.7 3.5 1.9 2.3 3.3 3.0 Yes

DS5V Ensure systems security—Virus. 2.9 3.7 2.6 2.6 3.6 3.2

DS6 Identify and allocate costs. 3.2 3.9 1.2 2.3 2.7 2.4 Yes

DS7 Educate and train users. 2.3 3.2 2.4 2.5 3.0 2.2

DS8 Manage service desk and incidents. 3.0 3.3 2.2 2.6 3.8 2.9

DS9 Manage the configuration. 2.2 2.4 1.9 2.2 2.6 2.5

DS10 Manage problems. 2.0 2.3 1.9 2.6 3.7 2.7 Yes

DS11 Manage data. 2.8 3.2 1.9 3.0 3.2 2.9

DS12 Manage the physical environment. 3.3 3.3 2.5 2.6 3.7 3.1

DS13 Manage operations. 2.8 3.3 1.9 2.9 3.1 2.8

ME1 Monitor and evaluate IT performance. 1.9 2.7 2.1 2.2 2.4 2.2

ME2 Monitor and evaluate internal control. 1.3 3.0 1.5 1.9 2.7 2.6 Yes

ME3 Ensure compliance with external requirements. 1.4 3.4 1.1 2.3 2.7 2.4 Yes

ME4 Provide IT governance. 1.7 3.6 1.5 2.1 2.6 2.4 Yes

Overall 2.4 3.1 1.9 2.3 3.0 2.6
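The report flags certain rows as statistically significant without naming the test applied. As an illustration only—assuming SciPy is available and using made-up per-enterprise scores—a Kruskal-Wallis test is one plausible way to compare maturity levels across the six locations:

```python
# Sketch: one plausible way to test for location differences. The report
# does not name the test it applied; this uses SciPy's Kruskal-Wallis test
# on hypothetical per-enterprise maturity scores for a single process.
from scipy.stats import kruskal

groups = {  # location -> made-up maturity scores of its enterprises
    "AGS": [2, 3, 2, 2], "CN": [4, 3, 4, 4], "MX": [1, 2, 2, 1],
    "PH":  [2, 2, 3, 2], "SG": [3, 3, 3, 2], "USA": [3, 2, 3, 2],
}

stat, p = kruskal(*groups.values())
print(f"H = {stat:.2f}, p = {p:.3f}")  # 'statistically significant' if p < 0.05
```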


Figure 94—Maturity Level for Emerging vs. Developed Countries

(Mean maturity scores for each process in emerging (E) versus developed (D) countries, across the awareness, policies, technology, skills, responsibility and goals attributes and overall. The differences between emerging and developed countries were statistically significant for all six attributes and for the overall scores.)

Figure 95—Awareness, Policies, Technology and Skills Attributes: Maturity Level by Industry

(Mean maturity scores for the awareness, policies, technology and skills attributes, shown for each process by industry: Cap, Util, Srv, Fin and Govt. The industry differences were statistically significant for the awareness, policies and skills attributes but not for technology.)

Figure 96—Responsibility, Goals and Overall Attributes: Maturity Level by Industry
Columns within each attribute: Cap, Util, Srv, Fin, Govt. A trailing Yes marks a statistically significant process.

Processes | Responsibility | Goals | Overall
PO1 Define a strategic IT plan. | 2.5 3.0 2.7 2.7 2.6 | 2.3 3.3 1.8 2.3 2.5 | 1.9 2.9 2.3 2.4 2.4
PO2A Define the information architecture—Architecture. | 2.4 2.0 2.4 2.9 2.0 | 1.5 1.7 1.6 2.1 1.7 | 1.8 2.1 2.1 2.5 2.1
PO2D Define the information architecture—Data classification. | 2.3 2.4 2.4 2.0 2.2 | 1.8 2.1 1.5 1.5 1.5 | 1.8 2.1 2.2 1.9 2.0
PO3 Determine technological direction. | 2.2 3.0 2.4 2.7 2.3 | 1.7 2.8 1.9 2.0 2.1 | 1.9 2.6 2.3 2.4 2.4
PO4O Define the IT processes—Organisation. | 2.4 3.7 2.8 2.8 2.8 | 2.0 3.1 1.9 2.2 2.1 | 2.4 3.0 2.5 2.4 2.5
PO4P Define the IT processes—Processes. | 2.5 3.7 2.3 2.6 2.6 | 1.8 2.9 1.8 2.2 2.3 | 1.9 3.0 2.2 2.3 2.2
PO5B Manage the IT investment—Budgeting. | 2.4 3.7 3.2 3.7 2.6 | 2.4 3.0 2.8 3.3 2.2 | 2.9 3.1 2.9 3.4 2.6
PO5V Manage the IT investment—Value management. | 1.8 2.9 2.1 2.0 1.9 | 1.5 2.7 2.0 1.9 1.8 | 1.5 2.3 2.1 1.8 1.9
PO6 Communicate management aims and direction. | 2.4 3.2 2.4 2.4 2.5 | 1.6 2.9 1.8 1.9 1.9 | 2.1 3.0 2.3 2.3 2.4
PO7 Manage IT human resources. | 2.9 3.5 3.3 2.9 2.8 | 2.0 3.1 2.5 2.3 2.5 | 2.4 3.1 3.1 2.6 2.9
PO8 Manage quality. | 1.9 2.9 2.7 2.1 2.6 | 1.2 2.2 1.9 2.1 2.1 | 1.6 2.4 2.3 2.1 2.5
PO9 Assess and manage IT risks. | 2.5 2.6 2.4 2.3 2.2 | 2.0 2.2 1.7 2.0 1.7 | 2.0 2.3 2.3 2.3 1.9
PO10PG Manage projects—Programme. | 2.6 3.1 2.3 2.9 2.4 | 2.5 2.9 2.2 2.5 1.9 | 2.3 2.9 2.6 2.5 2.1
PO10PJ Manage projects—Projects. | 2.8 3.5 3.5 3.0 2.8 | 2.7 2.8 2.8 2.5 2.3 | 2.5 3.2 3.3 2.8 2.7
AI1 Identify automated solutions. | 2.3 3.3 2.7 2.7 2.7 | 1.9 3.2 1.9 2.2 2.3 | 2.1 2.9 2.4 2.6 2.5
AI2 Acquire and maintain application software. | 3.2 3.3 3.3 3.2 3.2 | 2.6 3.0 2.4 2.6 2.5 | 2.8 2.9 2.8 2.9 2.9
AI3 Acquire and maintain technology infrastructure. | 3.0 3.4 3.2 3.0 2.6 | 2.4 2.8 1.9 2.4 2.3 | 2.4 2.9 2.8 2.7 2.6
AI4 Enable operation and use. | 3.2 2.8 2.6 2.3 2.9 | 2.4 2.2 1.9 2.0 2.1 | 2.7 2.2 2.6 2.2 2.6
AI5 Procure IT resources. | 2.9 3.2 3.1 2.9 3.5 | 2.1 2.8 1.7 2.3 2.5 | 2.5 3.2 2.8 2.8 3.1
AI6 Manage changes. | 2.4 3.4 2.4 2.8 2.9 | 2.1 3.6 1.5 1.8 1.8 | 2.4 3.1 2.4 2.6 2.7
AI7 Install and accredit solutions and changes. | 2.7 3.3 3.3 2.9 2.6 | 2.2 2.4 2.1 2.2 2.1 | 2.3 2.6 3.1 2.7 2.5
DS1 Define and manage service levels. | 2.4 2.8 1.8 2.2 2.3 | 2.3 3.0 1.4 1.8 1.7 | 2.2 2.4 1.8 2.1 2.0


Figure 96—Responsibility, Goals and Overall Attributes: Maturity Level by Industry (cont.)
Columns within each attribute: Cap, Util, Srv, Fin, Govt. A trailing Yes marks a statistically significant process.

Processes | Responsibility | Goals | Overall
DS2 Manage third-party services. | 2.4 3.0 2.9 3.1 2.9 | 2.1 2.4 1.7 2.4 2.5 | 2.2 2.1 2.5 2.7 2.8
DS3 Manage performance and capacity. | 2.8 3.5 2.6 2.5 2.8 | 2.4 2.9 1.4 2.2 2.1 | 2.4 2.6 2.0 2.4 2.4
DS4 Ensure continuous service. | 2.3 3.1 2.3 3.1 2.4 | 2.1 3.1 1.6 2.8 1.7 | 2.2 2.6 2.0 3.0 2.2
DS5NF Ensure systems security—Network and firewall. | 2.6 3.3 3.2 3.2 3.1 | 2.5 2.9 1.8 2.6 2.5 | 2.6 2.7 3.0 3.0 3.0
DS5P Ensure systems security—Policy. | 2.6 3.3 3.3 3.1 2.9 | 2.2 2.3 2.4 2.6 1.9 | 2.5 2.5 2.9 2.8 2.5
DS5U Ensure systems security—User access. | 2.8 3.6 3.1 3.0 3.1 | 2.4 3.0 2.1 2.4 2.3 | 2.5 3.0 2.9 2.8 2.8
DS5V Ensure systems security—Virus. | 2.9 3.3 3.4 3.3 3.0 | 2.8 3.0 1.9 2.8 2.3 | 2.7 3.2 3.4 3.2 2.9
DS6 Identify and allocate costs. | 2.5 3.7 2.8 3.0 2.5 | 1.8 3.4 1.7 2.3 1.9 | 2.3 3.2 2.6 2.7 2.3
DS7 Educate and train users. | 2.8 3.2 2.7 2.6 2.4 | 2.3 3.0 2.0 2.4 1.8 | 2.3 2.7 2.6 2.4 2.3
DS8 Manage service desk and incidents. | 3.1 3.3 3.1 3.1 2.8 | 2.5 3.4 2.4 2.7 2.7 | 2.7 3.2 3.0 3.0 2.8
DS9 Manage the configuration. | 2.3 3.2 2.5 2.6 2.8 | 1.8 2.7 1.3 2.0 1.7 | 1.9 3.0 2.2 2.4 2.2
DS10 Manage problems. | 2.7 2.9 2.5 2.7 3.3 | 2.0 2.9 1.6 2.1 2.5 | 2.1 2.6 2.2 2.6 2.9
DS11 Manage data. | 3.1 3.8 3.6 2.9 2.6 | 2.6 3.0 2.4 2.4 1.9 | 2.7 3.2 3.3 2.8 2.4
DS12 Manage the physical environment. | 3.1 3.5 4.1 3.3 3.0 | 2.4 3.0 2.4 2.6 2.2 | 2.9 3.0 3.7 3.0 2.8
DS13 Manage operations. | 3.0 3.5 3.4 3.2 2.9 | 2.5 3.2 2.0 2.6 2.0 | 2.7 3.2 2.7 3.0 2.4
ME1 Monitor and evaluate IT performance. | 2.3 3.2 1.5 2.6 2.4 | 2.0 2.3 1.2 2.2 2.2 | 2.0 2.7 1.6 2.3 2.1
ME2 Monitor and evaluate internal control. | 2.0 2.8 2.7 2.5 2.6 | 1.5 2.0 2.1 2.3 2.2 | 1.6 2.2 2.0 2.5 2.4
ME3 Ensure compliance with external requirements. | 1.9 2.6 2.7 3.1 2.1 | 1.3 2.1 2.4 2.8 2.1 | 1.5 2.0 2.4 2.8 1.9 | Yes
ME4 Provide IT governance. | 2.1 3.2 2.4 2.8 2.8 | 1.2 2.7 2.1 2.3 2.2 | 1.4 2.5 2.2 2.6 2.3 | Yes
Overall | 2.6 3.2 2.8 2.8 2.7 | 2.0 2.8 1.9 2.3 2.1 | 2.2 2.7 2.6 2.6 2.5
Statistically significant | No | No | No

Figure 97—Maturity Level by Relative Size of IT Organisation
Columns within each attribute: Low, High. A trailing Yes marks a statistically significant process.

Processes | Awareness | Policies | Technology | Skills | Responsibility | Goals | Overall
PO1 Define a strategic IT plan. | 2.6 3.3 | 1.9 2.7 | 1.5 2.0 | 1.9 2.7 | 2.3 3.1 | 2.1 2.7 | 2.1 2.6 | Yes
PO2A Define the information architecture—Architecture. | 2.4 3.1 | 1.8 2.7 | 1.9 2.1 | 2.0 2.7 | 2.0 2.8 | 1.6 1.9 | 1.9 2.4 | Yes
PO2D Define the information architecture—Data classification. | 1.9 2.8 | 1.7 2.4 | 1.9 2.2 | 2.0 2.4 | 1.9 2.5 | 1.4 1.8 | 1.7 2.3 | Yes
PO3 Determine technological direction. | 2.6 3.2 | 2.1 2.8 | 1.5 2.0 | 2.1 2.9 | 2.0 3.2 | 1.7 2.4 | 2.0 2.7 | Yes
PO4O Define the IT processes—Organisation. | 2.6 3.3 | 2.2 2.9 | 1.7 2.2 | 2.2 2.8 | 2.6 3.1 | 1.8 2.6 | 2.3 2.8 | Yes
PO4P Define the IT processes—Processes. | 2.3 3.4 | 2.0 2.8 | 1.2 2.2 | 1.9 2.7 | 2.2 3.1 | 1.7 2.5 | 1.9 2.7 | Yes
PO5B Manage the IT investment—Budgeting. | 3.1 3.7 | 2.7 3.4 | 2.5 3.1 | 2.6 3.1 | 2.7 3.6 | 2.3 3.1 | 2.7 3.3 | Yes
PO5V Manage the IT investment—Value management. | 2.0 2.5 | 1.8 2.1 | 1.2 1.7 | 1.7 2.1 | 1.9 2.3 | 1.7 2.2 | 1.7 2.1


Figure 97—Maturity Level by Relative Size of IT Organisation (cont.)
Columns within each attribute: Low, High. A trailing Yes marks a statistically significant process.

Processes | Awareness | Policies | Technology | Skills | Responsibility | Goals | Overall
PO6 Communicate management aims and direction. | 2.2 3.3 | 2.1 2.9 | 1.8 2.5 | 2.2 2.5 | 2.1 2.9 | 1.6 2.3 | 2.1 2.7 | Yes
PO7 Manage IT human resources. | 2.8 3.4 | 2.7 3.3 | 2.3 2.7 | 2.5 2.9 | 2.7 3.3 | 2.1 2.7 | 2.5 3.1 | Yes
PO8 Manage quality. | 2.3 2.5 | 2.0 2.5 | 1.8 2.0 | 2.1 2.4 | 2.0 2.7 | 1.9 1.9 | 2.1 2.2
PO9 Assess and manage IT risks. | 2.3 3.0 | 2.0 2.4 | 1.6 1.9 | 2.1 2.4 | 2.2 2.6 | 1.6 2.1 | 1.9 2.3
PO10PG Manage projects—Programme. | 2.2 3.1 | 1.9 2.8 | 1.8 2.4 | 2.1 2.5 | 2.2 3.1 | 1.8 2.9 | 2.1 2.8 | Yes
PO10PJ Manage projects—Projects. | 3.0 3.5 | 2.9 3.2 | 2.6 2.7 | 2.6 2.8 | 2.7 3.3 | 2.3 2.7 | 2.7 3.0
AI1 Identify automated solutions. | 2.7 3.1 | 2.4 2.9 | 1.9 2.2 | 2.3 2.5 | 2.4 2.9 | 2.0 2.4 | 2.3 2.6
AI2 Acquire and maintain application software. | 3.1 3.5 | 2.8 3.2 | 2.2 2.8 | 2.5 3.2 | 2.9 3.4 | 2.2 2.8 | 2.6 3.1 | Yes
AI3 Acquire and maintain technology infrastructure. | 2.7 3.6 | 2.4 3.2 | 1.9 2.6 | 2.4 2.8 | 2.7 3.3 | 1.9 2.7 | 2.4 2.9 | Yes
AI4 Enable operation and use. | 2.5 3.2 | 2.3 2.9 | 1.7 2.5 | 2.2 2.8 | 2.2 3.1 | 1.8 2.4 | 2.1 2.7 | Yes
AI5 Procure IT resources. | 3.2 3.5 | 3.0 3.3 | 2.4 2.7 | 2.6 2.7 | 3.0 3.2 | 2.0 2.4 | 2.7 3.0
AI6 Manage changes. | 3.0 3.2 | 2.7 3.1 | 2.2 2.8 | 2.4 2.5 | 2.6 2.9 | 1.9 2.2 | 2.4 2.8
AI7 Install and accredit solutions and changes. | 2.9 3.3 | 2.6 3.0 | 2.2 2.5 | 2.6 2.7 | 2.8 3.1 | 2.1 2.3 | 2.5 2.7
DS1 Define and manage service levels. | 2.4 2.7 | 1.9 2.3 | 1.6 2.1 | 1.8 2.4 | 2.1 2.3 | 1.8 2.0 | 1.9 2.2
DS2 Manage third-party services. | 2.8 3.1 | 2.6 2.7 | 2.0 2.2 | 2.5 2.6 | 2.7 3.1 | 2.0 2.4 | 2.3 2.6
DS3 Manage performance and capacity. | 2.6 3.0 | 2.0 2.6 | 2.1 2.7 | 2.3 2.8 | 2.5 3.1 | 1.8 2.6 | 2.1 2.6 | Yes
DS4 Ensure continuous service. | 2.6 3.3 | 2.1 3.2 | 1.8 2.9 | 2.0 3.0 | 2.2 3.1 | 1.8 2.8 | 2.0 3.0 | Yes
DS5NF Ensure systems security—Network and firewall. | 2.9 3.7 | 2.5 3.4 | 2.6 3.4 | 2.6 3.4 | 2.7 3.5 | 1.9 3.0 | 2.5 3.2 | Yes
DS5P Ensure systems security—Policy. | 2.6 3.4 | 2.6 3.2 | 1.8 2.8 | 2.2 3.2 | 2.7 3.3 | 1.9 2.8 | 2.3 3.0 | Yes
DS5U Ensure systems security—User access. | 2.8 3.5 | 2.6 3.2 | 2.3 3.0 | 2.4 3.0 | 2.6 3.5 | 2.0 2.7 | 2.4 3.1 | Yes
DS5V Ensure systems security—Virus. | 3.0 3.8 | 2.7 3.6 | 3.0 3.8 | 2.6 3.5 | 2.7 3.7 | 2.0 3.2 | 2.6 3.5 | Yes
DS6 Identify and allocate costs. | 2.5 3.4 | 2.3 2.8 | 2.1 2.8 | 2.4 2.8 | 2.6 3.1 | 1.7 2.4 | 2.4 2.8
DS7 Educate and train users. | 2.6 2.8 | 2.5 2.6 | 2.1 2.4 | 2.5 2.7 | 2.5 2.9 | 2.2 2.4 | 2.3 2.6
DS8 Manage service desk and incidents. | 2.9 3.5 | 2.4 3.1 | 2.6 3.4 | 2.5 3.1 | 2.6 3.5 | 2.2 3.0 | 2.6 3.3 | Yes
DS9 Manage the configuration. | 2.4 2.8 | 2.1 2.6 | 1.9 2.6 | 2.2 2.6 | 2.4 2.8 | 1.6 2.0 | 2.1 2.5
DS10 Manage problems. | 2.8 2.8 | 2.5 2.5 | 2.6 2.5 | 2.6 2.5 | 2.8 2.7 | 2.2 2.1 | 2.5 2.5
DS11 Manage data. | 2.9 3.5 | 2.7 3.4 | 2.6 3.5 | 2.6 3.0 | 2.8 3.5 | 2.1 2.8 | 2.6 3.2 | Yes
DS12 Manage the physical environment. | 3.2 3.9 | 2.8 3.6 | 2.6 3.5 | 2.6 3.5 | 3.0 3.7 | 2.0 3.0 | 2.7 3.4 | Yes
DS13 Manage operations. | 2.8 3.5 | 2.6 3.2 | 2.4 2.9 | 2.5 2.8 | 2.9 3.4 | 2.1 2.7 | 2.6 3.0 | Yes


Figure 97—Maturity Level by Relative Size of IT Organisation (cont.)
Columns within each attribute: Low, High.

Processes | Awareness | Policies | Technology | Skills | Responsibility | Goals | Overall
ME1 Monitor and evaluate IT performance. | 2.4 2.7 | 2.0 2.1 | 1.4 2.0 | 1.9 2.3 | 2.1 2.6 | 1.9 2.0 | 2.0 2.2
ME2 Monitor and evaluate internal control. | 2.3 2.8 | 2.2 2.5 | 1.8 2.1 | 2.0 2.2 | 2.3 2.5 | 1.8 2.1 | 1.9 2.3
ME3 Ensure compliance with external requirements. | 2.5 2.7 | 2.0 2.6 | 1.5 2.1 | 1.9 2.3 | 2.4 2.5 | 2.1 2.3 | 2.0 2.2
ME4 Provide IT governance. | 2.3 3.0 | 2.2 2.7 | 1.5 2.0 | 2.1 2.5 | 2.4 2.8 | 1.8 2.3 | 2.0 2.4
Overall | 2.6 3.2 | 2.3 2.9 | 2.0 2.5 | 2.3 2.7 | 2.5 3.0 | 1.9 2.5 | 2.3 2.7
Statistically significant | Yes | Yes | Yes | Yes | Yes | Yes | Yes
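The statistical significance flags in figures 96 through 100 and 102 compare the mean maturity of two respondent groups (here, Low vs. High relative IT size) for each attribute. This appendix does not restate the study's test procedure, so the following is only a minimal illustrative sketch, assuming a two-sample Welch t-test at the 5 per cent level applied to the underlying per-site ratings; the function and the sample data are hypothetical and are not drawn from the study.

```python
from scipy import stats  # assumed available; any two-sample test library would do


def significance_flag(low_ratings, high_ratings, alpha=0.05):
    """Return 'Yes' if the Low/High difference in mean maturity is significant.

    low_ratings, high_ratings: per-site maturity ratings (0-5 scale) for one
    attribute. Welch's t-test is an assumption made for illustration; the
    study's actual procedure is not stated in this appendix.
    """
    _, p_value = stats.ttest_ind(low_ratings, high_ratings, equal_var=False)
    return "Yes" if p_value < alpha else "No"


# Hypothetical example: Awareness ratings for small vs. large IT organisations
print(significance_flag([2.0, 2.5, 3.0, 2.5, 2.0, 3.5],
                        [3.0, 3.5, 3.0, 4.0, 3.5, 3.0]))
```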

Figure 98—Maturity Level for Attributes by Annual Expenditure in IT as a Percent of Revenue (cont.)
Columns within each attribute: Low, High. A trailing Yes marks a statistically significant process.

Processes | Awareness | Policies | Technology | Skills | Responsibility | Goals | Overall
PO1 Define a strategic IT plan. | 3.1 2.7 | 2.7 2.1 | 2.1 1.6 | 2.6 2.0 | 2.9 2.6 | 2.8 2.3 | 2.7 2.2
PO2A Define the information architecture—Architecture. | 2.9 2.6 | 2.5 1.9 | 2.2 2.0 | 2.6 2.1 | 2.8 2.3 | 2.1 1.7 | 2.5 2.1
PO2D Define the information architecture—Data classification. | 2.8 2.2 | 2.5 1.9 | 2.5 2.0 | 2.5 2.1 | 2.4 2.3 | 2.1 1.6 | 2.5 2.0
PO3 Determine technological direction. | 3.2 2.8 | 2.8 2.4 | 2.2 1.6 | 2.7 2.5 | 3.0 2.4 | 2.4 2.1 | 2.7 2.3
PO4O Define the IT processes—Organisation. | 3.2 2.8 | 3.1 2.6 | 2.3 1.7 | 2.6 2.4 | 3.1 2.6 | 2.8 2.1 | 2.8 2.4
PO4P Define the IT processes—Processes. | 3.4 2.7 | 3.1 2.4 | 2.4 1.5 | 2.6 2.2 | 3.3 2.7 | 2.7 2.2 | 2.9 2.3
PO5B Manage the IT investment—Budgeting. | 3.5 3.3 | 3.3 2.9 | 3.0 2.6 | 2.8 2.7 | 3.3 3.2 | 3.0 2.6 | 3.1 2.9
PO5V Manage the IT investment—Value management. | 2.6 2.0 | 2.6 1.5 | 2.1 1.1 | 2.3 1.7 | 2.4 1.9 | 2.6 1.7 | 2.4 1.7 | Yes
PO6 Communicate management aims and direction. | 3.1 2.5 | 2.5 2.6 | 2.3 1.9 | 2.5 2.3 | 2.7 2.5 | 2.2 1.9 | 2.6 2.3
PO7 Manage IT human resources. | 3.3 2.9 | 3.1 2.8 | 2.6 2.5 | 2.7 2.7 | 3.2 2.9 | 2.8 2.3 | 2.9 2.7
PO8 Manage quality. | 2.5 2.4 | 2.7 2.2 | 2.2 1.7 | 2.3 2.2 | 2.6 2.3 | 2.0 2.0 | 2.4 2.1
PO9 Assess and manage IT risks. | 2.8 2.2 | 2.4 2.1 | 2.1 1.4 | 2.4 2.0 | 2.6 2.1 | 2.1 1.6 | 2.4 1.9
PO10PG Manage projects—Programme. | 3.1 2.3 | 2.8 1.8 | 2.5 1.6 | 2.7 1.8 | 3.1 2.1 | 2.8 1.9 | 2.8 1.9 | Yes
PO10PJ Manage projects—Projects. | 3.3 3.1 | 3.2 2.8 | 2.6 2.6 | 2.9 2.5 | 3.2 2.8 | 3.1 2.3 | 3.0 2.7
AI1 Identify automated solutions. | 3.2 2.7 | 2.7 2.6 | 2.3 1.9 | 2.5 2.3 | 2.7 2.6 | 2.4 2.1 | 2.6 2.4


Figure 98—Maturity Level for Attributes by Annual Expenditure in IT as a Percent of Revenue (cont.)
Columns within each attribute: Low, High. A trailing Yes marks a statistically significant process.

Processes | Awareness | Policies | Technology | Skills | Responsibility | Goals | Overall
AI2 Acquire and maintain application software. | 3.6 3.2 | 3.2 2.9 | 2.7 2.3 | 3.0 2.7 | 3.0 3.2 | 2.8 2.6 | 3.1 2.8
AI3 Acquire and maintain technology infrastructure. | 3.4 3.2 | 2.9 2.9 | 2.2 2.3 | 2.6 2.7 | 2.8 3.1 | 2.3 2.4 | 2.7 2.8
AI4 Enable operation and use. | 3.3 2.8 | 3.1 2.5 | 2.6 1.9 | 2.9 2.3 | 3.1 2.5 | 2.6 1.8 | 2.9 2.3
AI5 Procure IT resources. | 3.3 3.4 | 3.2 3.1 | 2.6 2.5 | 2.5 2.7 | 3.0 3.1 | 2.5 2.0 | 2.8 2.8
AI6 Manage changes. | 3.1 3.3 | 3.0 3.3 | 3.0 2.6 | 2.6 2.6 | 2.8 3.1 | 2.3 2.2 | 2.8 2.9
AI7 Install and accredit solutions and changes. | 3.3 3.1 | 3.0 2.9 | 2.7 2.2 | 2.6 2.8 | 2.8 3.1 | 2.5 2.4 | 2.8 2.7
DS1 Define and manage service levels. | 2.6 2.5 | 2.3 2.1 | 1.9 1.9 | 2.2 2.1 | 2.3 2.2 | 2.4 1.8 | 2.3 2.1
DS2 Manage third-party services. | 2.7 3.0 | 2.4 2.8 | 1.8 2.1 | 2.1 2.5 | 2.5 3.0 | 2.1 2.2 | 2.2 2.6
DS3 Manage performance and capacity. | 2.9 2.8 | 2.4 2.4 | 2.6 2.3 | 2.9 2.5 | 3.1 2.8 | 2.7 2.1 | 2.8 2.5
DS4 Ensure continuous service. | 2.7 3.3 | 2.6 3.1 | 2.3 2.7 | 2.8 2.7 | 2.8 3.0 | 2.4 2.4 | 2.6 2.9
DS5NF Ensure systems security—Network and firewall. | 3.7 3.1 | 3.4 2.9 | 3.7 2.8 | 3.7 2.7 | 3.5 3.0 | 2.9 2.3 | 3.5 2.8 | Yes
DS5P Ensure systems security—Policy. | 3.3 3.0 | 3.4 2.9 | 3.0 2.0 | 3.2 2.6 | 3.4 2.9 | 3.0 2.2 | 3.2 2.6 | Yes
DS5U Ensure systems security—User access. | 3.2 3.0 | 3.3 2.9 | 3.1 2.3 | 3.2 2.5 | 3.3 3.0 | 3.0 2.2 | 3.2 2.6
DS5V Ensure systems security—Virus. | 3.5 3.2 | 3.6 3.1 | 3.8 3.0 | 3.4 2.9 | 3.5 3.1 | 3.2 2.4 | 3.5 3.0
DS6 Identify and allocate costs. | 2.9 2.7 | 2.6 2.4 | 2.3 2.2 | 2.5 2.5 | 2.6 2.9 | 2.2 1.9 | 2.5 2.4
DS7 Educate and train users. | 2.5 2.7 | 2.3 2.4 | 2.5 2.1 | 2.7 2.6 | 2.5 2.6 | 2.3 2.1 | 2.5 2.4
DS8 Manage service desk and incidents. | 3.2 3.4 | 3.0 2.9 | 3.1 3.1 | 2.9 3.0 | 3.0 3.3 | 2.9 2.9 | 3.0 3.1
DS9 Manage the configuration. | 2.6 2.5 | 2.7 2.3 | 2.8 1.9 | 2.8 2.3 | 3.0 2.5 | 2.0 1.9 | 2.6 2.2
DS10 Manage problems. | 2.8 2.9 | 2.5 2.8 | 2.5 2.8 | 2.5 2.7 | 2.7 3.1 | 2.2 2.4 | 2.5 2.8
DS11 Manage data. | 3.5 3.0 | 3.5 2.9 | 3.4 2.9 | 3.2 2.6 | 3.4 3.0 | 2.8 2.3 | 3.3 2.8
DS12 Manage the physical environment. | 3.7 3.6 | 3.6 3.1 | 3.2 3.0 | 3.3 2.9 | 3.7 3.2 | 2.7 2.5 | 3.3 3.0
DS13 Manage operations. | 3.2 3.0 | 3.0 2.8 | 2.5 2.6 | 2.7 2.7 | 3.3 3.1 | 2.6 2.4 | 2.9 2.8
ME1 Monitor and evaluate IT performance. | 2.6 2.4 | 2.3 2.0 | 1.9 1.6 | 2.1 1.9 | 2.5 2.2 | 2.4 1.8 | 2.3 2.0
ME2 Monitor and evaluate internal control. | 2.7 2.5 | 2.5 2.4 | 2.0 1.9 | 2.1 2.2 | 2.5 2.3 | 2.0 1.9 | 2.3 2.2
ME3 Ensure compliance with external requirements. | 2.3 2.9 | 2.3 2.4 | 1.9 1.8 | 2.0 2.2 | 2.4 2.7 | 2.0 2.3 | 2.1 2.4
ME4 Provide IT governance. | 2.7 2.6 | 2.6 2.5 | 1.9 1.6 | 2.1 2.3 | 2.7 2.7 | 2.4 1.9 | 2.4 2.3
Overall | 3.0 2.9 | 2.8 2.6 | 2.5 2.2 | 2.7 2.4 | 2.9 2.7 | 2.5 2.1 | 2.7 2.5
Statistically significant | No | No | No | No | No | No | No


Figure 99—Maturity Level for Alignment Between Business and IT Goals by Attribute (cont.)
Columns within each attribute: Low, High. A trailing Yes marks a statistically significant process.

Processes | Awareness | Policies | Technology | Skills | Responsibility | Goals | Overall
PO1 Define a strategic IT plan. | 2.6 3.1 | 1.9 2.6 | 1.5 1.9 | 2.1 2.5 | 2.4 2.9 | 2.0 2.7 | 2.1 2.6 | Yes
PO2A Define the information architecture—Architecture. | 2.4 3.1 | 1.8 2.5 | 1.7 2.2 | 2.2 2.4 | 2.0 2.7 | 1.4 2.0 | 1.8 2.5 | Yes
PO2D Define the information architecture—Data classification. | 1.8 2.9 | 1.6 2.5 | 1.6 2.4 | 1.9 2.5 | 1.9 2.5 | 1.3 1.9 | 1.6 2.4 | Yes
PO3 Determine technological direction. | 2.4 3.3 | 1.9 2.8 | 1.3 2.0 | 2.1 2.8 | 2.0 3.0 | 1.6 2.5 | 1.9 2.7 | Yes
PO4O Define the IT processes—Organisation. | 2.5 3.3 | 1.8 3.2 | 1.5 2.3 | 2.1 2.9 | 2.5 3.1 | 1.8 2.6 | 2.2 2.9 | Yes
PO4P Define the IT processes—Processes. | 2.6 3.1 | 1.9 2.8 | 1.3 2.0 | 1.9 2.6 | 2.0 3.2 | 1.6 2.6 | 1.9 2.7 | Yes
PO5B Manage the IT investment—Budgeting. | 3.3 3.6 | 2.7 3.4 | 2.8 2.9 | 2.6 3.1 | 2.8 3.5 | 2.4 3.0 | 2.9 3.3
PO5V Manage the IT investment—Value management. | 1.9 2.5 | 1.7 2.1 | 1.2 1.7 | 1.4 2.2 | 1.5 2.5 | 1.5 2.2 | 1.5 2.2 | Yes
PO6 Communicate management aims and direction. | 2.2 3.2 | 1.9 3.0 | 1.8 2.4 | 1.9 2.7 | 1.9 3.0 | 1.5 2.3 | 2.0 2.8 | Yes
PO7 Manage IT human resources. | 2.8 3.4 | 2.6 3.3 | 2.1 2.8 | 2.2 3.0 | 2.6 3.3 | 1.9 2.8 | 2.4 3.1 | Yes
PO8 Manage quality. | 1.8 2.9 | 1.5 2.9 | 1.4 2.3 | 1.8 2.6 | 1.7 2.8 | 1.3 2.4 | 1.6 2.7 | Yes
PO9 Assess and manage IT risks. | 2.5 2.7 | 1.8 2.5 | 1.7 1.7 | 2.0 2.4 | 2.2 2.5 | 1.6 2.0 | 1.9 2.3
PO10PG Manage projects—Programme. | 2.1 3.0 | 2.0 2.5 | 1.9 2.2 | 2.1 2.4 | 2.2 2.8 | 2.0 2.5 | 2.2 2.6
PO10PJ Manage projects—Projects. | 2.9 3.5 | 2.7 3.3 | 2.2 3.0 | 2.5 2.9 | 2.6 3.4 | 2.2 2.9 | 2.6 3.2 | Yes
AI1 Identify automated solutions. | 2.7 3.1 | 2.1 3.1 | 1.7 2.3 | 2.1 2.7 | 2.3 3.0 | 1.8 2.5 | 2.1 2.8 | Yes
AI2 Acquire and maintain application software. | 2.9 3.5 | 2.7 3.3 | 2.3 2.6 | 2.6 3.1 | 2.9 3.4 | 2.0 3.0 | 2.5 3.2 | Yes
AI3 Acquire and maintain technology infrastructure. | 2.8 3.4 | 2.4 3.1 | 2.0 2.4 | 2.3 2.8 | 2.5 3.4 | 1.8 2.7 | 2.3 3.0 | Yes
AI4 Enable operation and use. | 2.6 3.1 | 2.3 2.9 | 1.9 2.4 | 2.3 2.7 | 2.3 3.1 | 1.9 2.3 | 2.1 2.7 | Yes
AI5 Procure IT resources. | 2.8 3.7 | 2.6 3.5 | 2.2 2.8 | 2.3 2.9 | 2.7 3.4 | 1.7 2.6 | 2.5 3.2 | Yes
AI6 Manage changes. | 2.7 3.4 | 2.3 3.4 | 2.0 3.0 | 2.1 2.8 | 2.2 3.2 | 1.4 2.6 | 2.2 3.1 | Yes
AI7 Install and accredit solutions and changes. | 2.7 3.4 | 2.3 3.2 | 2.0 2.6 | 2.1 3.0 | 2.4 3.3 | 1.6 2.7 | 2.1 3.0 | Yes
DS1 Define and manage service levels. | 2.1 2.9 | 1.6 2.5 | 1.4 2.2 | 1.6 2.5 | 1.7 2.6 | 1.2 2.5 | 1.6 2.6 | Yes
DS2 Manage third-party services. | 2.6 3.2 | 2.1 3.0 | 1.8 2.3 | 2.3 2.7 | 2.5 3.2 | 2.0 2.4 | 2.0 2.8 | Yes
DS3 Manage performance and capacity. | 2.5 3.0 | 1.9 2.6 | 2.1 2.6 | 2.2 2.8 | 2.3 3.2 | 1.8 2.5 | 2.0 2.8 | Yes
DS4 Ensure continuous service. | 2.4 3.4 | 2.0 3.1 | 1.8 2.8 | 1.9 3.1 | 2.1 3.2 | 1.7 2.8 | 1.9 3.1 | Yes
DS5NF Ensure systems security—Network and firewall. | 3.0 3.5 | 2.5 3.2 | 2.7 3.3 | 2.7 3.2 | 2.8 3.3 | 2.1 2.7 | 2.5 3.2 | Yes
DS5P Ensure systems security—Policy. | 2.8 3.1 | 2.5 3.2 | 1.9 2.6 | 2.3 3.0 | 2.7 3.3 | 2.0 2.6 | 2.3 3.0 | Yes
DS5U Ensure systems security—User access. | 2.9 3.3 | 2.4 3.3 | 2.4 2.8 | 2.2 3.1 | 2.7 3.4 | 1.7 2.8 | 2.4 3.1 | Yes
DS5V Ensure systems security—Virus. | 3.1 3.6 | 2.6 3.6 | 3.3 3.5 | 2.5 3.4 | 2.7 3.5 | 2.0 3.0 | 2.7 3.4 | Yes
DS6 Identify and allocate costs. | 2.7 3.2 | 2.3 2.8 | 2.3 2.6 | 2.3 2.9 | 2.4 3.2 | 1.7 2.5 | 2.4 2.9
DS7 Educate and train users. | 2.2 3.1 | 2.0 3.0 | 1.8 2.6 | 2.1 3.1 | 2.1 3.2 | 1.8 2.7 | 2.0 3.0 | Yes


Figure 99—Maturity Level for Alignment Between Business and IT Goals by Attribute (cont.)
Columns within each attribute: Low, High. A trailing Yes marks a statistically significant process.

Processes | Awareness | Policies | Technology | Skills | Responsibility | Goals | Overall
DS8 Manage service desk and incidents. | 3.0 3.4 | 2.6 3.0 | 2.8 3.2 | 2.6 3.0 | 2.8 3.4 | 2.2 3.1 | 2.7 3.2 | Yes
DS9 Manage the configuration. | 2.3 2.9 | 2.0 2.6 | 2.0 2.4 | 2.0 2.8 | 2.2 3.0 | 1.4 2.2 | 2.0 2.7 | Yes
DS10 Manage problems. | 2.5 3.0 | 2.0 3.0 | 2.0 2.9 | 2.1 2.9 | 2.2 3.2 | 1.5 2.7 | 2.0 2.9 | Yes
DS11 Manage data. | 2.7 3.5 | 2.5 3.5 | 2.4 3.5 | 2.2 3.2 | 2.7 3.5 | 2.0 2.8 | 2.4 3.3 | Yes
DS12 Manage the physical environment. | 3.4 3.7 | 2.9 3.4 | 2.7 3.3 | 2.8 3.2 | 3.1 3.5 | 2.1 2.8 | 2.8 3.3 | Yes
DS13 Manage operations. | 2.9 3.3 | 2.5 3.1 | 2.5 2.8 | 2.3 2.9 | 2.9 3.3 | 2.0 2.7 | 2.5 3.0 | Yes
ME1 Monitor and evaluate IT performance. | 2.2 2.9 | 1.5 2.5 | 1.4 2.0 | 1.7 2.4 | 1.7 2.8 | 1.4 2.4 | 1.7 2.5 | Yes
ME2 Monitor and evaluate internal control. | 2.3 2.7 | 1.9 2.7 | 1.7 2.1 | 1.8 2.4 | 2.2 2.6 | 1.8 2.1 | 1.7 2.5 | Yes
ME3 Ensure compliance with external requirements. | 2.2 3.0 | 2.0 2.6 | 1.5 2.0 | 1.9 2.3 | 2.0 2.8 | 1.8 2.4 | 1.7 2.5 | Yes
ME4 Provide IT governance. | 2.0 3.1 | 1.9 2.8 | 1.5 2.0 | 1.7 2.7 | 1.9 3.1 | 1.5 2.5 | 1.7 2.7 | Yes
Overall | 2.6 3.2 | 2.2 3.0 | 2.0 2.5 | 2.1 2.8 | 2.3 3.1 | 1.8 2.6 | 2.1 2.9
Statistically significant | Yes | Yes | Yes | Yes | Yes | Yes | Yes

Figure 100—Maturity Level for Outsourcing Activities (cont.)
Columns within each attribute: Low, High. A trailing Yes marks a statistically significant process.

Processes | Awareness | Policies | Technology | Skills | Responsibility | Goals | Overall
PO1 Define a strategic IT plan. | 3.0 2.8 | 2.3 2.2 | 1.9 1.6 | 2.4 2.2 | 2.6 2.7 | 2.3 2.4 | 2.4 2.4
PO2A Define the information architecture—Architecture. | 3.1 2.5 | 2.2 2.2 | 2.0 2.0 | 2.5 2.2 | 2.5 2.3 | 2.1 1.6 | 2.1 2.1
PO2D Define the information architecture—Data classification. | 2.3 2.4 | 1.9 2.2 | 2.0 2.0 | 2.1 2.2 | 2.3 2.2 | 2.0 1.4 | 1.8 2.1
PO3 Determine technological direction. | 3.1 2.7 | 2.4 2.4 | 1.7 1.7 | 2.6 2.5 | 2.7 2.5 | 2.2 2.0 | 2.4 2.3
PO4O Define the IT processes—Organisation. | 3.2 2.8 | 2.6 2.6 | 2.4 1.7 | 2.7 2.4 | 3.2 2.6 | 2.4 2.1 | 2.8 2.4
PO4P Define the IT processes—Processes. | 3.2 2.7 | 2.7 2.3 | 2.4 1.3 | 2.6 2.1 | 2.9 2.6 | 2.5 2.0 | 2.5 2.2
PO5B Manage the IT investment—Budgeting. | 3.9 3.3 | 3.2 3.0 | 3.3 2.7 | 3.3 2.7 | 3.5 3.1 | 3.4 2.5 | 3.4 2.8 | Yes
PO5V Manage the IT investment—Value management. | 2.1 2.4 | 1.9 2.0 | 1.8 1.4 | 1.9 1.9 | 2.3 2.1 | 2.3 1.9 | 1.9 1.9
PO6 Communicate management aims and direction. | 2.8 2.7 | 2.6 2.4 | 2.4 2.0 | 2.5 2.2 | 2.6 2.5 | 2.1 1.9 | 2.5 2.3
PO7 Manage IT human resources. | 3.1 3.1 | 2.9 3.1 | 2.3 2.7 | 2.7 2.7 | 3.1 3.0 | 2.5 2.4 | 2.7 2.8
PO8 Manage quality. | 2.4 2.5 | 2.1 2.4 | 1.6 2.1 | 2.2 2.3 | 2.1 2.5 | 1.7 2.0 | 2.0 2.3
PO9 Assess and manage IT risks. | 2.7 2.6 | 2.2 2.2 | 1.5 1.9 | 2.4 2.1 | 2.3 2.4 | 1.9 1.8 | 2.1 2.2
PO10PG Manage projects—Programme. | 3.0 2.5 | 2.6 2.2 | 2.3 2.0 | 2.3 2.3 | 2.8 2.5 | 2.3 2.3 | 2.5 2.4
PO10PJ Manage projects—Projects. | 3.6 3.0 | 3.3 2.8 | 2.9 2.4 | 2.9 2.6 | 3.2 3.0 | 2.6 2.5 | 3.0 2.8
AI1 Identify automated solutions. | 3.2 2.7 | 3.1 2.3 | 2.1 2.1 | 2.6 2.3 | 2.8 2.5 | 2.4 2.1 | 2.6 2.4


Figure 100—Maturity Level for Outsourcing Activities (cont.)
Columns within each attribute: Low, High. A trailing Yes marks a statistically significant process.

Processes | Awareness | Policies | Technology | Skills | Responsibility | Goals | Overall
AI2 Acquire and maintain application software. | 3.4 3.2 | 3.2 2.9 | 2.6 2.5 | 2.9 2.8 | 3.3 3.1 | 2.7 2.5 | 2.8 2.8
AI3 Acquire and maintain technology infrastructure. | 2.8 3.4 | 2.4 3.0 | 2.0 2.4 | 2.5 2.6 | 2.9 3.0 | 2.2 2.4 | 2.4 2.8 | Yes
AI4 Enable operation and use. | 2.8 2.9 | 2.3 2.8 | 2.1 2.1 | 2.6 2.5 | 2.6 2.8 | 2.1 2.1 | 2.4 2.5
AI5 Procure IT resources. | 3.3 3.4 | 3.2 3.1 | 2.3 2.7 | 2.8 2.6 | 2.9 3.2 | 2.4 2.2 | 2.9 2.9
AI6 Manage changes. | 3.2 3.0 | 3.0 2.8 | 2.7 2.4 | 2.5 2.4 | 2.8 2.7 | 2.3 1.9 | 2.7 2.5
AI7 Install and accredit solutions and changes. | 3.2 3.0 | 2.7 2.8 | 2.6 2.2 | 2.9 2.5 | 3.0 2.9 | 2.5 2.0 | 2.6 2.6
DS1 Define and manage service levels. | 2.6 2.5 | 2.1 2.1 | 2.0 1.8 | 2.4 1.9 | 2.2 2.2 | 2.2 1.8 | 2.2 2.0
DS2 Manage third-party services. | 3.0 2.9 | 2.6 2.6 | 2.0 2.1 | 2.6 2.5 | 2.8 2.9 | 2.1 2.3 | 2.4 2.5
DS3 Manage performance and capacity. | 3.2 2.6 | 2.4 2.2 | 2.6 2.2 | 2.8 2.4 | 3.0 2.7 | 2.7 1.9 | 2.6 2.2
DS4 Ensure continuous service. | 3.3 2.7 | 2.6 2.6 | 2.4 2.3 | 2.8 2.4 | 2.8 2.7 | 2.5 2.2 | 2.6 2.4
DS5NF Ensure systems security—Network and firewall. | 3.2 3.4 | 2.6 3.1 | 2.7 3.2 | 2.8 3.1 | 3.0 3.2 | 2.3 2.6 | 2.6 3.0
DS5P Ensure systems security—Policy. | 2.9 3.1 | 2.7 3.0 | 2.1 2.4 | 2.6 2.7 | 2.8 3.1 | 2.3 2.3 | 2.5 2.8
DS5U Ensure systems security—User access. | 3.1 3.1 | 2.7 3.0 | 2.5 2.7 | 2.8 2.7 | 3.0 3.1 | 2.2 2.4 | 2.7 2.8
DS5V Ensure systems security—Virus. | 3.2 3.5 | 2.8 3.4 | 3.2 3.5 | 2.7 3.2 | 2.8 3.4 | 2.3 2.8 | 2.8 3.2
DS6 Identify and allocate costs. | 3.5 2.7 | 3.1 2.2 | 3.0 2.1 | 3.2 2.3 | 3.4 2.5 | 2.6 1.9 | 3.1 2.3 | Yes
DS7 Educate and train users. | 2.7 2.7 | 2.6 2.5 | 2.4 2.2 | 3.0 2.4 | 2.9 2.5 | 2.6 2.1 | 2.5 2.4
DS8 Manage service desk and incidents. | 3.1 3.3 | 2.7 2.8 | 3.0 3.0 | 2.7 2.9 | 3.0 3.2 | 2.5 2.7 | 2.8 3.0
DS9 Manage the configuration. | 2.8 2.4 | 2.3 2.3 | 2.3 2.2 | 2.6 2.3 | 2.6 2.6 | 2.0 1.7 | 2.3 2.3
DS10 Manage problems. | 2.7 2.9 | 2.5 2.6 | 2.6 2.5 | 2.5 2.6 | 2.7 2.8 | 2.2 2.2 | 2.4 2.6
DS11 Manage data. | 3.2 3.1 | 3.2 3.0 | 3.1 3.0 | 3.0 2.7 | 3.3 3.0 | 2.5 2.4 | 2.9 2.8
DS12 Manage the physical environment. | 3.5 3.6 | 2.9 3.3 | 2.9 3.2 | 3.1 3.0 | 3.2 3.4 | 2.6 2.5 | 3.0 3.2
DS13 Manage operations. | 3.3 3.1 | 2.9 2.9 | 3.0 2.5 | 2.9 2.5 | 3.3 3.1 | 2.7 2.3 | 2.9 2.7
ME1 Monitor and evaluate IT performance. | 2.6 2.5 | 2.0 2.2 | 1.7 1.7 | 2.2 2.0 | 2.1 2.5 | 1.9 2.0 | 2.2 2.1
ME2 Monitor and evaluate internal control. | 2.4 2.7 | 2.2 2.5 | 1.9 2.0 | 2.2 2.1 | 2.2 2.5 | 1.7 2.1 | 2.0 2.2
ME3 Ensure compliance with external requirements. | 2.5 2.7 | 2.3 2.4 | 1.9 1.8 | 2.2 2.1 | 2.3 2.6 | 2.0 2.2 | 2.1 2.2
ME4 Provide IT governance. | 2.4 2.7 | 2.3 2.5 | 1.6 1.9 | 2.4 2.2 | 2.4 2.7 | 2.0 2.1 | 2.0 2.3
Average rating | 3.0 2.9 | 2.6 2.6 | 2.3 2.2 | 2.6 2.4 | 2.8 2.8 | 2.3 2.2 | 2.5 2.5
Statistically significant | No | No | No | No | No | No | No


Figure 101—Awareness, Policies, Technology and Skills Attributes: Maturity Level by IT Governance Structure (cont.)
Columns within each attribute: Central, Decentral, Federal.

Processes | Awareness | Policies | Technology | Skills
PO1 Define a strategic IT plan. | 3.0 3.5 2.6 | 2.2 3.0 2.2 | 1.8 2.7 1.2 | 2.3 2.5 2.3
PO2A Define the information architecture—Architecture. | 2.8 2.7 2.5 | 2.3 2.0 2.0 | 2.0 2.6 1.8 | 2.4 2.6 2.0
PO2D Define the information architecture—Data classification. | 2.4 2.4 2.2 | 2.1 1.8 2.0 | 2.0 3.5 1.9 | 2.3 2.2 2.0
PO3 Determine technological direction. | 2.9 3.4 2.8 | 2.5 3.3 2.1 | 1.6 2.0 1.8 | 2.6 2.3 2.4
PO4O Define the IT processes—Organisation. | 3.0 3.3 2.8 | 2.7 2.8 2.2 | 2.0 1.8 1.8 | 2.6 2.8 2.2
PO4P Define the IT processes—Processes. | 2.9 3.5 2.7 | 2.5 3.5 2.0 | 1.8 2.8 1.1 | 2.3 3.0 2.1
PO5B Manage the IT investment—Budgeting. | 3.5 3.8 3.3 | 3.1 3.8 2.8 | 2.8 2.8 3.1 | 2.9 3.0 2.9
PO5V Manage the IT investment—Value management. | 2.4 3.5 1.9 | 2.0 3.5 1.4 | 1.5 3.0 1.1 | 2.0 3.0 1.5
PO6 Communicate management aims and direction. | 2.9 2.5 2.5 | 2.6 2.5 2.1 | 2.2 2.4 1.9 | 2.3 2.5 2.3
PO7 Manage IT human resources. | 3.2 2.8 3.0 | 2.9 3.3 3.2 | 2.5 1.7 2.9 | 2.8 2.2 2.6
PO8 Manage quality. | 2.5 1.7 2.2 | 2.4 2.2 2.0 | 1.9 2.0 1.7 | 2.3 2.2 2.1
PO9 Assess and manage IT risks. | 2.7 2.7 2.5 | 2.2 2.6 1.9 | 1.7 2.5 1.7 | 2.3 2.5 2.0
PO10PG Manage projects—Programme. | 2.7 2.5 2.8 | 2.2 3.0 2.6 | 2.0 2.8 2.4 | 2.2 3.0 2.5
PO10PJ Manage projects—Projects. | 3.3 3.7 3.1 | 2.9 3.7 3.1 | 2.5 3.0 2.9 | 2.7 3.2 2.8
AI1 Identify automated solutions. | 2.9 3.5 2.7 | 2.7 3.2 2.3 | 2.0 2.0 2.2 | 2.4 2.4 2.4
AI2 Acquire and maintain application software. | 3.3 4.2 2.9 | 3.1 3.5 2.5 | 2.4 3.2 2.4 | 2.9 3.0 2.6
AI3 Acquire and maintain technology infrastructure. | 3.2 4.0 2.8 | 2.8 3.8 2.6 | 2.2 2.8 2.1 | 2.5 2.9 2.7
AI4 Enable operation and use. | 2.9 2.8 2.6 | 2.7 2.2 2.4 | 2.1 2.0 2.2 | 2.5 2.3 2.4
AI5 Procure IT resources. | 3.5 3.5 2.9 | 3.2 3.5 2.7 | 2.6 3.3 2.4 | 2.7 2.8 2.4
AI6 Manage changes. | 3.2 3.4 2.5 | 3.1 3.4 2.1 | 2.7 2.6 1.8 | 2.5 3.1 2.0
AI7 Install and accredit solutions and changes. | 3.2 3.4 2.7 | 2.8 3.3 2.4 | 2.4 2.9 2.2 | 2.8 2.7 2.3
DS1 Define and manage service levels. | 2.6 3.5 2.1 | 2.2 2.7 1.7 | 2.0 2.0 1.4 | 2.2 2.0 1.8
DS2 Manage third-party services. | 3.0 3.0 2.6 | 2.7 3.0 2.3 | 2.2 2.3 1.8 | 2.6 2.3 2.3
DS3 Manage performance and capacity. | 3.0 2.5 2.0 | 2.4 2.1 1.9 | 2.6 2.6 1.6 | 2.7 2.1 2.1
DS4 Ensure continuous service. | 3.2 2.5 2.2 | 2.8 2.3 2.2 | 2.5 2.4 1.9 | 2.7 2.3 2.1
DS5NF Ensure systems security—Network and firewall. | 3.3 3.5 3.1 | 2.8 3.8 3.0 | 3.0 4.0 3.0 | 3.0 3.3 2.9
DS5P Ensure systems security—Policy. | 3.0 3.6 2.8 | 2.9 4.1 2.6 | 2.4 2.8 1.9 | 2.8 2.6 2.4
DS5U Ensure systems security—User access. | 3.0 3.8 3.2 | 2.8 3.4 3.0 | 2.5 3.2 2.7 | 2.7 3.2 2.5
DS5V Ensure systems security—Virus. | 3.4 4.0 3.2 | 3.2 4.1 3.0 | 3.2 4.5 3.7 | 3.0 3.8 2.8
DS6 Identify and allocate costs. | 2.9 4.0 2.9 | 2.5 3.9 2.4 | 2.5 3.3 2.2 | 2.7 3.3 2.2


Figure 102—Responsibility, Goals and Overall Attributes: Maturity Level by IT Governance Structure (cont.)
Columns within each attribute: Central, Decentral, Federal. A trailing Yes marks a statistically significant process.

Processes | Responsibility | Goals | Overall
PO1 Define a strategic IT plan. | 2.7 3.0 2.7 | 2.4 3.2 2.1 | 2.4 2.3 2.2
PO2A Define the information architecture—Architecture. | 2.5 2.5 2.0 | 1.8 2.5 1.4 | 2.2 2.0 1.9
PO2D Define the information architecture—Data classification. | 2.3 1.5 2.1 | 1.6 2.3 1.6 | 2.0 2.2 1.9
PO3 Determine technological direction. | 2.6 2.3 2.6 | 2.0 2.3 2.1 | 2.3 2.4 2.3
PO4O Define the IT processes—Organisation. | 2.9 3.0 2.7 | 2.2 3.0 2.0 | 2.6 2.8 2.3
PO4P Define the IT processes—Processes. | 2.8 3.3 2.2 | 2.2 2.5 1.9 | 2.4 2.4 2.0
PO5B Manage the IT investment—Budgeting. | 3.2 4.0 3.1 | 2.8 3.8 2.3 | 3.1 3.7 2.9
PO5V Manage the IT investment—Value management. | 2.2 3.0 1.7 | 1.9 3.0 1.9 | 2.0 2.8 1.5
PO6 Communicate management aims and direction. | 2.5 2.5 2.5 | 1.9 2.5 2.2 | 2.4 2.7 2.2
PO7 Manage IT human resources. | 3.1 2.7 2.8 | 2.4 2.5 2.5 | 2.8 2.5 2.8
PO8 Manage quality. | 2.4 2.7 2.1 | 2.0 2.3 1.6 | 2.2 2.2 1.8
PO9 Assess and manage IT risks. | 2.4 2.8 2.3 | 1.9 2.0 1.7 | 2.1 2.3 2.0
PO10PG Manage projects—Programme. | 2.6 3.0 2.7 | 2.3 2.8 2.4 | 2.4 2.9 2.5
PO10PJ Manage projects—Projects. | 3.0 3.2 3.3 | 2.5 2.7 2.7 | 2.8 3.2 2.9
AI1 Identify automated solutions. | 2.6 2.4 2.8 | 2.1 2.4 2.3 | 2.5 2.7 2.3
AI2 Acquire and maintain application software. | 3.2 3.3 2.9 | 2.6 2.7 2.2 | 2.9 3.3 2.5
AI3 Acquire and maintain technology infrastructure. | 3.0 3.4 2.7 | 2.4 1.8 2.3 | 2.7 2.7 2.4

Figure 101—Awareness, Policies, Technology and Skills Attributes: Maturity Level by IT Governance Structure (cont.)
Columns within each attribute: Central, Decentral, Federal.

Processes | Awareness | Policies | Technology | Skills
DS7 Educate and train users. | 2.8 2.6 2.4 | 2.7 2.8 2.2 | 2.3 2.7 2.0 | 2.7 2.8 2.3
DS8 Manage service desk and incidents. | 3.3 2.8 3.0 | 2.8 3.0 2.5 | 3.1 3.5 2.6 | 2.9 2.3 2.7
DS9 Manage the configuration. | 2.7 2.5 2.3 | 2.4 2.6 2.1 | 2.3 2.5 2.1 | 2.5 2.1 2.1
DS10 Manage problems. | 2.8 2.7 2.7 | 2.6 2.3 2.4 | 2.6 2.3 2.2 | 2.7 2.4 2.2
DS11 Manage data. | 3.3 3.8 2.7 | 3.2 3.5 2.6 | 3.2 3.5 2.4 | 2.9 3.5 2.3
DS12 Manage the physical environment. | 3.7 3.9 3.2 | 3.3 3.9 2.6 | 3.3 3.5 2.2 | 3.2 3.0 2.4
DS13 Manage operations. | 3.3 3.0 2.9 | 3.0 3.2 2.4 | 2.8 2.9 2.2 | 2.8 2.7 2.0
ME1 Monitor and evaluate IT performance. | 2.7 3.3 2.1 | 2.2 2.8 1.4 | 1.8 2.3 1.3 | 2.1 2.8 1.9
ME2 Monitor and evaluate internal control. | 2.4 3.2 2.9 | 2.2 3.0 2.5 | 2.0 2.7 1.6 | 2.0 2.5 2.3
ME3 Ensure compliance with external requirements. | 2.6 3.4 2.6 | 2.2 3.4 2.5 | 1.6 3.5 2.1 | 1.9 3.2 2.4
ME4 Provide IT governance. | 2.7 3.5 2.3 | 2.5 3.5 2.1 | 1.7 3.4 1.8 | 2.3 3.0 2.1
Average rating | 3.0 3.2 2.7 | 2.7 3.1 2.3 | 2.3 2.8 2.1 | 2.6 2.7 2.3


Figure 102—Responsibility, Goals and Overall Attributes: Maturity Level by IT Governance Structure (cont.)
Columns within each attribute: Central, Decentral, Federal. A trailing Yes marks a statistically significant process.

Processes | Responsibility | Goals | Overall
AI4 Enable operation and use. | 2.8 2.1 2.5 | 2.1 1.6 2.2 | 2.5 2.2 2.3
AI5 Procure IT resources. | 3.2 2.8 2.9 | 2.3 2.0 2.1 | 2.9 3.3 2.6
AI6 Manage changes. | 2.8 2.9 2.3 | 2.2 2.9 1.4 | 2.7 3.0 2.1 | Yes
AI7 Install and accredit solutions and changes. | 3.0 3.0 2.7 | 2.2 2.9 1.8 | 2.7 2.9 2.2
DS1 Define and manage service levels. | 2.3 2.3 1.9 | 2.0 2.3 1.6 | 2.2 2.3 1.7
DS2 Manage third-party services. | 2.9 2.8 2.7 | 2.3 2.3 2.1 | 2.5 2.4 2.2
DS3 Manage performance and capacity. | 3.0 2.9 2.1 | 2.3 2.2 1.8 | 2.5 2.3 1.9
DS4 Ensure continuous service. | 2.8 2.4 2.4 | 2.4 2.0 2.0 | 2.6 2.2 2.1
DS5NF Ensure systems security—Network and firewall. | 3.0 3.3 3.3 | 2.5 2.5 2.4 | 2.8 3.3 2.7
DS5P Ensure systems security—Policy. | 3.0 3.5 2.9 | 2.4 2.8 2.1 | 2.7 3.2 2.3
DS5U Ensure systems security—User access. | 3.0 3.6 3.0 | 2.4 2.7 2.1 | 2.7 3.3 2.7 | Yes
DS5V Ensure systems security—Virus. | 3.2 3.4 3.1 | 2.7 3.3 2.2 | 3.1 3.6 3.0
DS6 Identify and allocate costs. | 2.9 3.3 2.6 | 2.1 2.8 2.1 | 2.7 3.4 2.3
DS7 Educate and train users. | 2.8 2.9 2.4 | 2.3 2.8 1.9 | 2.5 2.8 2.1
DS8 Manage service desk and incidents. | 3.1 3.0 2.9 | 2.7 1.5 2.6 | 3.0 2.7 2.8
DS9 Manage the configuration. | 2.7 2.3 2.5 | 2.0 2.0 1.4 | 2.4 2.3 2.1
DS10 Manage problems. | 2.9 2.3 2.6 | 2.3 2.2 1.8 | 2.6 2.4 2.3
DS11 Manage data. | 3.3 3.5 2.6 | 2.5 2.9 2.0 | 3.0 3.3 2.4 | Yes
DS12 Manage the physical environment. | 3.4 4.0 2.9 | 2.7 2.0 2.0 | 3.3 2.9 2.5 | Yes
DS13 Manage operations. | 3.2 3.1 2.9 | 2.5 2.8 2.0 | 2.9 2.6 2.4
ME1 Monitor and evaluate IT performance. | 2.4 2.3 2.1 | 2.0 2.5 1.7 | 2.2 2.7 1.8
ME2 Monitor and evaluate internal control. | 2.2 2.7 2.9 | 1.8 2.3 2.2 | 2.0 2.7 2.4
ME3 Ensure compliance with external requirements. | 2.3 3.5 2.6 | 2.1 2.7 2.3 | 2.0 2.5 2.3
ME4 Provide IT governance. | 2.6 3.3 2.5 | 2.0 3.5 2.0 | 2.2 2.6 2.0
Average rating | 2.8 2.9 2.6 | 2.2 2.5 2.0 | 2.6 2.7 2.3
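The Overall and Average rating rows in these tables appear to be simple column means over the individual process rows, rounded to one decimal place as in the figures. A minimal sketch of that computation follows; the table keying and names are hypothetical and are not part of the publication.

```python
def column_averages(table):
    """Average each rating column over all process rows, rounded to one decimal.

    table: mapping from process identifier to its list of column ratings,
    e.g. {"PO1": [2.7, 3.0, 2.7], "PO2A": [2.5, 2.5, 2.0], ...}.
    Assumes every row has the same number of columns.
    """
    rows = list(table.values())
    n_cols = len(rows[0])
    return [round(sum(row[i] for row in rows) / len(rows), 1)
            for i in range(n_cols)]


# Hypothetical two-process example (Central/Decentral/Federal ratings)
print(column_averages({"PO1": [2.7, 3.0, 2.7], "PO2A": [2.5, 2.5, 2.0]}))
```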

List of Figures


Figure 1—41 CobiT IT Processes .... 7
Figure 2—Attributes of Process Maturity .... 8
Figure 3—Overall Process Maturity by Domain .... 10
Figure 4—Box Plots for the 34 IT Processes .... 10
Figure 5—Summary Heat Map .... 12
Figure 6—Summary of Statistical Analysis .... 14
Figure 7—Distribution of Maturity Levels .... 17
Figure 8—Generic Maturity Model (CobiT 4.1) .... 19
Figure 9—Maturity Attributes: Awareness and Communication .... 20
Figure 10—AI6 Process Maturity Model .... 20
Figure 11—Study Sites by Geographic Location .... 24
Figure 12—Scope of IT Operations .... 25
Figure 13—Number of Processes Captured (n=51) .... 25
Figure 14—Hardware Used .... 25
Figure 15—Organisation of the IT Function .... 26
Figure 16—IT Governance Frameworks .... 26
Figure 17—Software, Services and Hardware Outsourced .... 27
Figure 18—Overall Process Maturity by Domain .... 28
Figure 19—Process Maturity for the Plan and Organise (PO) Domain .... 29
Figure 20—Process Maturity for the Acquire and Implement (AI) Domain .... 30
Figure 21—Process Maturity for the Deliver and Support (DS) Domain .... 31
Figure 22—Process Maturity for the Monitor and Evaluate (ME) Domain .... 31
Figure 23—Process Attributes for the Plan and Organise (PO) Domain .... 32
Figure 24—Process Attributes for the Acquire and Implement (AI) Domain .... 32
Figure 25—Process Attributes for the Deliver and Support (DS) Domain .... 33
Figure 26—Process Attributes for the Monitor and Evaluate (ME) Domain .... 33
Figure 27—Relative Maturity Level for Each Domain and Attribute .... 33
Figure 28—Process Maturity for PO1 Define a Strategic IT Plan .... 34
Figure 29—Process Maturity for PO2A Define the Information Architecture—Architecture .... 36
Figure 30—Process Maturity for PO2D Define the Information Architecture—Data classification .... 36
Figure 31—Process Maturity for PO2 Define the Information Architecture Overall .... 36
Figure 32—Process Maturity for PO3 Determine Technological Direction .... 37
Figure 33—Process Maturity for PO4O Define the IT Processes, Organisation and Relationships—Organisation .... 38
Figure 34—Process Maturity for PO4P Define the IT Processes, Organisation and Relationships—Processes .... 38
Figure 35—Process Maturity for PO5B Manage the IT Investment—Budgeting .... 39
Figure 36—Process Maturity for PO5V Manage the IT Investment—Value Management .... 39
Figure 37—Process Maturity for PO6 Communicate Management Aims and Direction .... 40
Figure 38—Process Maturity for PO7 Manage IT Human Resources .... 41
Figure 39—Process Maturity for PO8 Manage Quality .... 41
Figure 40—Process Maturity for PO9 Assess and Manage IT Risks .... 42
Figure 41—Process Maturity for PO10PG Manage Projects—Programmes .... 43
Figure 42—Process Maturity for PO10PJ Manage Projects—Projects .... 43
Figure 43—Process Maturity for AI1 Identify Automated Solutions .... 44
Figure 44—Process Maturity for AI2 Acquire and Maintain Application Software .... 45
Figure 45—Process Maturity for AI3 Acquire and Maintain Technology Infrastructure .... 45
Figure 46—Process Maturity for AI4 Enable Operation and Use .... 46
Figure 47—Process Maturity for AI5 Procure IT Resources .... 47
Figure 48—Process Maturity for AI6 Manage Changes .... 47
Figure 49—Process Maturity for AI7 Install and Accredit Solutions and Changes .... 48
Figure 50—Process Maturity for DS1 Define and Manage Service Levels .... 49
Figure 51—Process Maturity for DS2 Manage Third-party Services .... 49
Figure 52—Process Maturity for DS3 Manage Performance and Capacity .... 50
Figure 53—Process Maturity for DS4 Ensure Continuous Service .... 51
Figure 54—Process Maturity for DS5NF Ensure Systems Security—Network and Firewall .... 52
Figure 55—Process Maturity for DS5P Ensure Systems Security—Policy .... 52
Figure 56—Process Maturity for DS5U Ensure Systems Security—User Access .... 52
Figure 57—Process Maturity for DS5V Ensure Systems Security—Virus .... 53
Figure 58—Process Maturity for DS6 Identify and Allocate Costs .... 53
Figure 59—Process Maturity for DS7 Educate and Train Users .... 54
Figure 60—Process Maturity for DS8 Manage Service Desk and Incidents .... 54
Figure 61—Process Maturity for DS9 Manage the Configuration .... 55
Figure 62—Process Maturity for DS10 Manage Problems .... 56
Figure 63—Process Maturity for DS11 Manage Data .... 56
Figure 64—Process Maturity for DS12 Manage the Physical Environment .... 57
Figure 65—Process Maturity for DS13 Manage Operations .... 57
Figure 66—Process Maturity for ME1 Monitor and Evaluate IT Performance .... 58
Figure 67—Process Maturity for ME2 Monitor and Evaluate Internal Control .... 59
Figure 68—Process Maturity for ME3 Ensure Compliance With External Requirements .... 59
Figure 69—Process Maturity for ME4 Provide IT Governance .... 60
Figure 70—Overall Performance .... 60
Figure 71—Heat Map of Statistical Significance for 41 Processes .... 62
Figure 72—Heat Map of Statistical Significance for Six Attributes .... 63
Figure 73—Statistically Significant Processes by Geographic Location .... 64
Figure 74—Statistically Significant Attributes by Geographic Location .... 64
Figure 75—Development Level of Statistically Significant Processes .... 65
Figure 76—Development Level of Statistically Significant Attributes .... 65
Figure 77—Statistically Significant Processes by Industry .... 66
Figure 78—Statistically Significant Attributes by Industry .... 66
Figure 79—Statistically Significant Processes by IT Size .... 66
Figure 80—Statistically Significant Attributes by IT Size .... 67
Figure 81—Statistically Significant Processes by IT Spending .... 67
Figure 82—Statistically Significant Processes by Business and IT Alignment .... 68
Figure 83—Statistically Significant Attributes by Business and IT Alignment .... 69
Figure 84—Statistically Significant Processes by Level of Outsourcing .... 69
Figure 85—Statistically Significant Processes by IT Governance Structure .... 69
Figure 86—Statistically Significant Attributes by IT Governance Structure .... 70
Figure 87—Summary Heat Map .... 74
Figure 88—Distribution of Maturity Levels .... 75
Figure 89—Maturity Attribute Table .... 76
Figure 90—Average Maturity Levels for the Complete Study Population .... 77
Figure 91—Awareness, Policies and Technology Attributes: Maturity by Geographic Location .... 79
Figure 92—Skills, Responsibilities and Goals Attributes: Maturity by Geographic Location .... 80
Figure 93—Overall Maturity by Geographic Location .... 82
Figure 94—Maturity Level for Emerging vs. Developed Countries .... 83
Figure 95—Awareness, Policies, Technology and Skills Attributes: Maturity Level by Industry .... 84
Figure 96—Responsibility, Goals and Overall Attributes: Maturity Level by Industry .... 86
Figure 97—Maturity Level by Relative Size of IT Organisation .... 87
Figure 98—Maturity Level for Attributes by Annual Expenditure in IT as a Percent of Revenue .... 89
Figure 99—Maturity Level for Alignment Between Business and IT Goals by Attribute .... 91
Figure 100—Maturity Level for Outsourcing Activities .... 92
Figure 101—Awareness, Policies, Technology and Skills Attributes: Maturity Level by IT Governance Structure .... 94
Figure 102—Responsibility, Goals and Overall Attributes: Maturity Level by IT Governance Structure .... 95




Other Publications

Many publications issued by ITGI and ISACA contain detailed assessment questionnaires and work programmes. For further information, please visit www.isaca.org/bookstore or e-mail [email protected].

Assurance Publications

• ITAF™: A Professional Practices Framework for IT Assurance, 2008
• Stepping through the IS Audit, 2nd Edition, 2004
• Top Business/Technology Survey Results, 2008

ERP Series:
• Security, Audit and Control Features Oracle® E-Business Suite: A Technical and Risk Management Reference Guide, 2nd Edition, 2006
• Security, Audit and Control Features PeopleSoft®: A Technical and Risk Management Reference Guide, 2nd Edition, 2006
• Security, Audit and Control Features SAP® R/3®: A Technical and Risk Management Reference Guide, 2nd Edition, 2005

Specific Environments
• Electronic and Digital Signatures: A Global Status Report, 2002
• Enterprise Identity Management: Managing Secure and Controllable Access in the Extended Enterprise Environment, 2004
• Oracle® Database Security, Audit and Control Features, 2004
• OS/390—z/OS: Security, Audit and Control Features, 2003
• Peer-to-peer Networking Security and Control, 2003
• Risks of Customer Relationship Management: A Security, Control and Audit Approach, 2003
• Security Provisioning: Managing Access in Extended Enterprises, 2002
• Virtual Private Network—New Issues for Network Security, 2001

CobiT and Related Publications

• CobiT® 4.1, 2007
• CobiT® Control Practices: Guidance to Achieve Control Objectives for Successful IT Governance, 2nd Edition, 2007
• CobiT® Security Baseline™, 2nd Edition, 2007
• CobiT® Quickstart™, 2nd Edition, 2007
• IT Assurance Guide: Using CobiT®, 2007
• IT Control Objectives for Basel II: The Importance of Governance and Risk Management for Compliance, 2007
• IT Control Objectives for Sarbanes-Oxley: The Role of IT in the Design and Implementation of Internal Control Over Financial Reporting, 2nd Edition, 2006
• IT Governance Implementation Guide: Using CobiT® and Val IT, 2nd Edition, 2007

CobiT Mapping Series:
• Aligning CobiT® 4.1, ITIL V3 and ISO/IEC 27002 for Business Benefit, 2008
• CobiT® Mapping: Mapping of CMMI® for Development V1.2 With CobiT® 4.0
• CobiT® Mapping: Mapping of ISO/IEC 17799:2000 With CobiT®, 2nd Edition
• CobiT® Mapping: Mapping of ISO/IEC 17799:2005 With CobiT® 4.0
• CobiT® Mapping: Mapping ITIL V3 With CobiT® 4.1, 2008
• CobiT® Mapping: Mapping of ITIL With CobiT® 4.0
• CobiT® Mapping: Mapping of NIST SP800-53 With CobiT® 4.1
• CobiT® Mapping: Mapping of PMBOK With CobiT® 4.0
• CobiT® Mapping: Mapping of PRINCE2 With CobiT® 4.0
• CobiT® Mapping: Mapping of SEI’s CMM for Software With CobiT® 4.0
• CobiT® Mapping: Mapping of TOGAF 8.1 With CobiT® 4.0
• CobiT® Mapping: Overview of International IT Guidance, 2nd Edition


Security Publications

• Cybercrime: Incident Response and Digital Forensics, 2005
• Information Security Governance: Guidance for Boards of Directors and Executive Management, 2nd Edition, 2006
• Information Security Governance: Guidance for Information Security Managers, 2008
• Information Security Harmonisation—Classification of Global Guidance, 2005
• Managing Information Integrity: Security, Control and Audit Issues, 2004
• Security Awareness: Best Practices to Serve Your Enterprise, 2005
• Security Critical Issues, 2005

E-commerce Security Series:
• Securing the Network Perimeter, 2002
• Business Continuity Planning, 2002
• Trading Partner Authentication, Registration and Enrollment, 2000
• Public Key Infrastructure, 2001
• A Global Status Report, 2000
• Enterprise Best Practices, 2000

IT Governance Publications

• Board Briefing on IT Governance, 2nd Edition, 2003
• IT Governance Status Report—2008, 2008

IT Governance Domain Practices and Competencies:
• Information Risks: Whose Business Are They?, 2005
• Optimising Value Creation From IT Investments, 2005
• Measuring and Demonstrating the Value of IT, 2005
• Governance of Outsourcing, 2005
• IT Alignment: Who Is in Charge?, 2005

Val IT:
• Enterprise Value: Getting Started With Value Management, 2008
• Enterprise Value: Governance of IT Investments: The Val IT Framework 2.0, 2008
• Enterprise Value: Governance of IT Investments: The Business Case, 2006
