IT Governance Capability Maturity within Government
Vernon John, SITA
Enterprise Governance
Topics: Preamble; Brief overview of COBIT; Overall COBIT Framework; IT Governance Capability Maturity Assessment Framework; Assessment approach; Assessment results (Importance and Performance); General observations; Conclusion.

Enterprise Governance: IT Governance capability (performance management + risk management) = optimal delivery of IT services (business value).

References: Control Objectives for Information and related Technology (COBIT)
Preamble
Objective: Gauge IT Governance capability maturity levels.

Sources used:
Board Briefing on IT Governance, 2nd Edition, ITGI
COBIT 4.1® Management Guidelines
COBIT Implementation Guide: IT Governance Implementation Guide, ITGI
Maturity Measurement - Fit the Purpose, Then the Method, Guldentops E, ISACA, 2003

These informed the IT Governance Capability Maturity Assessment Framework and the development of assessment and report templates.

Departments measured: 4 national departments, 4 provincial departments and 5 municipalities (13 government departments in total).

This presentation provides insight into the IT Governance Capability Maturity Assessment Framework and assessment approach, and into the measurement outcomes.
Brief overview of COBIT
A set of accepted best practices for IT management and guidance materials for IT Governance, developed by the Information Systems Audit and Control Association (ISACA) and the IT Governance Institute (ITGI).

According to ISACA, “COBIT is an IT governance framework and supporting toolset that allows managers to bridge the gap between control requirements, technical issues and business risks. COBIT enables clear policy development and good practice for IT control throughout organizations. COBIT emphasizes regulatory compliance, helps organizations to increase the value attained from IT, enables alignment and simplifies implementation of the COBIT framework.”

Structure: 4 domains, 34 processes, more than 200 control objectives and more than 800 control test statements.
Overall COBIT Framework
The COBIT framework links business and governance objectives to IT processes in four domains. To achieve those objectives, business processes require information that satisfies seven criteria (effectiveness, efficiency, confidentiality, integrity, availability, compliance and reliability), delivered by IT resources (applications, information, infrastructure and people).

Plan and Organise (PO):
PO1 Define a strategic IT plan. PO2 Define the information architecture. PO3 Determine technological direction. PO4 Define the IT processes, organisation and relationships. PO5 Manage the IT investment. PO6 Communicate management aims and direction. PO7 Manage IT human resources. PO8 Manage quality. PO9 Assess and manage IT risks. PO10 Manage projects.

Acquire and Implement (AI):
AI1 Identify automated solutions. AI2 Acquire and maintain application software. AI3 Acquire and maintain technology infrastructure. AI4 Enable operation and use. AI5 Procure IT resources. AI6 Manage changes. AI7 Install and accredit solutions and changes.

Deliver and Support (DS):
DS1 Define and manage service levels. DS2 Manage third-party services. DS3 Manage performance and capacity. DS4 Ensure continuous service. DS5 Ensure systems security. DS6 Identify and allocate costs. DS7 Educate and train users. DS8 Manage service desk and incidents. DS9 Manage the configuration. DS10 Manage problems. DS11 Manage data. DS12 Manage the physical environment. DS13 Manage operations.

Monitor and Evaluate (ME):
ME1 Monitor and evaluate IT performance. ME2 Monitor and evaluate internal control. ME3 Ensure compliance with external requirements. ME4 Provide IT governance.
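As an aside, here is a minimal sketch (not part of the original slides) of how the four domains and their 34 processes listed above could be captured as a simple data structure. The process IDs come from the framework diagram; the dictionary layout itself is purely illustrative.

```python
# Illustrative mapping of the four COBIT 4.1 domains to their process IDs
# (10 + 7 + 13 + 4 = 34 processes, as listed in the framework diagram).
COBIT_DOMAINS = {
    "Plan and Organise":     [f"PO{i}" for i in range(1, 11)],  # PO1..PO10
    "Acquire and Implement": [f"AI{i}" for i in range(1, 8)],   # AI1..AI7
    "Deliver and Support":   [f"DS{i}" for i in range(1, 14)],  # DS1..DS13
    "Monitor and Evaluate":  [f"ME{i}" for i in range(1, 5)],   # ME1..ME4
}

# Sanity check: 34 processes in total.
assert sum(len(procs) for procs in COBIT_DOMAINS.values()) == 34
```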
IT Governance Capability Maturity Assessment Framework
The assessment framework follows an improvement life cycle: raise awareness, envision the solution, assess current capability maturity, determine target capability maturity, analyse gaps and identify improvement initiatives, and plan the solution.

For each COBIT process (PO1..POn, AI1..AIn, DS1..DSn, ME1..MEn) the assessment captures who is accountable and responsible, whether the process has been audited, control weaknesses, the technology used and its vulnerabilities, and ratings for importance, performance and maturity (against the maturity model).

Maturity is assessed against the six COBIT attributes: Awareness and Communication; Policies, Plans and Procedures; Tools and Automation; Skills and Expertise; Responsibility and Accountability; and Goal Setting and Measurement.
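To make the assessment inputs named above concrete, here is a hypothetical per-process record. The field names follow the attributes on the slide (accountable, responsible, audited, importance, performance, current and target maturity), but the exact structure used by the SITA team is an assumption.

```python
from dataclasses import dataclass

# Hypothetical record for one COBIT process assessment. Field names follow
# the framework slide; the layout is illustrative, not SITA's actual template.
@dataclass
class ProcessAssessment:
    process_id: str                # e.g. "PO1", "DS5"
    accountable: str = ""          # who is accountable for the process
    responsible: str = ""          # who is responsible for execution
    audited: bool = False          # has the process been audited?
    importance: int = 0            # 1 (not at all) .. 5 (critical)
    performance: int = 0           # 1 (some aspects rarely) .. 5 (all always done well)
    current_maturity: float = 0.0  # current capability maturity (0..5)
    target_maturity: float = 0.0   # target capability maturity (0..5)

    @property
    def maturity_gap(self) -> float:
        """Gap between target and current maturity, used when identifying improvement initiatives."""
        return self.target_maturity - self.current_maturity
```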
IT Governance Capability Maturity Assessment Framework
Importance rating scale (applied per process): 1 - Not at all; 2 - Can survive without it if need be; 3 - Make things easier; 4 - Very significant; 5 - Critical. (The framework diagram is otherwise as on the previous slide.)
IT Governance Capability Maturity Assessment Framework
Performance rating scale (applied per process): 1 - Some aspects rarely; 2 - Some aspects sometimes; 3 - All aspects sometimes; 4 - Parts are always done well; 5 - All is always done well. (The framework diagram is otherwise as on the previous slide.)
IT Governance Capability Maturity Assessment Framework
Current and target capability maturity are rated using the COBIT 4.1 Maturity Attribute Table. (The framework diagram is otherwise as on the previous slides.) Note: Assessment results excluded from this presentation.
Assessment approach
SITA facilitated a two-day work-session with IT representatives. During the work-session the following was done:
An awareness of IT Governance and of our assessment framework and approach was created.
The 34 COBIT processes and their control objectives were presented. Thereafter, the representatives were given an opportunity to:
- provide information related to each IT process, such as accountability, responsibility and whether or not the process has been audited;
- rate the test statements for the control objectives in terms of Importance and Performance;
- rate the process maturity attributes per IT process in terms of how well they perceived they are currently performing and where they would like to perform.
The facilitator probed participants to ensure that they understood the processes and control objectives, and to support a more informed scoring.
The ratings were used to calculate the overall maturity levels (one possible aggregation is sketched below).
A sample of evidence was requested by the SITA assessment team from the department representatives to support the ratings provided.
The assessment outcomes were analysed, and initiatives to improve IT governance were identified, prioritised and documented in a report.
Given the short duration of the exercise, the assessment was not done at a very detailed level, but it was sufficient to provide a sense of the IT Governance maturity level and to identify areas for improvement.
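The exact calculation used to roll the work-session ratings up into overall levels is not described in the slides; the sketch below simply averages hypothetical 1-5 Importance and Performance ratings per domain, which is one plausible way to produce the per-domain results shown next. The ratings are placeholders, not the measured values.

```python
from statistics import mean

# Hypothetical per-process (importance, performance) ratings on the 1-5 scales
# used in the work-session. Values are placeholders, not the measured results.
ratings = {
    "PO1": (5, 2), "PO9": (4, 1),
    "AI6": (4, 2),
    "DS5": (5, 2), "DS8": (4, 3),
    "ME4": (4, 1),
}

def domain_of(process_id: str) -> str:
    """Map a process ID such as 'PO1' to its domain prefix ('PO')."""
    return process_id[:2]

# Average importance and performance per domain (PO, AI, DS, ME).
for dom in sorted({domain_of(p) for p in ratings}):
    imp = mean(i for p, (i, _) in ratings.items() if domain_of(p) == dom)
    perf = mean(f for p, (_, f) in ratings.items() if domain_of(p) == dom)
    print(f"{dom}: average importance {imp:.1f}, average performance {perf:.1f}")
```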
Assessment results: Importance and Performance per Domain

Legend
Importance (Imp): 1 - Not at all; 2 - Can survive without it (if need be); 3 - Make things easier; 4 - Very significant; 5 - Critical
Performance (Perf): 1 - Some aspects rarely; 2 - Some aspects sometimes; 3 - All aspects sometimes; 4 - Parts are always done well; 5 - All is always done well
Assessment results: Importance and Performance per Domain (DS and ME)
(Rating legend as above.)
Assessment results: Average Importance and Performance per Process, per Domain
(Rating legend as above.)
Assessment results: Very Significant Processes (17) and processes with the highest Performance (17)
(Rating legend as above.)
Assessment results: Very Significant Processes (17) and processes with the highest “differences” between Importance and Performance (17)
(Rating legend as above.)
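The slides do not state how the Very Significant Processes and the largest differences were selected; the sketch below shows one plausible reading, treating Importance ratings of 4 or 5 as "very significant" and ranking those processes by Importance minus Performance. The ratings are placeholders, not the measured values.

```python
# Hypothetical selection of "very significant" processes (importance >= 4) and
# ranking by the importance-performance difference. Ratings are illustrative only.
ratings = {
    "PO1": (5, 2), "PO9": (4, 1), "AI6": (4, 2), "DS5": (5, 2), "ME4": (4, 1),
}

very_significant = {p: (imp, perf) for p, (imp, perf) in ratings.items() if imp >= 4}

by_difference = sorted(
    very_significant.items(),
    key=lambda item: item[1][0] - item[1][1],  # importance minus performance
    reverse=True,
)

for process_id, (imp, perf) in by_difference:
    print(f"{process_id}: importance {imp}, performance {perf}, difference {imp - perf}")
```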
Overall average
The overall average maturity level was between level 1 and level 2. According to the COBIT Generic Maturity Model, the level 1 and level 2 descriptions are as follows:
“1 Initial/Ad Hoc—There is evidence that the enterprise has recognised that the issues exist and need to be addressed. There are, however, no standardised processes; instead, there are ad-hoc approaches that tend to be applied on an individual or case-by-case basis. The overall approach to management is disorganised.
2 Repeatable but Intuitive—Processes have developed to the stage where similar procedures are followed by different people undertaking the same task. There is no formal training or communication of standard procedures, and responsibility is left to the individual. There is a high degree of reliance on the knowledge of individuals and, therefore, errors are likely.”
Observations
Participants gave their full cooperation and were receptive to the final reports.
There was an awareness of IT Governance at a conceptual level, but limited knowledge of the details as stipulated in COBIT or of IT Governance implementation.
Participants understood the importance of IT Governance and acknowledged that they have a key role to play in its implementation. However, in many instances more emphasis was placed on “operational responsibilities” being a higher priority than on IT Governance type responsibilities.
Some participants were not able to indicate clearly who was accountable and responsible for the execution of IT processes.
Very few had explicit IT Governance and IT process frameworks.
Some formal IT policies, processes, procedures or plans have been instituted; however, this was not done in the context of an overall IT Governance framework, and periodic reviews were limited.
Some IT processes underwent auditing, albeit some on an ad hoc basis.
Limited tools are used in support of executing the IT processes; desktop productivity tools are primarily used and have limited functionality to support effective and efficient execution of the IT processes.
Unavailability of funds.
Conclusion
COBIT is a very comprehensive IT Governance framework, and there is a need to simplify the implementation of COBIT IT Governance within Government departments. This could be done by:
Establishing a “minimum” IT Governance framework
Compiling an implementation method for the “minimum” IT Governance framework
Compiling and making available, for example, generic policies and processes that are aligned to the “minimum” framework and that could be easily adapted
Initiating IT Governance practitioner training
Conducting periodic assessments
Thank You