CS 577b Software Engineering II -- Introduction
CMMI 1.3, Six Sigma, ITIL
CS 577b Software Engineering II, Supannika Koolmanojwong, 6 February 2018
© USC Center for Software Engineering
What is CMMI? Consultant Money Making Initiative © 2016 USC-CSSE
CMMI Models V1.3
CMMI® (Capability Maturity Model® Integration) models are collections of best practices that help organizations improve their processes. © 2016 USC-CSSE [Ref: CMMI]
Brief Characterization of "CMMI"
CMMI (Capability Maturity Model Integration) is:
- A framework of management, engineering, and support best practices organized into topics, called Process Areas
- An approach to improving the capability of any of the Process Areas by incrementally applying a set of Generic Practices that enhance the functioning of an individual process
- Best thought of as a set of "process requirements" that help guide an organization that is defining and implementing processes related to the topics
- NOT a pre-defined, implementable "as is" process definition
© 2016 USC-CSSE [Ref: Garcia 2005]
Process Areas
Configuration Management (CM), Causal Analysis and Resolution (CAR), Decision Analysis and Resolution (DAR), Integrated Project Management (IPM), Measurement and Analysis (MA), Organizational Performance Management (OPM), Organizational Process Definition (OPD), Organizational Process Focus (OPF), Organizational Process Performance (OPP), Organizational Training (OT), Process and Product Quality Assurance (PPQA), Product Integration (PI), Project Monitoring and Control (PMC), Project Planning (PP), Quantitative Project Management (QPM), Requirements Development (RD), Requirements Management (REQM), Risk Management (RSKM), Supplier Agreement Management (SAM), Technical Solution (TS), Validation (VAL), Verification (VER)
© 2016 USC-CSSE
Requirements Management Process Area
Purpose: to manage requirements of the project's products and product components and to ensure alignment between those requirements and the project's plans and work products.
Specific Goal 1: Requirements are managed and inconsistencies with plans and work products are identified.
Specific Practice 1.1: Develop an understanding with the requirements providers on the meaning of the requirements. (SP 1.2 through SP 1.5 follow; see the appraisal slides later in the deck.)
Subpractices:
- Establish criteria for distinguishing appropriate requirements providers.
- Establish objective criteria for the evaluation and acceptance of requirements.
- Analyze requirements to ensure that established criteria are met.
- Reach an understanding of requirements with requirements providers so that project participants can commit to them.
Example Work Products:
1. Lists of criteria for distinguishing appropriate requirements providers
2. Criteria for evaluation and acceptance of requirements
3. Results of analyses against criteria
4. A set of approved requirements
Examples of evaluation and acceptance criteria: clearly and properly stated; complete; consistent with one another; uniquely identified; … (a small sketch of checking such criteria follows)
© 2016 USC-CSSE
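The acceptance criteria above are concrete enough to check mechanically. Below is a minimal Python sketch of such a check; the Requirement fields, the provider list, and the criteria encoded are illustrative assumptions, not part of CMMI itself.

```python
from dataclasses import dataclass

@dataclass
class Requirement:
    req_id: str      # unique identifier
    text: str        # the requirement statement
    provider: str    # who supplied the requirement

# Hypothetical list of "appropriate requirements providers"
APPROVED_PROVIDERS = {"customer", "product owner"}

def evaluate(reqs: list[Requirement]) -> list[str]:
    """Return findings against the slide's example criteria; empty means pass."""
    findings = []
    ids = [r.req_id for r in reqs]
    if len(ids) != len(set(ids)):
        findings.append("requirements are not uniquely identified")
    for r in reqs:
        if not r.text.strip():
            findings.append(f"{r.req_id}: statement is empty (not clearly stated)")
        if r.provider not in APPROVED_PROVIDERS:
            findings.append(f"{r.req_id}: '{r.provider}' is not an appropriate provider")
    return findings

reqs = [Requirement("R-1", "Deliver pizza within 25 minutes", "customer"),
        Requirement("R-1", "", "marketing intern")]  # duplicate ID, empty text
print(evaluate(reqs))
```

A real appraisal looks for evidence rather than running a script; the point is simply that "uniquely identified" and "appropriate provider" are testable properties.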
CMMI terminologies (CMMI term → ICSM equivalent)
- Process Area → Practice (e.g., Requirements Management → System and Software Requirements Dev Practice; Project Planning → Life Cycle Planning Practice)
- Specific goal → Task
- Specific practice → Step
- Subpractice → Detailed step
- Work Product → Work Product / Artifact / Output (e.g., a set of approved requirements → agreed win conditions)
© 2016 USC-CSSE
Example of a CMMI Process Area
© 2016 USC-CSSE
CMMI-DEV vs. CMMI-SVC vs. CMMI-ACQ
Shared process areas: Causal Analysis and Resolution (CAR), Configuration Management (CM), Decision Analysis and Resolution (DAR), Measurement and Analysis (MA), Organizational Process Definition (OPD), Organizational Process Focus (OPF), Organizational Performance Management (OPM), Organizational Process Performance (OPP), Organizational Training (OT), Process and Product Quality Assurance (PPQA), Requirements Management (REQM), Risk Management (RSKM), Supplier Agreement Management (SAM)
Renamed in CMMI-SVC ("project" becomes "work"): Project Planning (PP) → Work Planning (WP); Project Monitoring and Control (PMC) → Work Monitoring and Control (WMC); Integrated Project Management (IPM) → Integrated Work Management (IWM); Quantitative Project Management (QPM) → Quantitative Work Management (QWM)
CMMI-DEV specific: Product Integration (PI), Requirements Development (RD), Technical Solution (TS), Validation (VAL), Verification (VER)
CMMI-SVC specific: Capacity and Availability Management (CAM), Incident Resolution and Prevention (IRP), Service Continuity (SCON), Service Delivery (SD), Service System Development (SSD), Service System Transition (SST), Strategic Service Management (STSM)
CMMI-ACQ specific: Agreement Management (AM), Acquisition Requirements Development (ARD), Acquisition Technical Management (ATM), Acquisition Validation (AVAL), Acquisition Verification (AVER), Solicitation and Supplier Agreement Development (SSAD)
© 2016 USC-CSSE
Low Maturity Organizations vs. High Maturity Organizations
Low maturity:
- Highly dependent on current practitioners
- Improvised by practitioners and management
- Not rigorously followed
- Results difficult to predict
- Low visibility into progress and quality
- Compromise of product functionality and quality to meet schedule
- Use of new technology is risky
High maturity:
- A disciplined approach for development and management
- Defined and continuously improving
- Supported by management and others
- Well controlled
- Supported by measurement
- Basis for disciplined use of technology
- Institutionalized
© 2016 USC-CSSE [Ref: Rudge]
Process Area Information
- Purpose Statement, Introductory Notes, Related Process Areas
- Specific Goals → Specific Practices → Example Work Products, Subpractices
- Generic Goals → Generic Practices → Subpractices, Generic Practice Elaborations
© 2016 USC-CSSE
The SGs, and the number of SGs, differ from process area to process area; the GGs are the same for every process area. © 2016 USC-CSSE
Two improvement paths using levels
Capability levels (continuous representation): process improvement achievement in individual process areas. These levels are a means for incrementally improving the processes corresponding to a given process area. There are 4 capability levels, numbered 0 through 3.
Maturity levels (staged representation): process improvement achievement across multiple process areas. These levels are a means of improving the processes corresponding to a given set of process areas. There are 5 maturity levels, numbered 1 through 5.
© 2016 USC-CSSE
Using Continuous Representation
- Use when you know which processes need to be improved
- Improves the performance of a single process area (the trouble spot) or of several process areas
- Allows an organization to improve different processes at different rates
© 2016 USC-CSSE [Ref: Lazaris]
Factors in your decision
Business factors:
- Mature knowledge of its own business objectives → continuous
- Product-line focus across the entire organization → staged
Cultural factors (depend on the organization's capability):
- Process-based culture → continuous
- Little experience in process improvement → staged
Legacy: continuation from using another model
© 2016 USC-CSSE [Ref: Lazaris]
Comparison of Capability and Maturity Levels
Level | Continuous Representation (Capability Levels) | Staged Representation (Maturity Levels)
0 | Incomplete | -
1 | Performed | Initial
2 | Managed | Managed
3 | Defined | Defined
4 | - | Quantitatively Managed
5 | - | Optimizing
© 2016 USC-CSSE
To achieve a capability level, pick a process area
Capability Level 1: Performed - accomplishes the needed work to produce work products; the specific goals of the process area are satisfied.
Capability Level 2: Managed - a managed process is a performed process that is planned and executed in accordance with policy; employs skilled people having adequate resources to produce controlled outputs; involves relevant stakeholders; is monitored, controlled, and reviewed; and is evaluated for adherence to its process description.
Capability Level 3: Defined - a defined process is a managed process that is tailored from the organization's set of standard processes according to the organization's tailoring guidelines; has a maintained process description; and contributes process-related experiences to the organizational process assets.
© 2016 USC-CSSE
Capability Levels
Capability Level 0: Incomplete - the process is not performed or is only partially performed; one or more of the specific goals of the process area are not satisfied. No generic goals exist for this level, since there is no reason to institutionalize a partially performed process.
Capability Level 1: Performed - accomplishes the needed work to produce work products; the specific goals of the process area are satisfied. Although capability level 1 results in important improvements, those improvements can be lost over time if they are not institutionalized.
© 2016 USC-CSSE [Ref: CMMI]
Capability Levels (continued)
Capability Level 2: Managed - a managed process is a performed process that is planned and executed in accordance with policy; employs skilled people having adequate resources to produce controlled outputs; involves relevant stakeholders; is monitored, controlled, and reviewed; and is evaluated for adherence to its process description.
Capability Level 3: Defined - a defined process is a managed process that is tailored from the organization's set of standard processes according to the organization's tailoring guidelines; has a maintained process description; and contributes process-related experiences to the organizational process assets.
© 2016 USC-CSSE [Ref: CMMI]
CMMI Maturity Levels © 2016 USC-CSSE [Ref: Buchholtz 2003]
Categories of Process Areas
Engineering: Product Integration (PI), Requirements Development (RD), Technical Solution (TS), Validation (VAL), Verification (VER)
Process Management: Organizational Process Definition (OPD), Organizational Process Focus (OPF), Organizational Performance Management (OPM), Organizational Process Performance (OPP), Organizational Training (OT)
Project Management: Integrated Project Management (IPM), Project Monitoring and Control (PMC), Project Planning (PP), Quantitative Project Management (QPM), Requirements Management (REQM), Risk Management (RSKM), Supplier Agreement Management (SAM)
Support: Causal Analysis and Resolution (CAR), Configuration Management (CM), Decision Analysis and Resolution (DAR), Measurement and Analysis (MA), Process and Product Quality Assurance (PPQA)
© 2016 USC-CSSE
Process Areas by Maturity Level
Maturity Level 2: Project Monitoring and Control (PMC), Project Planning (PP), Requirements Management (REQM), Supplier Agreement Management (SAM) [Project Management]; Configuration Management (CM), Measurement and Analysis (MA), Process and Product Quality Assurance (PPQA) [Support]
Maturity Level 3: Product Integration (PI), Requirements Development (RD), Technical Solution (TS), Validation (VAL), Verification (VER) [Engineering]; Organizational Process Definition (OPD), Organizational Process Focus (OPF), Organizational Training (OT) [Process Management]; Integrated Project Management (IPM), Risk Management (RSKM) [Project Management]; Decision Analysis and Resolution (DAR) [Support]
Maturity Level 4: Organizational Process Performance (OPP) [Process Management]; Quantitative Project Management (QPM) [Project Management]
Maturity Level 5: Organizational Performance Management (OPM) [Process Management]; Causal Analysis and Resolution (CAR) [Support]
© 2016 USC-CSSE
CMMI Process Areas (Staged)
Level 5 Optimizing - Support: CAR (Causal Analysis and Resolution); Process Management: OPM (Organizational Performance Management)
Level 4 Quantitatively Managed - Project Management: QPM (Quantitative Project Management); Process Management: OPP (Organizational Process Performance)
Level 3 Defined - Project Management: IPM (Integrated Project Management), RSKM (Risk Management); Engineering: RD (Requirements Development), TS (Technical Solution), PI (Product Integration), VER (Verification), VAL (Validation); Support: DAR (Decision Analysis and Resolution); Process Management: OPF (Organizational Process Focus), OPD (Organizational Process Definition), OT (Organizational Training)
Level 2 Managed - Project Management: PP (Project Planning), PMC (Project Monitoring and Control), SAM (Supplier Agreement Management), REQM (Requirements Management); Support: MA (Measurement and Analysis), PPQA (Process & Product Quality Assurance), CM (Configuration Management)
Level 1 Initial - (no process areas)
© 2016 USC-CSSE [Based on Ref: Rudge]
Categories of Process Areas: Project Management
- Integrated Project Management (IPM): Advanced - 3
- Project Monitoring and Control (PMC): Basic - 2
- Project Planning (PP): Basic - 2
- Quantitative Project Management (QPM): Advanced - 4
- Requirements Management (REQM): Basic - 2
- Risk Management (RSKM): Advanced - 3
- Supplier Agreement Management (SAM): Basic - 2
© 2016 USC-CSSE
Basic Project Management Category
© 2016 USC-CSSE
Advanced Project Management Category
© 2016 USC-CSSE
Categories of Process Areas: Engineering (all at Maturity Level 3)
- Product Integration (PI)
- Requirements Development (RD)
- Technical Solution (TS)
- Validation (VAL)
- Verification (VER)
© 2016 USC-CSSE
Engineering Category © 2016 USC-CSSE
Categories of Process Areas: Support
- Causal Analysis and Resolution (CAR): Advanced - 5
- Configuration Management (CM): Basic - 2
- Decision Analysis and Resolution (DAR): Advanced - 3
- Measurement and Analysis (MA): Basic - 2
- Process and Product Quality Assurance (PPQA): Basic - 2
© 2016 USC-CSSE
Basic Support Category
© 2016 USC-CSSE
Advanced Support Category
© 2016 USC-CSSE
Categories of Process Areas: Process Management
- Organizational Process Definition (OPD): Basic - 3
- Organizational Process Focus (OPF): Basic - 3
- Organizational Performance Management (OPM): Advanced - 5
- Organizational Process Performance (OPP): Advanced - 4
- Organizational Training (OT): Basic - 3
© 2016 USC-CSSE
Basic Process Management Category
© 2016 USC-CSSE
Advanced Process Management Category
© 2016 USC-CSSE
CMMI Appraisal example
Read the process area description and assess whether your team's process complies with the CMMI requirements; if yes, state the evidence.
SP 1.1 Understand Requirements
Subpractices:
- Establish criteria for distinguishing appropriate requirements providers.
- Establish objective criteria for the evaluation and acceptance of requirements. (Lack of evaluation and acceptance criteria often results in inadequate verification, costly rework, or customer rejection.)
- Analyze requirements to ensure that established criteria are met.
- Reach an understanding of requirements with requirements providers so that project participants can commit to them.
Example Work Products: lists of criteria for distinguishing appropriate requirements providers; criteria for evaluation and acceptance of requirements; results of analyses against criteria; a set of approved requirements
SP 1.2 Obtain Commitment to Requirements
Subpractices:
- Assess the impact of requirements on existing commitments.
- Negotiate and record commitments.
Example Work Products: requirements impact assessments; documented commitments to requirements and requirements changes
© 2016 USC-CSSE
CMMI Appraisal example (continued)
SP 1.3 Manage Requirements Changes
Subpractices:
1. Document all requirements and requirements changes that are given to or generated by the project.
2. Maintain a requirements change history, including the rationale for changes. (Maintaining the change history helps to track requirements volatility.)
3. Evaluate the impact of requirement changes from the standpoint of relevant stakeholders. (Requirements changes that affect the product architecture can affect many stakeholders.)
4. Make requirements and change data available to the project.
Example Work Products: requirements change requests; requirements change impact reports; requirements status; requirements database
SP 1.4 Maintain Bidirectional Traceability of Requirements
Subpractices:
- Maintain requirements traceability to ensure that the source of lower level (i.e., derived) requirements is documented.
- Maintain requirements traceability from a requirement to its derived requirements and allocation to work products.
- Generate a requirements traceability matrix.
Example Work Products: requirements traceability matrix; requirements tracking system
© 2016 USC-CSSE
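SP 1.4's bidirectional traceability is easy to illustrate: forward links (requirement to derived requirements) and backward links (derived requirement to source) must agree in both directions. A minimal sketch with made-up requirement IDs follows; the dictionaries stand in for whatever traceability matrix or tracking system a team actually uses.

```python
# Forward trace: parent requirement -> derived requirements
forward = {
    "SR-1": ["DR-1.1", "DR-1.2"],
    "SR-2": ["DR-2.1"],
}
# Backward trace, recorded independently (e.g., by the design team)
backward = {
    "DR-1.1": "SR-1",
    "DR-1.2": "SR-1",
    "DR-2.1": "SR-2",
}

def check_bidirectional(forward, backward):
    """Flag any link present in one direction but missing in the other."""
    problems = []
    for parent, children in forward.items():
        for child in children:
            if backward.get(child) != parent:
                problems.append(f"{parent} -> {child}: no matching backward trace")
    for child, parent in backward.items():
        if child not in forward.get(parent, []):
            problems.append(f"{child} -> {parent}: no matching forward trace")
    return problems

print(check_bidirectional(forward, backward) or "traceability is bidirectional")
```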
CMMI Appraisal example (continued)
SP 1.5 Ensure Alignment Between Project Work and Requirements
Subpractices:
- Review project plans, activities, and work products for consistency with requirements and changes made to them.
- Identify the source of the inconsistency (if any).
- Identify any changes that should be made to plans and work products resulting from changes to the requirements baseline.
- Initiate any necessary corrective actions.
Example Work Products: documentation of inconsistencies between requirements and project plans and work products, including sources and conditions; corrective actions
© 2016 USC-CSSE
CMMI Appraisal result © 2016 USC-CSSE
Now - workshop - CMMI Appraisal
- Find a pair; try not to team with your own team member
- Perform a gap analysis between ICSM and the given process areas: Requirements Development, Technical Solution, Configuration Management
- Prepare for presentation
- Off-campus students: assess the Verification process area and complete the gap analysis; get the template on D2L Resources
© 2016 USC-CSSE
References
[CMMI] Software Engineering Institute's CMMI website
[CMMIRocks]
[CrossTalk 2010] CrossTalk Magazine, Jan/Feb 2010
[CSSE 2002] USC CSE Annual Research Review 2002
[Garcia 2002] Suz Garcia, "Are You Prepared for CMMI?"
[Garcia 2005] Suzanne Garcia, SEI CMU, "Why Should You Care About CMMI?"
[Garcia 2005b] Suz Garcia, "Thoughts on Applying CMMI in Small Settings"
[Rudge] T. Rudge, Thales, "CMMI®: St George or the Dragon?"
© 2016 USC-CSSE
Outline Six Sigma Lean Six Sigma ITIL © 2016 USC-CSSE
What is Six Sigma? © 2016 USC-CSSE
What is Six Sigma?
Sigma is a statistical term that measures how far a given process deviates from perfection. If you can measure how many "defects" you have in a process, you can systematically figure out how to eliminate them and get as close to "zero defects" as possible. To achieve Six Sigma, a process must not produce more than 3.4 defects per million opportunities, i.e., it must be 99.99966% perfect.
© 2016 USC-CSSE
Think about a pizza delivery
Goal: never deliver later than 25 minutes.
- If you achieve this 68% of the time, you are running at 1 sigma.
- If you achieve it 99.99966% of the time, you are at 6 sigma.
Six Sigma measures quality: it measures the variance and does not rely on the mean.
© 2016 USC-CSSE
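The sigma arithmetic can be made concrete. A minimal sketch, assuming the conventional 1.5-sigma shift used in Six Sigma tables; the function names and the delivery counts are illustrative, not from the slides.

```python
from scipy.stats import norm

def dpmo(defects: int, units: int, opportunities_per_unit: int) -> float:
    """Defects per million opportunities."""
    return defects / (units * opportunities_per_unit) * 1_000_000

def sigma_level(dpmo_value: float) -> float:
    """Short-term sigma level, using the conventional 1.5-sigma shift."""
    return norm.ppf(1 - dpmo_value / 1_000_000) + 1.5

# Pizza delivery: 50 late deliveries out of 10,000, one opportunity each
d = dpmo(defects=50, units=10_000, opportunities_per_unit=1)
print(f"DPMO = {d:.0f}, sigma level = {sigma_level(d):.2f}")   # ~4.1 sigma
print(f"3.4 DPMO -> sigma level = {sigma_level(3.4):.2f}")     # ~6.0 sigma
```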
Examples of the Sigma Scale
In a world at 3 sigma…
- There are 964 U.S. flight cancellations per day.
- The police make 7 false arrests every 4 minutes.
- In MA, 5,390 newborns are dropped each year.
- In one hour, 47,283 international long-distance calls are accidentally disconnected.
In a world at 6 sigma…
- 1 U.S. flight is cancelled every 3 weeks.
- There are fewer than 4 false arrests per month.
- 1 newborn is dropped every 4 years in MA.
- It would take more than 2 years to see the same number of dropped international calls.
© 2016 USC-CSSE
What is Six Sigma (continued)
Six Sigma improvement is accomplished through two sub-methodologies: DMAIC and DMADV. The Six Sigma DMAIC process (define, measure, analyze, improve, control) is an improvement system for existing processes falling below specification and looking for incremental improvement. The Six Sigma DMADV process (define, measure, analyze, design, verify) is an improvement system used to develop new processes or products at Six Sigma quality levels. Both Six Sigma processes are executed by Six Sigma Green Belts and Six Sigma Black Belts, and are overseen by Six Sigma Master Black Belts.
© 2016 USC-CSSE
Six Sigma DMAIC
- Define the project goals and customer (internal and external) deliverables
- Measure the process to determine current performance
- Analyze and determine the root cause(s) of the defects
- Improve the process by eliminating defects
- Control future process performance
When to use DMAIC: the DMAIC methodology, rather than DMADV, should be used when a product or process already exists at your company but is not meeting customer specifications or is not performing adequately.
© 2016 USC-CSSE
Six Sigma DMADV
- Define the project goals and customer (internal and external) deliverables
- Measure and determine customer needs and specifications
- Analyze the process options to meet the customer needs
- Design (in detail) the process to meet the customer needs
- Verify the design performance and ability to meet customer needs
When to use DMADV:
- A product or process does not exist at your company and one needs to be developed
- The existing product or process has been optimized (using DMAIC or otherwise) and still doesn't meet the level of customer specification or Six Sigma quality
© 2016 USC-CSSE
Six Sigma DMAIC Roadmap
Define Phase: Define Customers & Requirements; Develop Problem Statement, Goals and Benefits; Identify Champion, Process Owner & Team; Define Resources; Develop Project Plan & Milestones
Measure Phase: Define Defect, Opportunity, Unit & Metrics; Detailed Process Map of Appropriate Areas; Develop Data Collection Plan; Collect the Data; Begin Developing Y=f(x) Relationship
Analyze Phase: Define Performance Objectives; Identify Value/Non-Value Added Process Steps; Identify Sources of Variation; Determine Root Cause(s); Determine Vital Few x's, Y=f(x) Relationship
© 2016 USC-CSSE
Six Sigma DMAIC Roadmap (continued)
Improve Phase: Perform Design of Experiments; Develop Potential Solutions; Define Operating Tolerances of Potential System; Assess Failure Modes of Potential Solutions; Validate Potential Improvement by Pilot Studies; Correct/Re-Evaluate Potential Solution
Control Phase: Define and Validate Monitoring and Control System; Develop Standards and Procedures; Implement Statistical Process Control; Determine Process Capability; Develop Transfer Plan, Handoff to Process Owner; Verify Benefits, Cost Savings/Avoidance, Profit Growth; Close Project, Finalize Documentation; Communicate to Business, Celebrate
© 2016 USC-CSSE
A Six Sigma Case Study - Tutorial for an IT Call Center
Benchmarking: starting from data on a number of measures of customer satisfaction and of call center technical and business performance, the company compared itself to the benchmark average and to a select best-in-class group. The comparison shows that customer satisfaction with its support services was just average, or a bit below (see the following slides).
© 2016 USC-CSSE
Figure 1: Customer Satisfaction for the Company, 2001-2003
© 2016 USC-CSSE
Figure 2: Customer Satisfaction for Average Companies, 2001-2003
© 2016 USC-CSSE
Figure 3: Customer Satisfaction for Best-in-Class Companies, 2001-2003
© 2016 USC-CSSE
A Six Sigma Case Study - Tutorial for an IT Call Center
Analyzing the customer satisfaction data shows that customer satisfaction has a positive influence on new account growth.
© 2016 USC-CSSE
A Six Sigma Case Study - Tutorial for an IT Call Center
- Transfers = average number of transfers (to different agents and help systems) during a service call
- Wait Time = average wait time during a service call
- Service = average service time during the call (the time spent getting the answer to the question, problem-solving advice, etc.)
The regression model shows that the longer the wait time, the more transfers, and the longer the service time, the lower the customer satisfaction (a small sketch follows).
© 2016 USC-CSSE
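The regression behind this finding can be sketched with ordinary least squares. The data below are fabricated to exhibit the slide's pattern; they are not the case study's data.

```python
import numpy as np

# Fabricated call records: columns are transfers, wait (min), service (min)
X = np.array([
    [0, 1.0,  5.0],
    [1, 3.0,  7.0],
    [2, 6.0,  9.0],
    [0, 2.0,  4.0],
    [3, 8.0, 12.0],
])
y = np.array([8.6, 6.9, 4.8, 8.4, 2.9])  # satisfaction scores (made up)

# Ordinary least squares with an intercept column
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
b0, b_transfer, b_wait, b_service = coef
print(f"satisfaction = {b0:.2f} {b_transfer:+.2f}*transfers "
      f"{b_wait:+.2f}*wait {b_service:+.2f}*service")
# All three slopes come out negative for this data, matching the slide's
# conclusion: more transfers and longer waits/service predict lower satisfaction.
```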
A Six Sigma Case Study - Tutorial for an IT Call Center
From data gathered from industry, we found that the call center's waiting time is below the industry average, but there is still space for improvement, which would help reduce cost and increase customer satisfaction. The data also show that the call center's cost is higher than average, so this project is worth doing.
© 2016 USC-CSSE
Define Phase D1. Project Charter
Problem Statement: "Our customer satisfaction ratings are at or below average."
Business Case: "Raising new business growth from 1 percent to 4 percent (or better) would increase our gross revenues by about $3 million. If we can do this without increasing our support costs per call, we should be able to realize a net gain of at least $2 million."
Goal Statement: "Increase the call center's industry-measured customer satisfaction rating from its current level (90th percentile = 75 percent) to the target level (90th percentile = 85 percent) by end of the fourth quarter without increasing support costs."
© 2016 USC-CSSE
Define Phase D3. High Level Process Map
The process map will be helpful during the Measure phase, as the project team considers how and where to gather data that will shed light on the root cause of the issues most pertinent to the project's goals.
© 2016 USC-CSSE
Measure Phase M1. Refine the Project Y(s)
During this step the team considered exactly how the project Y(s) would be defined and measured:
- Primary Y: Customer Satisfaction. Measured (1) by the industry-standard monthly survey; (2) since the project will require additional, more frequent, case-by-case customer satisfaction data, a measurement system that tracks with the industry survey will be devised and validated.
- Secondary Y: Support Cost (per call). The staff time connected with each call (call answering and discussion, case research, callback time) will be loaded with a distribution of benefits and infrastructure costs to compute the overall support cost per call.
© 2016 USC-CSSE
Measure Phase M2. Define Performance Standards for the Y(s)
Measure | Current Baseline | Target
Primary: Customer Satisfaction (per Collins Industry Assessment) | 90th percentile / 70-80% satisfied | 90th percentile / 85% satisfied
Secondary: Support Cost per Call | 90th percentile / $40 | 90th percentile / $32
© 2016 USC-CSSE
Measure Phase M3. Identify Segmentation Factors for Data Collection Plan
- How is Y naturally segmented?
- What factors may be driving the Y(s)? Tools: Y-to-x tree, cause-and-effect diagrams, cause-and-effect matrices
© 2016 USC-CSSE
Measure Phase M4. Apply Measurement Systems Analysis (MSA)
Questions usually posed for measurement systems:
© 2016 USC-CSSE
Measure Phase M5. Collect the Data
A plan was formulated to gather data from the past year's database.
Measure Phase M6. Describe and Display Variation in Current Performance
How is the Y distributed? Variation above and below the chart's control limits suggested that there were "special causes" in play, worth understanding in more detail by the team in the Analyze phase.
© 2016 USC-CSSE
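Control limits of the kind M6 refers to are conventionally the mean of a stable baseline plus or minus three standard deviations. A minimal sketch with invented support-cost numbers (the baseline/new split and the figures are assumptions for illustration):

```python
import numpy as np

# Support cost per call: a baseline period assumed stable, then new
# observations to monitor (all numbers made up)
baseline = np.array([38.2, 41.5, 39.0, 44.8, 37.1, 40.3, 39.9, 38.4])
new_obs  = np.array([52.6, 41.0])

center = baseline.mean()
sigma = baseline.std(ddof=1)                     # sample standard deviation
ucl, lcl = center + 3 * sigma, center - 3 * sigma

print(f"control limits: [{lcl:.1f}, {ucl:.1f}] around {center:.1f}")
for i, x in enumerate(new_obs, start=1):
    if not (lcl <= x <= ucl):
        print(f"observation {i} = {x}: outside limits, possible special cause")
```

Here the 52.6 observation falls above the upper control limit, which is exactly the kind of point the team would investigate as a special cause.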
Analyze Phase A1. Measure Process Capability
Before segmenting the data and "peeling the onion" to look for root causes and drivers, the current performance is compared to the standards established in step M2 of the Measure phase.
Analyze Phase A2. Refine Improvement Goals
If the capability assessment shows a significant departure from expectations, some adjustment to the project goals may need to be considered.
© 2016 USC-CSSE
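A1's capability comparison is often summarized with the Cp and Cpk indices, which compare the specification spread to the process spread. A hedged sketch; the wait-time specification limits and the data are invented, and the formulas assume a stable, roughly normal process.

```python
import numpy as np

def capability(data, lsl, usl):
    """Cp and Cpk for an assumed-stable, roughly normal process."""
    mu, sigma = np.mean(data), np.std(data, ddof=1)
    cp = (usl - lsl) / (6 * sigma)                  # spec width vs. process width
    cpk = min(usl - mu, mu - lsl) / (3 * sigma)     # penalizes an off-center mean
    return cp, cpk

# Simulated wait times (minutes) against a hypothetical 0-10 minute spec
waits = np.random.default_rng(seed=1).normal(loc=6.0, scale=1.5, size=200)
cp, cpk = capability(waits, lsl=0.0, usl=10.0)
print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")  # Cpk < Cp: the process mean is off-center
```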
Analyze Phase A3. Identify Significant Data Segments and Patterns
By segmenting the Y data based on the factors (x's) identified during the Measure phase, the team looks for patterns that shed light on what may be causing or driving the observed Y variation.
© 2016 USC-CSSE
Analyze Phase A4. Identify (Refined/More Detailed List of) Possible x's
Collecting the findings that came out of A3, the team posed the strongest ones in the form of "why" questions:
- Why do Problems and Changes cost more than other call types?
- Why are calls processed on Mondays and Fridays more expensive?
- Why do transfer rates differ by call type (higher on Problems and Changes, lower on others)?
- Why are wait times higher on Mondays and Fridays and in Week 13 of each quarter?
© 2016 USC-CSSE
Analyze Phase A5. Identify and Verify the Critical x's
To sort out the real drivers from the "likely suspects" list built in A4, there is generally a shift from graphical analysis to statistical analysis. The figure shows the influence of callbacks on a call's wait time.
© 2016 USC-CSSE
Analyze Phase A6. Refine the Financial Benefit Forecast
Given the "short list" of the real driving x's, the financial model forecasting "how much improvement?" may need to be adjusted.
© 2016 USC-CSSE
Improve Phase I1. Identify Solution Alternatives to Address Critical x's
Consider solution alternatives from the possibilities identified earlier and decide which ones are worth pursuing further.
Driving x's (from Analyze phase) → Solution Alternatives:
- Staffing: add staff Mondays and Fridays, reduce staff on Sundays; develop a staffing model; create an on-call list to fill in for absentees
- Web Service Percentage: focus on services that can be done best on the Web; define and communicate the value proposition to customers; evaluate incentives to move traffic to the Web
- Transfers and Callbacks: improve call center processes to reduce transfers and callbacks without impacting customer satisfaction
© 2016 USC-CSSE
Improve Phase I2. Verify the Relationships Between x's and Y(s)
What are the dynamics connecting the process x's with the critical outputs? Use regression analysis to verify the relationships.
Improve Phase I3. Select and Tune the Solution
Using predicted performance and net value, decide which solution alternative is best.
© 2016 USC-CSSE
Improve Phase I4. Pilot / Implement Solution
If possible, pilot the solution to demonstrate results and to verify there are no unintended side effects:
- Preparation and deployment steps for putting the pilot solution in place
- Measures in place to track results and to detect unintended side effects
- Awareness of people issues
- Measure and compare the improvement achieved by the solution
© 2016 USC-CSSE
Control Phase C1. Develop Control Plan
The control plans addressed two views:
- Management control: focuses on the Y(s), or outcomes of the process, and often some of the x's as well
- Operational control: concerned with the x's that are predictive of the outcome Y(s); operational control information includes both controllable and "noise" variables and is provided more frequently than management control information
© 2016 USC-CSSE
Control Phase C2. Determine Improved Process Capability
Use the same measures from Define and Measure in order to provide comparability and monitor impact in a consistent way.
Control Phase C3. Implement Process Control
Create, modify, and use data collection systems and output reports or dashboards consistent with the control plan.
© 2016 USC-CSSE
Control Phase C4. Close Project
Prepare the implementation plan, transfer control to operations, conduct a project post-mortem, and archive project results.
© 2016 USC-CSSE
Outline Six Sigma Lean Six Sigma ITIL © 2016 USC-CSSE
Lean Six Sigma = Lean + Six Sigma
Six Sigma:
- Recognizes and eliminates defects and/or low profit margins
- Recognizes that variation in analyzing and measuring can hinder, or often block, the ability to deliver high quality services
- Focuses on data
- Needs a team of professionals (champion, black/green belts)
Lean:
- Focuses on maximizing products, or performing things faster, by removing the wastes
- Seven forms of waste, or "muda": defects, overproduction, overprocessing, motion, transportation, inventory, and waiting
Lean Six Sigma = Six Sigma quality + Lean speed
© 2016 USC-CSSE
Lean Six Sigma
The measurement activity of the Six Sigma DMAIC takes a long time and lots of data; Lean Six Sigma does not ignore measurement, but does only as much as necessary.
© 2016 USC-CSSE
Lean Thinking provides a sharp degree of focus on customer value and provides mechanisms for rapid improvement; Six Sigma provides the statistical control and performance prediction capability associated with stable processes.
© 2016 USC-CSSE [Ref: CrossTalk 2010]
ITIL - IT Infrastructure Library
A collection of best practices. ITIL V3 consists of 5 volumes:
- Service Strategy
- Service Design
- Service Transition
- Service Operation
- Continual Service Improvement
© 2016 USC-CSSE users.ox.ac.uk/~tony/itilv3.ppt
The Service Lifecycle
Service Strategy: strategy generation, financial management, service portfolio management, demand management
Service Design: capacity, availability, and information security management; service level and supplier management
Service Transition: transition planning and support, change management, asset and configuration management, knowledge management, release and deployment
Service Operation: problem and incident management, request fulfilment, event and access management
Continual Service Improvement: service measurement and reporting, 7-step improvement process
© 2016 USC-CSSE users.ox.ac.uk/~tony/itilv3.ppt
How the Lifecycle stages fit together
© 2016 USC-CSSE users.ox.ac.uk/~tony/itilv3.ppt
Service Strategy © 2016 USC-CSSE
Service Strategy has four activities
- Define the Market
- Develop the Offerings
- Develop Strategic Assets
- Prepare for Execution
© 2016 USC-CSSE users.ox.ac.uk/~tony/itilv3.ppt
Service Design © 2016 USC-CSSE
Service Design
- How are we going to provide it?
- How are we going to build it?
- How are we going to test it?
- How are we going to deploy it?
A holistic approach to determine the impact of introducing a change on the existing services and management processes.
© 2016 USC-CSSE users.ox.ac.uk/~tony/itilv3.ppt
Processes in Service Design
- Availability Management
- Capacity Management
- IT Service Continuity Management (ITSCM, disaster recovery)
- Supplier Management
- Service Level Management
- Information Security Management
- Service Catalogue Management
© 2016 USC-CSSE users.ox.ac.uk/~tony/itilv3.ppt
Service Catalogue
- Business Service Catalogue: maps business processes (e.g., Business Process A, B, C) to the services that support them (Service 1 through 6)
- Technical Service Catalogue: maps services to the supporting hardware, software, applications, databases, and capability
Keeps service information separate from business information, and provides accurate and consistent information, enabling service-focussed working.
© 2016 USC-CSSE users.ox.ac.uk/~tony/itilv3.ppt
Service Transition © 2016 USC-CSSE
Service Transition
- Build
- Deployment
- Testing
- User acceptance
- Bed-in (phased or big bang)
© 2016 USC-CSSE users.ox.ac.uk/~tony/itilv3.ppt
Good service transition
- Set customer expectations
- Enable release integration
- Reduce performance variation
- Document and reduce known errors
- Minimise risk
- Ensure proper use of services
Some things are excluded: swapping a failed device, adding a new user, installing standard software.
© 2016 USC-CSSE users.ox.ac.uk/~tony/itilv3.ppt
Service Operation © 2016 USC-CSSE
Service Operation: Maintenance and Management
Realises the strategic objectives; this is where the value is seen.
© 2016 USC-CSSE users.ox.ac.uk/~tony/itilv3.ppt
Processes in Service Operation
- Incident Management
- Problem Management
- Event Management
- Request Fulfilment
- Access Management
© 2016 USC-CSSE users.ox.ac.uk/~tony/itilv3.ppt
Functions in Service Operation
- Service Desk
- Technical Management
- IT Operations Management
- Applications Management
© 2016 USC-CSSE users.ox.ac.uk/~tony/itilv3.ppt
Continual Service Improvement
© 2016 USC-CSSE
Continual Service Improvement
- Focus on process owners and service owners
- Ensures that service management processes continue to support the business
- Monitor and enhance service level achievements
- Plan, Do, Check, Act
© 2016 USC-CSSE users.ox.ac.uk/~tony/itilv3.ppt
Service Measurement
What to measure:
- Technology (components, MTBF, etc.; a small sketch follows)
- Process (KPIs - critical success factors)
- Service (end-to-end, e.g., customer satisfaction)
Why measure?
- Validation: soundness of decisions
- Direction: of future activities
- Justification: provide factual evidence
- Intervention: when changes or corrections are needed
© 2016 USC-CSSE users.ox.ac.uk/~tony/itilv3.ppt
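The technology-level measures mentioned here reduce to simple ratios. A minimal sketch of MTBF and the steady-state availability it feeds into; the component numbers are illustrative, not from the slides.

```python
def mtbf(total_uptime_hours: float, failures: int) -> float:
    """Mean time between failures over an observation window."""
    return total_uptime_hours / failures

def availability(mtbf_hours: float, mttr_hours: float) -> float:
    """Steady-state availability = MTBF / (MTBF + MTTR)."""
    return mtbf_hours / (mtbf_hours + mttr_hours)

# A component that was up 2,190 hours in a quarter and failed 3 times,
# taking 2 hours on average to restore
m = mtbf(2_190, 3)                                   # 730 hours
print(f"MTBF = {m:.0f} h, availability = {availability(m, 2.0):.4f}")
```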
Managing Production Systems: Pitfalls and Warnings
"Organizations should not be over ambitious when implementing Service Management" (IT Service Management "Little ITIL").
DON'T "IMPLEMENT ITIL": ITIL is a framework of best practices, not a prescriptive manual. Use what is useful, when and where it is useful.
Remember that any IT process will only work if the participants have the right data in the right place at the right time: the Configuration Management Database (CMDB).
Remember that the objective is to document and implement repeatable processes in order to make the organization more efficient and/or more responsive to customers. ITIL should never be seen as an end in itself.
Fact One: the books themselves are meant to guide; they are not steadfast laws. None of the authors, contributors, or publishers ever expected organizations to adhere to every word of their publications.
Fact Two: you can't implement ITIL with just processes and technology; you must address the people involved as well. ITIL demands attention to three components: people, process, and technology. Changes in process improve efficiency and effectiveness. Changes in technology reduce costs and accelerate responsiveness. But you ultimately have to change people to develop the culture you need to better support the business and optimize availability of critical IT services. (Brian Johnson)
© 2016 USC-CSSE (c) Dennis Adams Associates Ltd, 2005
CMMI vs ITIL
- Origins: CMMI - CMU (SEI); ITIL - the United Kingdom's Office of Government Commerce
- Scope: CMMI - a maturity model of best practices applied in the development of software; ITIL - codes of best practice for controlling and managing all aspects of IT-related operations
- Application: CMMI - focused on software development, maintenance, and product integration; ITIL - broader in scope, providing a framework for IT service management and operations, including a hardware life cycle
- Structure: CMMI - not a process but a description of effective process characteristics; ITIL - provides solutions on how to undertake each process area (e.g., how to do requirements management)
© 2016 USC-CSSE
Further Readings
ITIL: users.ox.ac.uk/~tony/itilv3.ppt
Six Sigma
© 2016 USC-CSSE
SMC S-012 Standard for Software Development
Defines the government's requirements and expectations for contractor performance in defense system acquisitions and technology development.
(SMC: Space and Missile Systems Center)
© 2016 USC-CSSE
SMC-S012 outline © 2016 USC-CSSE
Back up Slides © 2016 USC-CSSE
When Project Planning isn't done well…
What you'll see:
- Poor estimates that lead to cost and schedule overruns
- Inability to discover deviations from undocumented plans
- Resources not available/applied when needed
- Inability to meet commitments
Why should you care? Because…
- Customers don't trust suppliers who waste their resources: loss of future business
- No lessons learned for future projects means making the same mistakes on multiple projects
- Unhappy customers, employees, and stockholders mean a short life for the business
- If you fail to plan, then you plan to fail!
© 2016 USC-CSSE [Ref: Garcia 2005]
When Project Monitoring and Control isn't done well…
What you'll see:
- Crisis management
- High rework levels throughout the project
- Lots of time spent in meetings trying to "discover" project status rather than reporting on it
- Data needed for management decisions unavailable when needed
- Actions that should have been taken early on aren't discovered until it's too late
Why should you care? Because…
- If you don't know what's going on, corrective action can't be taken early, when it's least expensive
- Lack of management insight/oversight makes project results highly unpredictable, even later in the project
- If your confidence in the status you give to your customer is low, they probably perceive that!
© 2016 USC-CSSE [Ref: Garcia 2005]
When Requirements Management isn't done well
What you'll see:
- High levels of rework throughout the project
- Requirements accepted by staff from any source they deem to be authoritative
- "Galloping" requirements creep
- Inability to "prove" that the product meets the approved requirements
Why should you care? Because…
- Lack of agreement among stakeholders as to what the "real" requirements are increases the time and cost to complete the project
- You're highly likely to deliver an incorrect or incomplete product
- Revisiting requirements changes over and over is a waste of resources that is highly visible to the customer
© 2016 USC-CSSE [Ref: Garcia 2005]
© 2016 USC-CSSE [Ref: Garcia 2005]
Top 8 new concepts in CMMI 1.3
8. Organizational-level contracts: mention of preferred suppliers in SAM
7. Prioritized customer requirements: prioritized customer requirements in RD
6. Lifecycle needs and standards: acknowledging standards, e.g., ISO, in OPD
5. Customer satisfaction: emphasizes the importance of customer satisfaction
© 2016 USC-CSSE [Ref: CMMIRocks]
Top 8 new concepts in CMMI 1.3 (continued)
4. Causal analysis at low levels of maturity: explicit encouragement of using causal analysis; new: QPM SP 2.3, Perform Root Cause Analysis
3. Teaming concepts: IPPD (Integrated Process and Product Development) is gone; "teams" are no longer an optional addition; new: IPM SP 1.6, Establish and maintain teams
© 2016 USC-CSSE [Ref: CMMIRocks]
Top 8 new concepts in CMMI 1.3 (continued)
Modernized development practices: adding concepts of LOS, product line, release increments, architecture-centric development, technology maturation
Glossary updates: informative material updates in all three constellations (especially in RD, REQM, VAL, and VER) to bring more balance to functional vs. non-functional requirements (e.g., quality attributes)
© 2016 USC-CSSE [Ref: CMMIRocks]
Top 8 new concepts in CMMI 1.3 (continued)
Agile interpretive guidance: helps those who use agile methods to interpret CMMI. Introductory notes about agile were added to the following process areas: CM, PI, PMC, PP, PPQA, RD, REQM, RSKM, TS, and VER. Example: "In agile environments... Teams plan, monitor, and adjust plans in each iteration as often as it takes (e.g., daily). Commitments to plans are demonstrated when tasks are assigned and accepted during iteration planning, user stories are elaborated or estimated, and iterations are populated with tasks from a maintained backlog of work."
© 2016 USC-CSSE [Ref: CMMIRocks]
CMMI vs ITIL © 2016 USC-CSSE