Week 3 – Assessing Risk
Risk Analysis Process
- Technical & systematic process
- Examine events
- Focus on causes, not symptoms
- Determine interrelationships
- Document impact in terms of probability & consequence
Analysis Phase
- May actually start during the identification process
  - Availability of experts
  - As a natural by-product of interviewing, etc.
- Define likelihood or probability ratings
- Define ratings for severity of consequence
- Establish assessment matrix
RM Execution Phases
Assessment Phase
- Primary objective: identify & analyze program risks to control the most critical risks
- Identify factors that contribute most to achieving desired results
- Identify factors to use in setting cost, schedule, and performance objectives
- Problem definition stage
Assessment Process
- Basis of most RM actions
- Quality of the assessment determines the effectiveness of the RM program
- Tools are available, but no one tool has all the answers
- Definitizes probability & consequence of potential events
Identify Risk Drivers
- Compile potential risk events
- Describe in detail to understand significance & causes
- Risk drivers are events that have a significant impact on the program:
  - Adverse consequence
  - Significant opportunity
Address Root Sources of Risk
- Assess processes against acceptable best practices
- Consider cost, schedule & technical impacts
- Use Willoughby templates in DoD 4245.7
  - Templates describe an area of risk
  - Specify technical methods for risk reduction
  - Correlate with acquisition phases & milestones
- Primarily applicable during development
Willoughby Templates
Risks in the Acquisition Process
- Templates address risks by common DoD program elements
  - Discussion of risk
  - Outline for reducing risk
- Relate to the program phase timeline
- Industrial process for design, test & production of low-risk products
Process-Oriented Assessment
- Examine program-critical technical processes
- Evaluate program baselines against current & historical data:
  - Critical paths
  - Process constraints
  - Critical inputs from outside the program
Process Metrics
- Track the process of developing, building & introducing the system:
  - Meeting established milestones
  - Variances from baselines
  - Earned value
  - Parametric comparisons
- Details of critical-path and constrained process items
- Dependencies beyond program scope
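The earned-value metrics named above reduce to a few simple ratios. A minimal sketch with illustrative numbers (PV, EV, and AC would come from the program's baselines and cost reports, not from this course):

```python
# Earned-value process metrics: a minimal sketch with illustrative numbers.
def earned_value_indices(pv, ev, ac):
    """Return (cost variance, schedule variance, CPI, SPI)."""
    cv = ev - ac        # cost variance: negative means over budget
    sv = ev - pv        # schedule variance: negative means behind plan
    cpi = ev / ac       # cost performance index (< 1.0 is unfavorable)
    spi = ev / pv       # schedule performance index (< 1.0 is unfavorable)
    return cv, sv, cpi, spi

cv, sv, cpi, spi = earned_value_indices(pv=100.0, ev=90.0, ac=120.0)
print(f"CV={cv}, SV={sv}, CPI={cpi:.2f}, SPI={spi:.2f}")
# prints: CV=-30.0, SV=-10.0, CPI=0.75, SPI=0.90
```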
Product-Focused Assessment
- Address risk by program output
- Utilize WBS breakout
- Constraints of the master schedule
- Interfaces with other programs/products
- Use independently or in conjunction with process-analysis techniques
Product Metrics
- Track development of the product
- Measures of effectiveness & performance
- Progress in meeting requirements
- Test & analysis results
- Ability to produce & deliver
- Availability of resources
- Comparison to past experience
Areas for Assessment
Cost Assessment
- Use probability distributions: define distribution & range by WBS element
- Use Monte Carlo simulation to assess & aggregate
- Utilize expert opinion
- Address performance and schedule causes of cost risk
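The Monte Carlo aggregation step can be sketched as below; the WBS elements and their (low, most likely, high) cost ranges are hypothetical, standing in for distributions elicited from experts or historical data:

```python
import random

# Monte Carlo aggregation of cost risk across WBS elements (sketch).
wbs = {
    "air vehicle": (80.0, 100.0, 140.0),
    "propulsion":  (40.0,  50.0,  65.0),
    "avionics":    (25.0,  30.0,  45.0),
}

def simulate_total_cost(elements, trials=10_000, seed=1):
    rng = random.Random(seed)  # fixed seed for repeatability
    totals = [
        # random.triangular takes (low, high, mode)
        sum(rng.triangular(lo, hi, mode) for lo, mode, hi in elements.values())
        for _ in range(trials)
    ]
    totals.sort()
    return totals

totals = simulate_total_cost(wbs)
mean = sum(totals) / len(totals)
p80 = totals[int(0.8 * len(totals))]  # 80th-percentile total cost
print(f"mean total ~ {mean:.1f}, 80th percentile ~ {p80:.1f}")
```

Reporting a percentile (here the 80th) rather than a single point estimate is what lets the assessment express cost risk instead of hiding it.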
Ranges & Distributions
Distributions
Statistics
Schedule Assessment
- Extension of the Critical Path Method
- Define duration ranges & distributions for scheduled activities
- Use analytical techniques to identify schedule drivers
- Address external schedule impacts
- Assess probability & magnitude of an overrun
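One common analytical technique is a PERT-style three-point estimate along the critical path. In this sketch the activities and deadline are hypothetical, and approximating the path total as normally distributed is an assumption:

```python
import math
from statistics import NormalDist

# PERT-style schedule risk along the critical path (sketch).
# Each activity: (optimistic, most likely, pessimistic) duration in weeks.
path = [(4, 6, 10), (8, 10, 16), (2, 3, 5)]

te = sum((a + 4 * m + b) / 6 for a, m, b in path)    # expected path duration
var = sum(((b - a) / 6) ** 2 for a, m, b in path)    # sum of activity variances
sigma = math.sqrt(var)

deadline = 22.0
p_overrun = 1 - NormalDist(te, sigma).cdf(deadline)
print(f"expected ~ {te:.1f} wk, P(overrun {deadline:g} wk) ~ {p_overrun:.2f}")
```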
Schedule Assessment
Cost-Schedule Containment Chart
Modeling & Simulation
- Model: a physical, mathematical, or logical representation of a system or process
- Simulation: the implementation of a model over time
- Use data or expert opinion to select a PDF (probability density function)
- Preferred for assessing cost or schedule risk
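When expert opinion supplies only a (low, most likely, high) estimate, a triangular PDF is a common choice. The sketch below uses the standard closed-form triangular CDF; the task numbers are illustrative:

```python
# Closed-form CDF of a triangular distribution defined by expert-opinion
# (low, most likely, high) values.
def triangular_cdf(x, lo, mode, hi):
    if x <= lo:
        return 0.0
    if x >= hi:
        return 1.0
    if x <= mode:
        return (x - lo) ** 2 / ((hi - lo) * (mode - lo))
    return 1.0 - (hi - x) ** 2 / ((hi - lo) * (hi - mode))

# Probability that a task estimated at (10, 12, 20) months exceeds 15 months:
p_exceed = 1.0 - triangular_cdf(15.0, 10.0, 12.0, 20.0)
print(f"P(duration > 15) = {p_exceed:.4f}")  # prints: P(duration > 15) = 0.3125
```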
Which Distribution to Use?
More Modeling & Simulation
- As virtual prototyping:
  - Replica of a system flow
  - Duplication of a physical product
  - Representation of a process flow
- May be the only way to verify or validate a design or process, or to assess risk
Before Using Any Model
- Verify: it functions as designed
- Validate: it represents the system it models
- Accredit: it is acceptable for the specific purpose
Exercise
By study group, identify how you would verify, validate, and accredit these simulation methods:
- Group 1 – Virtual simulation: physical & electrical system representation (ex: built-in training)
- Group 2 – Constructive simulation: represents the system & its usage (ex: mock-up)
- Group 3 – Live simulation: uses real operators & equipment (ex: operational tests)
You have __ minutes for this exercise. Be prepared to discuss your results.
Establish Rating Criteria
- From empirical data if possible
- Otherwise, define rigorous qualitative measures:
  - Significance based on expert opinion
  - Polling program & industry experts
  - Variance from best practices
  - Accepted rating definitions
Best Practices Example
- Low: little or no anticipated impact; normal management attention should control at an acceptable level
- Medium: may cause some impact; special action & attention may be required
- High: likely to cause significant impact; significant additional action & attention would be required
Concurrency Impact
- Overlap between program phases, from combining phases/activities
- Schedule adequacy
- Assess with best practices or historical data
- Overlap in DT&E, production
Developing Measurement Scales
- Qualitative analysis
  - Ordinal scales
  - Define a relative relationship
- Quantitative analysis
  - Numerical methods
  - Calibrated ordinal or cardinal scales
  - May be linear or nonlinear
Qualitative Scales
- Levels defined by experts
- Criteria used are coordinated with the PM; early definition avoids bias
- Reflect a relative relationship between risk/consequence levels
- Absolute value on the scale is not known
- Not valid for mathematical manipulation
Ordinal Scales
- Generally reflect ranked data
- Difference between scale values is unknown and not necessarily constant
- Misleading if the scale is defined numerically
- Mathematical operations are at best meaningless, at worst misleading
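A small illustration of why arithmetic on ordinal ratings misleads; the rating-to-value calibrations below are invented for the example:

```python
# Two risks with ordinal (probability, consequence) ratings on a 1-5 scale.
risk_a = (5, 1)  # near-certain, minor consequence
risk_b = (1, 5)  # very unlikely, severe consequence

# A naive "risk value" from multiplying ordinal ranks calls them equal:
assert risk_a[0] * risk_a[1] == risk_b[0] * risk_b[1] == 5

# But with invented (nonlinear) calibrations behind the same ratings...
prob = {1: 0.01, 2: 0.1, 3: 0.3, 4: 0.6, 5: 0.9}    # rating -> probability
cost = {1: 0.05, 2: 0.5, 3: 2.0, 4: 10.0, 5: 50.0}  # rating -> $M consequence
expected_a = prob[5] * cost[1]  # 0.9  * 0.05 = 0.045 $M
expected_b = prob[1] * cost[5]  # 0.01 * 50.0 = 0.5   $M
# ...the real exposures differ by more than a factor of ten.
print(expected_a, expected_b)
```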
Quantitative Scales
- Reflect a measurable relationship between risk/consequence levels
- Cardinal or validated ordinal scales
- Valid for mathematical manipulation
- Tendency to use for calculating a "risk value"
- Based on empirical data or simulation & decision-analysis results
Qualitative vs. Quantitative
- The choice depends on several factors:
  - Information available
  - Nature & phase of the program
  - Available resources: personnel, schedule, budget
  - Availability of experts
- Generally qualitative at first, then quantitative as needed or feasible
Probability Ratings
- Use empirical data if available
- Otherwise, use expert opinion, etc.
- Important to know if scales are ordinal
Consequence Ratings
- Define for technical, schedule, and cost
Assessment Matrix
- Define overall risk ratings
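A sketch of how a matrix might map (probability, consequence) ratings to an overall rating. The thresholds here are illustrative, not an official matrix; a real program defines every cell explicitly:

```python
# Map ordinal (probability, consequence) ratings, each 1 (lowest) to
# 5 (highest), to an overall risk rating. Thresholds are illustrative.
def overall_risk(prob, cons):
    score = prob + cons
    if score >= 8 or (cons == 5 and prob >= 2):
        return "high"
    if score >= 5:
        return "moderate"
    return "low"

assert overall_risk(1, 1) == "low"
assert overall_risk(3, 3) == "moderate"
assert overall_risk(4, 5) == "high"
```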
Prioritization
Multi-Voting Technique
- Each team member receives votes equal to half the number of risks
- Team members vote for the risk items they think have the highest priority
- Risks are ranked according to the votes
- Benefits vs. biases?
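The multi-voting tally can be sketched as below; the risk names and ballots are made up for illustration:

```python
from collections import Counter

# Multi-voting sketch: each member gets votes equal to half the risk count.
risks = ["R1", "R2", "R3", "R4", "R5", "R6"]
votes_per_member = len(risks) // 2  # 3 votes each

ballots = [
    ["R1", "R2", "R5"],  # member 1
    ["R2", "R5", "R6"],  # member 2
    ["R2", "R3", "R5"],  # member 3
]
assert all(len(b) == votes_per_member for b in ballots)

tally = Counter(vote for ballot in ballots for vote in ballot)
ranked = [risk for risk, _ in tally.most_common()]
print(ranked)  # R2 and R5 lead with 3 votes each
```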
Borda Ranking Method
- Ranks risks by criticality based on identified criteria:
  - Rank by impact of consequence
  - Rank by probability of occurrence
- Borda count used to rank risks by criticality
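A minimal Borda-count sketch, assuming two criteria (consequence and probability) and hypothetical ranks with 1 = most severe:

```python
# Each risk is ranked separately on consequence and on probability;
# the Borda count is sum over criteria of (N - rank), higher = more critical.
ranks = {          # (consequence rank, probability rank), 1 = most severe
    "R1": (1, 3),
    "R2": (2, 1),
    "R3": (3, 2),
    "R4": (4, 4),
}
N = len(ranks)
borda = {risk: sum(N - r for r in pair) for risk, pair in ranks.items()}
ordered = sorted(borda, key=borda.get, reverse=True)
print(borda)    # {'R1': 4, 'R2': 5, 'R3': 3, 'R4': 0}
print(ordered)  # 'R2' ranks most critical
```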
Impact Frequencies
Rank Consequence, Probability
Determine Borda Count
Borda v. Matrix Rank
Assessment Documentation
- Goal: communicate to the customer, program management, and the team
- Define aggregation criteria:
  - Voting method
  - Summary level
  - Process, e.g., WBS breakout
  - Area of risk: cost, schedule, performance
  - By criticality
Impact & Rating Criteria
Risk Aggregation
Summarize Rank Frequency
Order Risks by Borda Count
Aggregating Results
- Define the reporting format that communicates best
  - Group by phase, product, WBS, …
  - Order by color, Borda count, …
- Information provided:
  - Description, relationship to requirements
  - Action required
  - Risk owner, …
Aggregation Results
Common Failures in the RM Process
- Definition phase too focused on activities; need detail on motives, timelines, resources
- Unclear relationships or motives
  - Between organizations, analysis methods, models
  - In identifying sources of uncertainty, risk, consequences
- Failure to address commonality between issues (links, interdependencies)
- Document the RM process flow to clarify
Next Time: Risk Handling, Monitoring
- Read: Risk Management Guide, section 5.7
Project – Part II
Submit a paper copy of results with your final exam.
1. Explain how to map risks on your class project into the integrated master schedule you prepared for the mid-term exam, and either (a) provide a cost-schedule containment chart with a description of the steps you took and the intermediate calculations you made to construct it, or (b) use MS Project (or a similar tool) to calculate optimistic, expected, and pessimistic project cost and schedule estimates. Modifications to your mid-term schedule and task loadings are allowed. (12 pts)
2. Provide a comprehensive trade study for your class project that uses program performance measures (technical, cost, and schedule) as decision criteria. Explain how risk considerations were included in your trade study. (12 pts)
3. Provide a risk management process flowchart, suitable for presentation, tailored to the projected life cycle of your class project. (12 pts)
4. Provide briefing charts (in addition to #3) suitable for orientation and training of project personnel on the procedures by which risk management would be conducted on your project. (14 pts)
Mid-term
- Closed book, closed notes
- Turn in Part I of your project with your exam paper
- You have 90 minutes for the exam
Any questions?