Week 5 – Special Topics
Risk Management Best Practices
Traps, Alarms & Escapes (from Navy Best Practices)
- Traps – thinking the risks are covered because procedures are being followed
- Alarms – assumptions that cause trouble
- Consequences – what happens if nothing is done
Risk Checklist
Risk Management Training
Why RM Training is Needed
- Weak backgrounds in risk management
- Mistaken concepts:
  - Assessment alone is RM – planning & monitoring get skipped
  - Mitigation is the only handling strategy
  - All risk can be eliminated
  - Focus on performance only, omitting cost & schedule
Tailoring RM Training
- Adjust to the different project team groups:
  - Senior management
  - Working-level engineers
- Address the issues each group is likely to face
- Address tailoring RM activities to meet program needs – not one-size-fits-all
Software Risk Management
SW Risk as a Special Case
- Virtual – can't physically touch or feel it to assess
- History of latent defects
- Many interfaces:
  - Hardware-to-software
  - Operating system-to-operating code
  - Incompatible combinations
- Update frequency:
  - Multiple versions
  - Upward/downward compatibility
Taxonomy of SW Risk
Boehm’s Top 10 Software Risks
Identification Strategies
- Review schedule & networks
- Review cost estimation parameters
- Perform interviews
- Revisit lessons learned
- Develop a risk taxonomy
- Brainstorm and play "what if"
- Re-sort the watch list (e.g., by source)
Identification – Schedule Risks
- Use the planning or baseline schedule
- Evaluate the activity network view
- Look for nodes with:
  - High fan-in (many activities terminate at a single node)
  - High fan-out (many activities emanate from a single node)
  - No predecessors
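As a rough illustration of the fan-in/fan-out screen (a sketch, not a feature of any particular scheduling tool), the Python below scans a made-up activity network, given as predecessor lists, and flags nodes with high fan-in, high fan-out, or no predecessors. The network and the threshold of 3 are assumptions.

```python
# Sketch: screen an activity network for schedule-risk "hot spots".
# The network and the threshold of 3 are illustrative assumptions.
from collections import defaultdict

# activity -> list of predecessor activities (hypothetical project)
predecessors = {
    "A": [], "B": ["A"], "C": ["A"], "D": ["A"],
    "E": ["B", "C", "D"], "F": ["E"], "G": [],
}

fan_in = {act: len(preds) for act, preds in predecessors.items()}
fan_out = defaultdict(int)
for act, preds in predecessors.items():
    for p in preds:
        fan_out[p] += 1

THRESHOLD = 3  # assumed cutoff for "many" activities
for act in predecessors:
    flags = []
    if fan_in[act] >= THRESHOLD:
        flags.append(f"high fan-in ({fan_in[act]})")
    if fan_out[act] >= THRESHOLD:
        flags.append(f"high fan-out ({fan_out[act]})")
    if not predecessors[act]:
        flags.append("no predecessors")
    if flags:
        print(f"{act}: {', '.join(flags)}")
```

Flagged nodes are candidates for the watch list, not automatic risks; they simply concentrate schedule dependencies.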
Identification – Cost Risk Drivers
- Consider specific areas of concern that can lead to problems:
  - Personnel experience and availability
  - Requirement complexity and firmness
  - Scheduling and prediction of task and partition times
  - Hardware requirements, interfaces, constraints
ISO Risk Probability Table
ISO Risk Consequence Table
ISO Risk Contour
SW Risk Handling
- Avoidance – de-scoping objectives
- Assumption – accepting latent defects
- Control – user acceptance testing
- Transfer – from software to firmware or hardware
SW Metrics
Commercial vs. DoD/NASA Perspective on Risk Management
Commercial vs. Gov't Perspective
- Different market conditions
- Different best practices
- Different likelihoods for similar issues
- As always, tailor RM to program needs
Market Differences – How is risk impacted?
SW Development Best Practices – How is risk impacted?
Risk Category Likelihood
Overview of Risk Management Tools
Cautions in Tool Selection
- A good tool for one organization may not be a good match for another
- Tailor RM to the program's needs
- The tool should never dictate the process:
  - Define the process, then choose a compatible tool
- Be compatible with the program culture
Effective Use of a Tool
- RM is more than using an RM tool
- The tool must integrate efficiently & effectively into the program:
  - Resources required
  - Level of detail and complexity
  - Focus of the tool – e.g., program phase
RM Database Considerations
- Provide sufficient configuration control
- Accessible to all team members
- Ability to accept anonymous comments
- Support program needs:
  - Reporting
  - Monitoring
  - Captures lessons learned
  - Fulfills contractual requirements
- Balance costs against value
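For illustration only, here is one way a simple risk-register record could be structured to support the considerations above. Every field name is an assumption, not the schema of Risk Radar, TRIMS, or any other tool covered in this lecture.

```python
# Sketch of a risk-register record; every field name here is an assumption,
# not the schema of any specific RM database or tool.
from dataclasses import dataclass, field
from datetime import date

@dataclass
class RiskRecord:
    risk_id: str                  # unique key, supports configuration control
    title: str
    owner: str
    probability: float            # 0.0 - 1.0
    consequence: str              # e.g. "Low" / "Medium" / "High"
    status: str = "Open"          # Open / Mitigating / Closed
    revision: int = 1             # bumped on every approved change
    last_updated: date = field(default_factory=date.today)
    mitigation_plan: str = ""
    lessons_learned: str = ""     # captured at closure for reuse
    comments: list[str] = field(default_factory=list)

    def add_comment(self, text: str, author: str | None = None) -> None:
        """Accept a comment; author is optional so input can stay anonymous."""
        self.comments.append(f"{author or 'anonymous'}: {text}")
```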
Tools Comparison: @Risk & Crystal Ball
- Licensed software – Monte Carlo simulation add-ins for Excel
- Select the desired distribution function & define its parameters, or provide data and generate a plausible distribution function
- Provides statistics and graphical output
- User provides the risk analysis structure
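To show the kind of analysis these add-ins automate, here is a minimal Monte Carlo cost-risk sketch in plain Python; it does not use @Risk's or Crystal Ball's actual interfaces, and the cost elements and their triangular distribution parameters are assumptions.

```python
# Minimal Monte Carlo cost-risk sketch; the cost elements and their
# triangular (low, most likely, high) parameters are illustrative assumptions.
import random
import statistics

cost_elements = {              # $K: (low, most likely, high)
    "software": (200, 300, 550),
    "hardware": (100, 120, 180),
    "integration": (50, 90, 200),
}

N = 10_000
totals = []
for _ in range(N):
    total = sum(random.triangular(low, high, likely)
                for low, likely, high in cost_elements.values())
    totals.append(total)

totals.sort()
print(f"mean = {statistics.mean(totals):,.0f} $K")
print(f"p50  = {totals[int(0.50 * N)]:,.0f} $K")
print(f"p80  = {totals[int(0.80 * N)]:,.0f} $K")  # a common risk-adjusted target
```

In the commercial add-ins the same idea is driven from spreadsheet cells and built-in distribution pickers rather than code.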
Probability-Consequence Screening
- Developed by the Air Force
- Risk events are assigned a probability & a consequence for performance, schedule & cost
- Position in the consequence screening matrix determines the risk score
- User assigns Hi, Med, Low ranges
- Generates reports and graphical output
- www.pmcop.dau.mil
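A minimal sketch of the matrix-lookup idea: a risk's probability and consequence ratings each map into a Low/Med/High band, and the matrix cell gives the overall risk level. The band cutoffs and cell values below are assumptions, not the actual Air Force model.

```python
# Sketch: probability-consequence screening by matrix lookup.
# Band boundaries and cell values are assumed for illustration.
def band(value: float) -> str:
    """Map a 0-1 rating to Low / Med / High (assumed cutoffs)."""
    if value < 0.3:
        return "Low"
    if value < 0.7:
        return "Med"
    return "High"

# rows = probability band, columns = consequence band
MATRIX = {
    ("Low",  "Low"): "Low",  ("Low",  "Med"): "Low",  ("Low",  "High"): "Med",
    ("Med",  "Low"): "Low",  ("Med",  "Med"): "Med",  ("Med",  "High"): "High",
    ("High", "Low"): "Med",  ("High", "Med"): "High", ("High", "High"): "High",
}

def screen(probability: float, consequence: float) -> str:
    return MATRIX[(band(probability), band(consequence))]

print(screen(0.8, 0.5))   # -> High
```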
Probability-Consequence Screening
Risk Matrix
- Excel-based model
- Collects inputs in watch list format
- Uses a best-practices (ordinal) breakout for probability & consequence
- Orders events by Borda rank & assigns a risk level
- Generates action plan reports and graphical output
- www.mitre.org
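The Borda-ranking step can be sketched as follows; the risk events, their ordinal ratings, and the scoring details are illustrative assumptions, not MITRE's implementation.

```python
# Sketch of Borda ranking over probability and consequence.
# The risk events and their ordinal ratings (1 = lowest, 5 = highest)
# are illustrative assumptions.
risks = {
    "Requirements churn":    {"prob": 4, "cons": 5},
    "Key staff turnover":    {"prob": 3, "cons": 4},
    "COTS driver defects":   {"prob": 5, "cons": 2},
    "Test lab availability": {"prob": 2, "cons": 3},
}

N = len(risks)
criteria = ("prob", "cons")

def rank_under(criterion: str) -> dict[str, int]:
    """Rank risks under one criterion: rank 1 = most severe rating."""
    ordered = sorted(risks, key=lambda r: risks[r][criterion], reverse=True)
    return {name: pos + 1 for pos, name in enumerate(ordered)}

ranks = {c: rank_under(c) for c in criteria}

# Borda count: sum of (N - rank) across criteria; higher = riskier overall.
borda = {name: sum(N - ranks[c][name] for c in criteria) for name in risks}

for name in sorted(borda, key=borda.get, reverse=True):
    print(f"{name:22s} Borda = {borda[name]}")
```

Because the ratings are ordinal, the Borda count only orders the events; it does not claim one risk is "twice as bad" as another.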
Risk Matrix
Risk Radar
- Access-based, licensed software
- Can establish standard values for risk categorization
- Manual or automatic risk prioritization
- Complies with ISO, SEI CMMI & Government standards
- Generates detailed, summary & metrics reports
- Demo available: www.iceincusa.com/products_tools.htm
Risk Radar
TRIMS – Technical Risk ID & Mitigation
- Knowledge-based system
- Uses SEI & Navy Best Practices to collect data on past experiences
- Measures technical risk rather than cost & schedule
- Most applicable to design efforts
- Categories, templates & questions can be tailored
- Generates status, next-action & overdue-action reports
- www.bmpcoe.org
TRIMS Technical Risk ID & Mitigation
DSM – Design Structure Matrix
- Knowledge- & simulation-based tool
- Assesses the complexity of dependency relationships between project tasks
- Measures risk in terms of schedule impact
- Most applicable to design efforts
- Ongoing development effort at MIT
- Generates suggested sub-team groupings & probability curves for task duration ranges
- www.dsmweb.org
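A toy example of the DSM idea (unrelated to the MIT tool itself): tasks are listed in their intended order, a mark in row i, column j means task i needs information from task j, and marks above the diagonal indicate feedback, i.e., iteration and schedule risk.

```python
# Toy design structure matrix: tasks in intended execution order,
# dsm[i][j] = 1 means task i needs information from task j.
# All task names and dependencies are illustrative assumptions.
tasks = ["Requirements", "Architecture", "Detailed design", "Test planning"]

dsm = [
    # Req  Arch  Design  Test
    [  0,    0,    0,     1 ],   # Requirements uses feedback from Test planning
    [  1,    0,    0,     0 ],   # Architecture depends on Requirements
    [  1,    1,    0,     0 ],   # Detailed design depends on both
    [  0,    1,    1,     0 ],   # Test planning depends on Arch & Design
]

for i, row in enumerate(dsm):
    for j, needs in enumerate(row):
        if needs and j > i:   # mark above the diagonal => feedback loop
            print(f"Feedback: '{tasks[i]}' depends on later task '{tasks[j]}' "
                  f"-> iteration/rework risk")
```

Re-sequencing or grouping tasks to pull such marks below the diagonal is the basic lever DSM-based tools use to reduce schedule risk.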
DSM – Design Structure Matrix
Final Exam
- Closed book, closed notes
- You have 90 minutes for the exam
- Any questions?
- Turn in Part II of your project according to the schedule discussed last week