Week 5 – Special Topics
Risk Management Best Practices
Traps, Alarms & Escapes (from Navy Best Practices)
- Traps – thinking risks are covered simply by following procedures
- Alarms – assumptions that cause trouble
- Consequences – what happens if nothing is done
Risk Checklist
Risk Management Training
Why RM Training Is Needed
- Weak backgrounds in risk management
- Mistaken concepts:
  - Assessment is RM – planning & monitoring get skipped
  - Mitigation is the only handling strategy
  - All risk can be eliminated
  - Focus on performance, omitting cost & schedule
Tailoring RM Training
- Adjust to different project team groups:
  - Senior management
  - Working-level engineers
- Address the issues each group is likely to face
- Address tailoring RM activities to meet program needs – not one-size-fits-all
Software Risk Management
SW Risk as a Special Case
- Virtual – can't physically touch/feel it to assess
- History of latent defects
- Many interfaces:
  - Hardware-to-software
  - Operating system-to-operating code
  - Incompatible combinations
- Update frequency:
  - Multiple versions
  - Upward/downward compatibility
Taxonomy of SW Risk
Boehm’s Top 10 Software Risks
Identification Strategies
- Review schedules & activity networks
- Review cost estimation parameters
- Perform interviews
- Revisit lessons learned
- Develop a risk taxonomy
- Brainstorm and play "what if"
- Re-sort the watch list (e.g., by source)
Identification – Schedule Risks
- Use the planning or baseline schedule
- Evaluate the activity network view
- Look for nodes with (see the sketch below):
  - High fan-in (many activities terminate at a single node)
  - High fan-out (many activities emanate from a single node)
  - No predecessors
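A minimal Python sketch of how such nodes could be flagged from an activity network given as predecessor/successor pairs. The edge list and the "high" fan-in/fan-out threshold are illustrative assumptions, not course data.

```python
from collections import defaultdict

# Hypothetical activity network: (predecessor, successor) pairs
edges = [
    ("A", "D"), ("B", "D"), ("C", "D"),   # D has high fan-in
    ("D", "E"), ("D", "F"), ("D", "G"),   # D also has high fan-out
    ("H", "I"),
]

fan_in = defaultdict(int)
fan_out = defaultdict(int)
nodes = set()
for pred, succ in edges:
    fan_out[pred] += 1
    fan_in[succ] += 1
    nodes.update((pred, succ))

THRESHOLD = 3  # assumed cut-off for "high" fan-in / fan-out
for n in sorted(nodes):
    flags = []
    if fan_in[n] >= THRESHOLD:
        flags.append("high fan-in")
    if fan_out[n] >= THRESHOLD:
        flags.append("high fan-out")
    if fan_in[n] == 0:
        flags.append("no predecessors")
    if flags:
        print(f"{n}: {', '.join(flags)}")
```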
Identification – Cost Risk Drivers
- Consider specific areas of concern that can lead to problems:
  - Personnel experience and availability
  - Requirement complexity and firmness
  - Scheduling and prediction of task and partition times
  - Hardware requirements, interfaces, constraints
ISO Risk Probability Table
ISO Risk Consequence Table
ISO Risk Contour
SW Risk Handling
- Avoidance – de-scoping objectives
- Assumption – latent defects
- Control – user acceptance testing
- Transfer – from software to firmware or hardware
SW Metrics
Commercial vs. DoD/NASA Perspective on Risk Management
Commercial vs. Gov't Perspective
- Different market conditions
- Different best practices
- Different likelihoods for similar issues
- As always, tailor RM to program needs
Market Differences
- How is risk impacted?
SW Development Best Practices
- How is risk impacted?
Risk Category Likelihood
Overview of Risk Management Tools
Cautions in Tool Selection
- A good tool for one organization may not be a good match for another
- Tailor RM to the program needs
- The tool should never dictate the process
- Define the process, then choose a compatible tool
- Be compatible with the program culture
Effective Use of a Tool
- RM is more than using an RM tool
- The tool must integrate efficiently & effectively into the program:
  - Resources required
  - Level of detail, complexity
  - Focus of the tool – e.g., program phase
RM Database Considerations
- Provide sufficient configuration control
- Accessible to all team members
- Ability to accept anonymous comments
- Support program needs:
  - Reporting
  - Monitoring
  - Capturing lessons learned
  - Fulfilling contractual requirements
- Balance cost against value
Crystal Ball – licensed software
- Monte Carlo simulation add-in for Excel
- Select the desired distribution function & define its parameters
- Or provide data and generate a plausible distribution function
- Provides statistics and graphical output
- User provides the risk analysis structure
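Crystal Ball itself is an Excel add-in; the standalone Python sketch below only illustrates the underlying Monte Carlo roll-up idea, using hypothetical cost elements with triangular distributions (all values assumed for illustration).

```python
import numpy as np

rng = np.random.default_rng(seed=1)
N = 10_000  # number of Monte Carlo trials

# Hypothetical cost elements, each modeled with a triangular
# (low, most likely, high) distribution in $K -- illustrative values only
elements = {
    "design":      (80, 100, 150),
    "coding":      (200, 260, 400),
    "integration": (60, 90, 180),
}

totals = np.zeros(N)
for low, mode, high in elements.values():
    totals += rng.triangular(low, mode, high, size=N)

# Summary statistics analogous to the tool's statistical output
print(f"mean total cost : {totals.mean():8.1f} $K")
print(f"80th percentile : {np.percentile(totals, 80):8.1f} $K")
print(f"95th percentile : {np.percentile(totals, 95):8.1f} $K")
```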
Probability-Consequence Screening
- Developed by the Air Force
- Risk events are assigned a probability & a consequence for performance, schedule & cost
- Position in the screening matrix determines the risk score
- User assigns the High, Medium & Low ranges
- Generates reports and graphical output
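A minimal sketch of the screening idea, with illustrative inputs and made-up High/Medium/Low cut-offs (the actual tool lets the user assign those ranges):

```python
# Probability-consequence screening sketch -- thresholds are assumptions
def screen(probability: float, consequence: int) -> str:
    """probability in [0, 1]; consequence on an ordinal 1-5 scale
    (worst of performance, schedule, and cost)."""
    score = probability * consequence      # simple product as the matrix position
    if score >= 3.0:
        return "High"
    if score >= 1.5:
        return "Medium"
    return "Low"

# Hypothetical watch-list entries: (name, probability, worst consequence)
watch_list = [
    ("late GFE delivery", 0.7, 5),
    ("compiler upgrade",  0.3, 2),
    ("staff turnover",    0.5, 4),
]
for name, p, c in watch_list:
    print(f"{name:20s} -> {screen(p, c)}")
```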
Risk Matrix
- Excel-based model
- Collects inputs in watch-list format
- Uses a best-practices (ordinal) breakout for probability & consequence
- Orders events by Borda rank & assigns a risk level
- Generates action-plan reports and graphical output
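A minimal sketch of Borda ranking over ordinal probability and consequence levels, using hypothetical risk events; the actual Risk Matrix model runs in Excel with its own breakouts.

```python
# (name, probability level 1-5, consequence level 1-5) -- illustrative data
risks = [
    ("requirements creep", 4, 3),
    ("key staff loss",     2, 5),
    ("COTS obsolescence",  3, 3),
]

def ranks(values):
    """Rank positions (0 = highest value); ties share the better rank."""
    order = sorted(set(values), reverse=True)
    return [order.index(v) for v in values]

N = len(risks)
prob_rank = ranks([r[1] for r in risks])
cons_rank = ranks([r[2] for r in risks])

# Borda count: a risk earns (N - 1 - rank) points per criterion;
# higher totals indicate riskier events
borda = [(N - 1 - pr) + (N - 1 - cr) for pr, cr in zip(prob_rank, cons_rank)]

for (name, p, c), b in sorted(zip(risks, borda), key=lambda t: -t[1]):
    print(f"{name:20s} P={p} C={c} Borda={b}")
```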
Risk Radar
- Access-based, licensed software
- Can establish standard values for risk categorization
- Manual or automatic risk prioritization
- Complies with ISO, SEI CMMI & Government standards
- Generates detailed, summary & metrics reports
- Demo available
TRIMS – Technical Risk ID & Mitigation
- Knowledge-based system
- Utilizes SEI & Navy Best Practices to collect data on past experience
- Measures technical risk rather than cost & schedule
- Most applicable to design efforts
- Categories, templates & questions can be tailored
- Generates status, next-action & overdue-action reports
DSM – Design Structure Matrix
- Knowledge- & simulation-based tool
- Assesses the complexity of dependency relationships between project tasks
- Measures risk in terms of schedule impact
- Most applicable to design efforts
- Ongoing development effort at MIT
- Generates suggested sub-team groupings & probability curves for task-duration ranges
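A minimal sketch of what a DSM captures, using hypothetical tasks. The MIT tool goes much further (partitioning, sub-team grouping, schedule simulation), but even a raw matrix exposes coupled task pairs that drive iteration and schedule risk.

```python
import numpy as np

# Design Structure Matrix sketch: dsm[i, j] = 1 means task i depends
# on information from task j. Tasks and dependencies are illustrative.
tasks = ["requirements", "architecture", "detail design", "test plan"]
dsm = np.array([
    # req  arch  det  test
    [  0,    0,    0,   0],   # requirements
    [  1,    0,    1,   0],   # architecture depends on requirements & detail design
    [  0,    1,    0,   0],   # detail design depends on architecture
    [  1,    0,    1,   0],   # test plan depends on requirements & detail design
])

# Mutually dependent (coupled) task pairs imply iteration loops,
# a key schedule-risk driver that a DSM makes visible
coupled = np.argwhere((dsm == 1) & (dsm.T == 1))
for i, j in coupled:
    if i < j:
        print(f"coupled tasks: {tasks[i]} <-> {tasks[j]}")
```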
Final Exam
- Closed book, closed notes
- You have 90 minutes for the exam
- Any questions?
- Turn in Part II of your project according to the schedule discussed last week