Decision Making in Complex Organizations
Lessons from the Columbia Shuttle Disaster Case
MIIC, April 6, 2009
Prof. Morten Hansen
Decision making in complex organizations

Decision making as a process
–Decision: What to do about the foam strike?
–Deliberations unfolded over 8 days; not just one event

Complex organizations, complex processes
–Multiple units
–Hierarchical levels
–Pressures
–Information and activity overload
–Different people hold different information
–Different people hold different views
Decision making under non-routine, ambiguous threats

Threats
–Apollo 13: clear threat
–Columbia: ambiguous threat => uncertainty

Organizations tend to underplay ambiguous threats
–Active discounting of risks
–Fragmented, largely discipline-based (siloed) analysis
–Wait-and-see orientation to action

Response patterns are driven by three factors
–Human cognition (biases)
–Team design and climate
–Organization structure and climate
Why did NASA downplay the threat? Explained at three levels
–Cognitive level: biases, shared frames
–Team level: team design, climate
–Organization level: structure, culture
What could NASA have done differently? An exploratory response

Confirmatory response (what NASA did)
–Discount the risk
–Wait-and-see; incomplete analysis
–Fragmented, expert-siloed problem solving

Exploratory response (the alternative)
–Exaggerate the threat
–Learning by doing (probes, testing)
–Top-down, focused, multi-disciplinary problem-solving team
Managers need to create psychological safety in decision making

Psychological safety: the shared belief that the team/organization is safe for interpersonal risk-taking

What types of interpersonal risks are associated with behaviors such as asking for help, admitting an error, or expressing a different point of view?
–Risk of looking ignorant
–Risk of looking incompetent
–Risk of being seen as intrusive
–Risk of being seen as negative

Psychological safety promotes candid discussion

Source: Amy Edmondson, Harvard Business School
What gets in the way?
How to create psychological safety

The manager's individual conduct is key
–Be accessible (meet, keep an open-door policy, invite input, etc.)
–Acknowledge your own fallibility (when the leader does, others become more open to admitting mistakes)

Shape the context to make the environment safe to speak up
–Remove the effects of status differences and "expert status" in the group where possible (e.g., invite everyone to speak, avoid teams dominated by certain experts, meet at neutral sites)
–Reduce punishment for failures
–Decouple, as much as possible, discussions aimed at learning from performance evaluation (e.g., a single mistake does not count against you)