Making the Transition from Point Estimates to Probabilistic Risk Assessments
Sausages, Laws and Paradigm Shifts
Workshop on Uncertainty Analysis and Management
Johns Hopkins University, August 16-18, 1999
Timothy M. Barry
United States Environmental Protection Agency
The EPA/NAS Model ... Risk Assessment Paradigm
- Hazard Identification
- Exposure Assessment
- Toxicity Assessment
- Risk Characterization
Marketing a Paradigm Shift ...
- Probabilistic risk assessment (via Monte Carlo analysis) has been promoted on the basis of its benefits:
  - improved decision-making
  - more informative
  - more transparent
  - more honest
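The contrast being marketed can be made concrete with a small sketch. Below is a minimal, illustrative Monte Carlo analysis of a generic multiplicative exposure-dose equation (dose = C x IR x EF / BW); the equation form, parameter names, and all distributions and values are hypothetical assumptions chosen only to show how a single conservative point estimate compares with a full output distribution.

```python
import random

random.seed(1)

# Deterministic point estimate using conservative (upper-bound) inputs.
# All values are hypothetical, for illustration only.
point = 2.0 * 2.0 * 1.0 / 60.0  # C * IR * EF / BW

def sample_dose():
    """Draw one dose realization from assumed input distributions."""
    C = random.lognormvariate(0.0, 0.5)     # concentration (mg/L), assumed
    IR = random.triangular(1.0, 3.0, 2.0)   # intake rate (L/day), assumed
    EF = random.uniform(0.5, 1.0)           # exposure frequency fraction, assumed
    BW = max(random.normalvariate(70.0, 10.0), 30.0)  # body weight (kg), assumed
    return C * IR * EF / BW

# Monte Carlo: propagate the input distributions through the equation.
doses = sorted(sample_dose() for _ in range(10000))
median = doses[len(doses) // 2]
p95 = doses[int(0.95 * len(doses))]

print(f"conservative point estimate: {point:.4f}")
print(f"Monte Carlo median: {median:.4f}, 95th percentile: {p95:.4f}")
```

Instead of one number, the decision-maker sees where the conservative point estimate falls within the output distribution, which is the "more informative, more transparent" claim in a nutshell.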
The best PRA is one that achieves three goals
- It should reveal, according to any criterion the decision-maker selects, which of the available decisions performs best.
- It should point to priorities for obtaining new information (if time and resources permit), so that the decision-maker can reduce uncertainty and increase confidence in the choice.
- It should spur the search for new decisions that may outperform any of those the initial QUA (quantitative uncertainty analysis) compared.
Challenge to the Uncertainty Analyst
- The most important challenge facing the analyst is to effectively communicate the insights an uncertainty analysis provides to those who need them:
  - an appreciation of the overall degree of uncertainty about the conclusions
  - an understanding of the sources of uncertainty, and which modeling assumptions are critical to the analysis and which are not
  - an understanding of the extent to which plausible alternative assumptions can affect the conclusions
However ...
- It is the perception of many that the driving force behind the push to use Monte Carlo analysis is the widely held and mostly unproven notion that EPA risk assessments are conservative in the extreme, leading to regulations that are too severe, cost too much, and provide too little benefit.
What does uncertainty mean to EPA decision-makers?
- Uncertainty about Adverse Effects
  - How confident are we that there are environmental effects?
  - What is the degree of consensus in the scientific community?
- Uncertainties about Exposures
  - Are/will significant exposures really occur?
  - What are the "error bands"?
More ...
- Uncertainty about the Strength of the Data
  - Where are the data gaps? How significant are these gaps to the overall estimates of risk?
  - Surprise: is new, potentially significant information on the horizon?
  - Which direction would the risks move if the data gaps were filled?
- How sure are we about the effectiveness of remediation options in reducing exposures and risks?
The truth is ...
- “An entirely scientific risk assessment is a mirage. There is no single right way to do it. ... risk assessment is not and cannot be a wholly scientific undertaking.”
- “Risk assessment often turns upon details that are inherently unknowable. In general, probabilistic and holistic risk assessments could lead to improved decision-making. Whether such assessments prove to be more defensible than the status quo is harder to say.”
  Edmund Crouch, et al.; Report to the Commission on Risk Assessment and Risk Management (October 1995)
So far, our track record could be better ...
- Leads to Better Decisions?
  - How would we know? What criteria would we use?
  - What are the regulatory decision-maker’s criteria?
- Scientific Credibility, or Just More Confusing Scientific Debate?
Institutional Barriers
- Managers are having a hard time seeing the real value added:
  - loss of intuitive connectivity
  - analysis paralysis
  - budget & staffing issues
  - undercut legal defensibility
  - creates an imposing slate of choices that both risk assessors and risk managers are unaccustomed to making
More Barriers
- Can't tell a "good one" from a "bad one":
  - when "experts" don’t agree
  - used as a "scientific" way to challenge EPA
  - overly confident about our uncertainty
  - if the regulated community thinks it is good, then it must be bad
  - too easily manipulated; difficult to detect & judge the effects of manipulations
- When should it be used? When shouldn’t it be used?
Even More Barriers ...
- Poorly framed assessment questions, leading to poorly focused analyses
- When is it appropriate?
- Absence of technical guidance & good examples
- Training, staffing, and resources
- Difficult to document and review analyses as model complexity increases
More Technical Barriers
- marginal data & models
- questions about the use of default distributions
- too much judgment
- poor representation in the distribution tails, rare events
- variability and uncertainty
- site-specific applications
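The tail-representation barrier above can be demonstrated numerically. The sketch below is illustrative only: it assumes a generic lognormal output and a modest sample size, and shows that across repeated Monte Carlo runs the median is stable while an extreme percentile estimate (99.9th) fluctuates much more, because only a handful of draws fall in the tail.

```python
import random
import statistics

random.seed(2)

def percentile(xs, q):
    """Simple empirical percentile from a list of draws."""
    xs = sorted(xs)
    return xs[min(int(q * len(xs)), len(xs) - 1)]

# Repeat a modest-sized Monte Carlo several times and compare the
# run-to-run stability of the median vs. the 99.9th percentile.
medians, tails = [], []
for _ in range(20):
    draws = [random.lognormvariate(0.0, 1.0) for _ in range(2000)]
    medians.append(percentile(draws, 0.50))
    tails.append(percentile(draws, 0.999))

def cv(xs):
    """Coefficient of variation: relative spread across runs."""
    return statistics.stdev(xs) / statistics.mean(xs)

print(f"median CV across runs:      {cv(medians):.3f}")
print(f"99.9th pct. CV across runs: {cv(tails):.3f}")
```

With only ~2 draws beyond the 99.9th percentile per run, tail estimates carry far more sampling noise than central ones, which is why rare-event questions strain routine Monte Carlo practice.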
A Bureaucratic Response to a Changing Paradigm
- Striking but Predictable Features of Our Response
  - measured pace
  - a need to control
  - progress, as well as lack of progress, depends on a few motivated individuals
- Dimensions of Our Response
  - workshops
  - committees
  - guidance documents
Adam Finkel, Risk Analysis, Vol. 14, No. 5
- “... will it take another 10 years or more to pass over the next major hurdle in the evolution of risk management methodology and practice - namely, the routine reliance on quantitative uncertainty analysis (QUA) as the lode-star of decision-making rather than as a nicety of risk characterization or as a risk analysis appendage useful only in hindsight?”
- “However long this advance takes, part of the blame for the delay will rest on the shoulders of practitioners of QUA, who have to date concentrated on getting scientific and regulatory decision-makers to acknowledge the magnitude of the uncertainties facing them and to understand how QUAs are conducted.”
- “In this we have risked making ourselves akin to mousetrap salesmen who beleaguer the consumer with engineering details before he even understands that, if the gadget works, the result will be a house free of mice.”
- “... but the fact is that we haven’t stressed nearly enough that it is useful, first and foremost. As a result, our potential ‘consumers’ ... have understandable trouble envisioning the precarious position they are in without our ‘better mousetrap,’ and worse yet, they tend to fixate on the downsides - the perceived cost and danger involved in ‘buying the product.’”