Forecasting and Decision Making Under Uncertainty
Thomas R. Stewart, Ph.D.
Center for Policy Research
Rockefeller College of Public Affairs and Policy
University at Albany, State University of New York
T.STEWART@ALBANY.EDU
Warning Decision Making II Workshop
Outline
– Uncertainty
– Decision making and judgment
– Inevitable error
– Problem 1: Choosing the warn/no warn cutoff
– Problem 2: Reducing error by improving forecast accuracy
Uncertainty
Uncertainty occurs when, given current knowledge, there are multiple possible states of nature.
Probability is a measure of uncertainty
– Relative frequency
– Subjective probability (Bayesian)
Uncertainty, type 1: states (events) and the probabilities of those events are known
– Coin toss
– Dice toss
– Precipitation forecasting (approximately)
Uncertainty, type 2: states (events) are known, but their probabilities are unknown
– Elections
– Stock market
– Forecasting severe weather
Uncertainty, type 3: both the states (events) and their probabilities are unknown
– Y2K
– Global climate change
The differences among the types of uncertainty are a matter of degree.
Picturing uncertainty
There are many ways to depict uncertainty. For example:
– Continuous events: scatterplot
– Discrete events: decision table
Scatterplot: Correlation = .50
Scatterplot: Correlation = .20
Scatterplot: Correlation = .80
Scatterplot: Correlation = 1.00 (the perfect forecast)
Decision table: data for an imperfect categorical forecast over 100 days (uncertainty)
Base rate = 20/100 = .20
Decision table terminology: data for an imperfect categorical forecast over 100 days (uncertainty)
Base rate = 20/100 = .20
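The decision-table quantities can be computed directly. A minimal sketch, assuming hypothetical cell counts: the slides give only the base rate (20 event days out of 100), so the split into hits, misses, false alarms, and correct negatives below is illustrative.

```python
# Hypothetical 2x2 decision table for 100 days with base rate 20/100 = .20.
# The individual cell counts are illustrative, not taken from the slides.
hits, misses = 14, 6                  # event occurred (20 days total)
false_alarms, correct_negs = 10, 70   # event did not occur (80 days total)

n = hits + misses + false_alarms + correct_negs
base_rate = (hits + misses) / n                 # P(event)
hit_rate = hits / (hits + misses)               # P(warn | event)
false_alarm_rate = false_alarms / (false_alarms + correct_negs)  # P(warn | no event)

print(f"base rate={base_rate}, hit rate={hit_rate}, false alarm rate={false_alarm_rate}")
```

With these counts, the base rate matches the slide's 20/100 = .20.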
Uncertainty, Judgment, Decision, Error
Taylor-Russell diagram
– Decision cutoff
– Criterion cutoff (linked to base rate)
– Correlation (uncertainty)
– Errors: false positives (false alarms) and false negatives (misses)
Taylor-Russell diagram
Tradeoff between false positives and false negatives: with an imperfect forecast, moving the decision cutoff reduces one type of error only at the cost of increasing the other.
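The tradeoff can be sketched with a small simulation of the Taylor-Russell setup. This is illustrative and not from the slides: the forecast and the event criterion are assumed to be correlated normals, a warning is issued when the forecast exceeds the decision cutoff, and the event occurs when the criterion exceeds the criterion cutoff.

```python
import math
import random

random.seed(1)
r = 0.5                  # forecast-event correlation (uncertainty)
criterion_cutoff = 0.84  # chosen so the base rate is roughly .20

def error_counts(decision_cutoff, n=10_000):
    """Count false alarms and misses at a given decision cutoff."""
    fp = fn = 0
    for _ in range(n):
        forecast = random.gauss(0, 1)
        criterion = r * forecast + math.sqrt(1 - r * r) * random.gauss(0, 1)
        warn = forecast > decision_cutoff
        event = criterion > criterion_cutoff
        if warn and not event:
            fp += 1   # false positive (false alarm)
        elif event and not warn:
            fn += 1   # false negative (miss)
    return fp, fn

for cutoff in (-0.5, 0.0, 0.5, 1.0):
    fp, fn = error_counts(cutoff)
    print(f"cutoff={cutoff:+.1f}  false alarms={fp}  misses={fn}")
```

Raising the cutoff (warning less often) trades false alarms for misses; lowering it does the reverse.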
Uncertainty, Judgment, Decision, Error
Another view: ROC analysis
– Decision cutoff
– False positive proportion
– True positive proportion
– A_z measures forecast quality
ROC Curve
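An ROC curve can be traced by sweeping the decision cutoff and plotting the true positive proportion against the false positive proportion, with A_z estimated as the area under the curve. A sketch under assumed conditions: the simulated forecast/event model below is illustrative, not taken from the slides.

```python
import math
import random

random.seed(2)
r, criterion_cutoff = 0.5, 0.84

# Simulate (forecast, event) pairs from a correlated-normal model.
pairs = []
for _ in range(20_000):
    f = random.gauss(0, 1)
    c = r * f + math.sqrt(1 - r * r) * random.gauss(0, 1)
    pairs.append((f, c > criterion_cutoff))

events = sum(1 for _, e in pairs if e)
non_events = len(pairs) - events

# Sweep the decision cutoff to get (false positive, true positive) points.
points = [(0.0, 0.0), (1.0, 1.0)]
for cutoff in [x / 10 for x in range(-30, 31)]:
    tp = sum(1 for f, e in pairs if f > cutoff and e)
    fp = sum(1 for f, e in pairs if f > cutoff and not e)
    points.append((fp / non_events, tp / events))
points.sort()

# Trapezoid-rule area: A_z is 0.5 for a useless forecast, 1.0 for a perfect one.
a_z = sum((x2 - x1) * (y1 + y2) / 2
          for (x1, y1), (x2, y2) in zip(points, points[1:]))
print(f"A_z ~ {a_z:.2f}")
```

Moving the cutoff slides the operating point along the curve; improving forecast quality (higher correlation) bows the whole curve toward the upper-left corner and raises A_z.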
Problem 1: Optimal decision cutoff
Given that it is not possible to eliminate both false positives and false negatives, what decision cutoff gives the best compromise?
– Depends on values
– Depends on uncertainty
– Depends on base rate
Decision analysis is one optimization method.
Decision tree
Expected value
Expected Value = P(O1)V(O1) + P(O2)V(O2) + P(O3)V(O3) + P(O4)V(O4)
where V(Oi) is the value of outcome i and P(Oi) is the probability of outcome i.
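The formula above is a probability-weighted sum over the four outcomes. A minimal sketch; the probabilities and 0-100 outcome ratings below are hypothetical placeholders, not figures from the slides.

```python
def expected_value(probs, values):
    """Probability-weighted sum of outcome values: sum of P(Oi) * V(Oi)."""
    assert abs(sum(probs) - 1.0) < 1e-9, "outcome probabilities must sum to 1"
    return sum(p * v for p, v in zip(probs, values))

# Four outcomes: true positive, false positive, false negative, true negative.
probs = [0.15, 0.10, 0.05, 0.70]   # hypothetical outcome probabilities
values = [40, 50, 0, 100]          # hypothetical 0-100 ratings
print(expected_value(probs, values))
```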
Expected value
– One of many possible decision-making rules
– Used here for illustration because it is the basis for decision analysis
– Intended to illustrate principles
Where do the values come from?
Descriptions of outcomes
True positive (hit: a warning is issued and the storm occurs as predicted)
– Damage occurs, but people have a chance to prepare. Some property and lives are saved, but probably not all.
False positive (false alarm: a warning is issued but no storm occurs)
– No damage or lives lost, but people are concerned and prepare unnecessarily, incurring psychological and economic costs. Furthermore, they may not respond to the next warning.
Descriptions of outcomes (cont.)
False negative (miss: no warning is issued, but the storm occurs)
– People do not have time to prepare, and property and lives are lost. NWS is blamed.
True negative (no warning is issued and no storm occurs)
– No damage or lives lost. No unnecessary concern about the storm.
Values depend on your perspective
– Forecaster
– Emergency manager
– Public official
– Property owner
– Business owner
– Many others...
Measuring values: which is the best outcome?
– True positive? False positive? False negative? True negative?
Give the best outcome a value of 100.
Measuring values: which is the worst outcome?
– True positive? False positive? False negative? True negative?
Give the worst outcome a value of 0.
Measuring values: rate the remaining two outcomes
– True positive? False positive? False negative? True negative?
Rate them relative to the worst (0) and the best (100).
Measuring values: values reflect different perspectives

Outcome           Perspective 1   Perspective 2   Perspective 3
True positive          40              90              80
False positive         50              80              98
False negative          0               0               0
True negative         100             100             100
Expected value
Expected Value = P(O1)V(O1) + P(O2)V(O2) + P(O3)V(O3) + P(O4)V(O4)
where V(Oi) is the value of outcome i and P(Oi) is the probability of outcome i.
Expected value depends on the decision cutoff.
Expected value depends on the value perspective.
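The two dependencies above can be combined in one sketch: compute the expected value at several decision cutoffs, under each of the three value perspectives rated earlier (40/50/0/100, 90/80/0/100, 80/98/0/100). The simulated forecast/event model that supplies the outcome probabilities is an illustrative assumption, not from the slides.

```python
import math
import random

random.seed(3)
# TP, FP, FN, TN ratings for the three value perspectives from the slides.
perspectives = {1: (40, 50, 0, 100), 2: (90, 80, 0, 100), 3: (80, 98, 0, 100)}
r, criterion_cutoff = 0.5, 0.84   # assumed correlated-normal forecast model

def outcome_probs(decision_cutoff, n=20_000):
    """Estimate P(TP), P(FP), P(FN), P(TN) at a given decision cutoff."""
    counts = [0, 0, 0, 0]
    for _ in range(n):
        f = random.gauss(0, 1)
        c = r * f + math.sqrt(1 - r * r) * random.gauss(0, 1)
        warn, event = f > decision_cutoff, c > criterion_cutoff
        i = 0 if (warn and event) else 1 if warn else 2 if event else 3
        counts[i] += 1
    return [k / n for k in counts]

for cutoff in (0.0, 0.5, 1.0):
    probs = outcome_probs(cutoff)
    evs = {p: sum(pr * v for pr, v in zip(probs, vals))
           for p, vals in perspectives.items()}
    print(cutoff, {p: round(ev, 1) for p, ev in evs.items()})
```

The cutoff that maximizes expected value differs between perspectives, which is the point of the two slides above.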
Whose values?
– Forecasting weather is a technical problem.
– Issuing a warning to the public is a social act.
– Each warning has an implicit set of values.
– Should those values be made explicit and subject to public scrutiny?
Problem 2: Improving forecast accuracy
– Examine the components of forecast skill. This requires a detailed analysis of the forecasting task.
– Address those components that are problematic, but be aware that solving one problem may create others.
– Problems are addressed by changing the forecast environment and by training. Training alone has little effect.
Problem 2: Improving forecast accuracy
Metatheoretical issue: correspondence vs. coherence
Coherence research
Coherence research measures the quality of judgment against the standards of logic, mathematics, and probability theory. Coherence theory argues that decisions under uncertainty should be coherent with respect to the principles of probability theory.
Correspondence research
Correspondence research measures the quality of judgment against the standard of empirical accuracy. Correspondence theory argues that decisions under uncertainty should produce as few errors as possible, within the limits imposed by irreducible uncertainty.
Coherence and correspondence theories of competence
– Coherence theory of competence: uncertainty -> irrationality -> error
– Correspondence theory of competence: uncertainty -> inaccuracy -> error
What is the relation between coherence and correspondence?
Fundamental tenet of coherence research
"Probabilistic thinking is important if people are to understand and cope successfully with real-world uncertainty."
Fundamental tenet of correspondence research
"Human competence in making judgments and decisions under uncertainty is impressive. Sometimes performance is not. Why? Because sometimes task conditions degrade the accuracy of judgment."
Hammond, K. R. (1996). Human Judgment and Social Policy: Irreducible Uncertainty, Inevitable Error, Unavoidable Injustice. New York: Oxford University Press (p. 282).
Brunswik's lens model (figure): the event (Y_e) and the forecast (Y_s) are linked through a common set of cues (X).
Expanded lens model (figure): adds true descriptors and subjective cues between the event and the forecast.
Components of skill and the lens model (figure):
– Environmental predictability
– Fidelity of the information system
– Match between environment and judge
– Reliability of information acquisition
– Reliability of information processing
Decomposition of skill score
Following Stewart & Lusk (1994), the skill score decomposes as
Skill score = r^2 - [r - (s_f / s_x)]^2 - [(mean_f - mean_x) / s_x]^2
where r is the forecast-event correlation, s_f and s_x are the standard deviations of the forecasts and the events, and mean_f and mean_x are their means. The first term reflects potential skill; the second is conditional (regression) bias; the third is unconditional (base rate) bias. The correlation r is itself decomposed through the lens model into the environmental predictability, match, and reliability components listed above.
1. Environmental predictability
– Environmental predictability is conditional on current knowledge and information. It can be improved through research that results in improved information and improved understanding of environmental processes.
– Environmental predictability sets an upper bound on forecast performance and therefore indicates how much improvement is possible through attention to the other components.
Environmental predictability limits the accuracy of forecasts.
2. Fidelity of the information system
– Forecasting skill may be degraded if the information system that brings data to the forecaster does not accurately represent actual conditions, i.e., if the cues do not accurately measure the true descriptors.
– Fidelity of the information system refers to the quality, not the quantity, of information about the cues currently being used.
– Fidelity is improved by developing better measures, e.g., through improved instrumentation or increased density in space or time.
3. Match between environment and forecaster
– The match between the forecaster's model and the environmental model estimates the potential skill that the forecaster's current strategy could achieve if the environment were perfectly predictable (given the cues) and the forecasts were unbiased and perfectly reliable.
– This component might be called "knowledge." It is addressed by forecaster training and experience.
– If the forecaster learns to rely on the most relevant information and to ignore irrelevant information, this component will generally be good.
Reliability
– Reliability is high if identical conditions produce identical forecasts. Humans are rarely perfectly reliable.
– There are two sources of unreliability: reliability of information acquisition and reliability of information processing.
Reliability decreases as the amount of information increases. (Figure: theoretical relation between amount of information and accuracy of forecasts.)
Reliability decreases as environmental predictability decreases.
4. Reliability of information acquisition
– The extent to which the forecaster can reliably interpret the objective cues.
– Improved by organizing and presenting information in a form that clearly emphasizes relevant information.
5. Reliability of information processing
– Decreases with increasing information and with increasing environmental uncertainty.
Methods for improving reliability of information processing:
– Limit the amount of information used in judgmental forecasting; use a small number of very important cues.
– Use mechanical methods to process information (e.g., MOS).
– Combine several forecasts (consensus).
– Require justification of forecasts.
Theoretical relation between amount of information and accuracy of forecasts (figure).
The relation between information and accuracy depends on environmental uncertainty (figure; legend: dashed = theoretical limit of accuracy, solid = actual accuracy).
6 and 7. Bias: conditional (regression bias) and unconditional (base rate bias)
– Together, the two bias terms measure forecast "calibration" (sometimes called "reliability" in meteorology).
Reducing bias:
– Experience
– Statistical training
– Feedback about the nature of biases in the forecast
– Search for discrepant information
– Statistical correction for bias
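Calibration can be measured by binning the forecaster's stated probabilities and comparing each bin with the observed frequency of the event. A sketch with synthetic data: the forecasts below are constructed to be slightly overconfident (stated probabilities pushed toward the extremes), which is an assumption for illustration.

```python
import random
from collections import defaultdict

random.seed(5)
cases = []
for _ in range(5000):
    p_true = random.random()
    # The stated probability is pushed away from 0.5, i.e., overconfident.
    stated = min(1.0, max(0.0, 0.5 + 1.3 * (p_true - 0.5)))
    event = random.random() < p_true
    cases.append((stated, event))

# Group cases by stated probability, rounded to one decimal.
bins = defaultdict(list)
for stated, event in cases:
    bins[round(stated, 1)].append(event)

for p in sorted(bins):
    obs = sum(bins[p]) / len(bins[p])
    print(f"stated {p:.1f} -> observed {obs:.2f} (n={len(bins[p])})")
```

For a well-calibrated forecaster the observed frequencies match the stated probabilities; here the high bins verify less often than stated and the low bins more often, the signature of overconfidence that statistical correction for bias would remove.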
Calibration (a.k.a. reliability) of forecasts depends on the task
– Calibration data for precipitation forecasts (Murphy and Winkler, 1974)
– Heideman (1989)
Reading about judgmental forecasting
Components of skill:
– Stewart, T. R., & Lusk, C. M. (1994). Seven components of judgmental forecasting skill: Implications for research and the improvement of forecasts. Journal of Forecasting, 13, 579-599.
Principles of Forecasting Project:
– http://www-marketing.wharton.upenn.edu/forecast/
– Armstrong, J. S. (Ed.). Principles of Forecasting: A Handbook for Researchers and Practitioners. Norwell, MA: Kluwer Academic Publishers (scheduled for publication in 1999).
– Stewart, T. R. Improving Reliability of Judgmental Forecasts (http://www.albany.edu/cpr/StewartPOF98.PDF)
Conclusion
Problem 1: Choosing the warn/no warn cutoff
– Value tradeoffs are unavoidable.
– Warnings are based on values that should be critically examined.
Problem 2: Improving forecast accuracy
– Understanding and improving forecasts requires understanding the task and the forecasting environment.
– Decomposing skill can aid in identifying the factors that limit forecasting accuracy.