
1 Past Applications, Lessons Learned, Current Thinking. Levi Brekke (Reclamation, Research & Development Office). NCPP Quantitative Evaluation of Downscaling Workshop, Boulder, CO. Panel discussion: "Using downscaled data in the real world: Sharing experiences, Part II", 15 August 2013.

2 Traditional climate context in planning
I. Choose Climate Context: Instrumental Records of observed weather (T and P) and runoff (Q)
II. Relate to Planning Assumptions: Supply Variability, Demand Variability, Operating Constraints
III. Conduct Planning Evaluations: System Analysis; evaluate study questions related to Resource Management Objectives

3 I. Decision-Makers: "Keep it simple."
II. Climate Information Providers: "Here's the info… use it wisely."
III. Technical Practitioners (Ushers): "Keep it manageable."

4 Flow of Information: General View
1) Survey Future Climate Information over the study region
2.a) Decide whether to cull information, and how…
2.b) Decide how to use retained information…
3) Assess climate change impacts on planning assumptions (e.g., supplies, demands, and/or water management constraints)
4) Assess operations and dependent resource responses; characterize uncertainties (analyses of various responses)

5 Box 2.a) Climate Model Contest… everyone wants it, utility unclear
–Focusing on CA, Brekke et al. (2008) considered "historical" simulations from 17 GCMs and found similar skill when enough metrics were considered. Focusing globally, Gleckler et al. (2008) and Reichler et al. (2008) found similar results.
–Focusing on CA, projection distributions didn't change much when the GCM-skill assessment (Brekke et al. 2008) was used to reduce the set of 17 GCMs to a "better" set of 9 GCMs.
–Santer et al. (PNAS, 2009): results from a global water vapor detection and attribution (D&A) study were largely insensitive to skill-based model weighting.
–Pierce et al. (PNAS, 2009): results from a western U.S. D&A study were more sensitive to ensemble size than to skill-based model weighting.
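The skill-based culling idea discussed in these studies can be made concrete. A minimal sketch, assuming hypothetical model names and metric values (not the actual Brekke et al. 2008 metrics): standardize each error metric across models, average into a composite score, and keep the best-scoring subset.

```python
# Sketch of skill-based model culling: rank GCMs by a composite of
# standardized error metrics, then keep the top subset. All model
# names and metric values below are illustrative, not from the studies.
import statistics

def composite_skill(metrics_by_model):
    """Average each model's standardized error across all metrics.

    metrics_by_model: {model: {metric: error}}; lower error = better.
    Returns {model: composite score}; lower score = better.
    """
    metric_names = list(next(iter(metrics_by_model.values())).keys())
    scores = {m: 0.0 for m in metrics_by_model}
    for name in metric_names:
        vals = [metrics_by_model[m][name] for m in metrics_by_model]
        mu, sd = statistics.mean(vals), statistics.pstdev(vals)
        for m in metrics_by_model:
            z = 0.0 if sd == 0 else (metrics_by_model[m][name] - mu) / sd
            scores[m] += z / len(metric_names)
    return scores

def cull(metrics_by_model, keep):
    """Return the `keep` best-scoring models."""
    scores = composite_skill(metrics_by_model)
    return sorted(scores, key=scores.get)[:keep]

models = {
    "gcm_a": {"t_bias": 0.5, "p_rmse": 1.2},
    "gcm_b": {"t_bias": 2.0, "p_rmse": 2.5},
    "gcm_c": {"t_bias": 1.0, "p_rmse": 0.8},
}
print(cull(models, 2))
```

As the slide notes, in practice such culling changed projection distributions little, which is part of why the contest's utility is unclear.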

6 Box 2.b) Two Method Classes (generally speaking)
Period-Change
–prevalent among impacts studies
–"perturbed historical" at some milestone future
Transient
–time-evolving view, from past to future
–prevalent in the climate science community ("climate projections")

7 PNW Example: three Fed agencies (BPA, USACE, Reclamation) adopting consensus scenarios
1) Used UW CIG HB2860 scenarios (Period-Change + Transient)
2) Selected a smaller set of both scenario types; the Period-Change type was of most interest. Goal was to select a set that spans the rest: LW = less warming, MW = more warming, D = drier, W = wetter, C = central, MC = minimal change

8 Logical process, but there were surprises: scenarios were selected for big-basin change, yet sub-basin changes didn't always reflect the big-basin scenarios (e.g., the Upper Snake is wetter in 5 of 6 scenarios).

9 Reclamation 2010 Oklahoma Reservoirs Yield Study
Projection-specific approach: individual projections inform change definitions (e.g., black dot choices)
Ensemble-informed approach: projections are grouped and their pooled information informs definitions

10 Reclamation 2010 Oklahoma Reservoirs Yield Study: another concern comes from the portrayal of monthly impacts (two projection-specific methods vs. one ensemble-informed method)
–projection-specific approaches can lead to serial monthly impacts that seem questionable
–ensemble-informed approaches emphasize "consensus" changes from the projections
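The contrast between the two approaches can be sketched numerically. Assuming each projection supplies monthly change factors (future mean divided by historical mean; the values below are illustrative, not from the Yield Study), a projection-specific method takes one projection's factors as-is, while an ensemble-informed method pools the ensemble, e.g. via the per-month median:

```python
# Illustrative contrast of projection-specific vs. ensemble-informed
# change definitions. Factors are hypothetical monthly precipitation
# change ratios (future mean / historical mean).
import statistics

def projection_specific(factors_by_proj, proj):
    """Use one projection's own monthly change factors directly.
    Month-to-month values can look disorderly, as noted above."""
    return factors_by_proj[proj]

def ensemble_informed(factors_by_proj):
    """Pool all projections and take the per-month median: a
    "consensus" change that smooths single-projection noise."""
    per_month = zip(*factors_by_proj.values())
    return [statistics.median(vals) for vals in per_month]

factors = {
    "proj_1": [1.1, 0.8, 1.0],
    "proj_2": [0.9, 1.2, 1.0],
    "proj_3": [1.0, 1.0, 1.3],
}
print(projection_specific(factors, "proj_1"))
print(ensemble_informed(factors))
```

Note how the pooled medians vary less from month to month than any single projection's factors, which is exactly the smoothing the slide describes.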

11 Scoping with thought towards decision-making?

12 Most information-generation approaches have been science-centric. (Diagram labels: Science, Application, Decision-Making)

13 Would we select the same approach using a decision-centric view? (Diagram labels: Science, Application, Decision-Making)

14 Decision-Support Information: we can approach from both views… science-centric (What's credible?) and decision-centric (What's relevant?).

15 What’s Applicable. What’s Reliable? Global Climate Models’ Simulation Qualities What’s Relevant? System Sensitivities to Climate Changes Practical limitations? Tool Fitness, Project Resources Another way of looking at this…

16 Scoping Questions…
What's relevant?
–What are the study decisions and the level of interest in climate uncertainties?
–What system metrics influence the study decisions?
–Which types of climate changes influence these metrics the most?
What's reliable?
–What types of regional future climate and hydrologic datasets are available?
–Which climate and/or hydrologic changes are projected well?
–What future climate/hydrologic assumptions should still be based on history?
Practical limitations?
–What modeling steps are required to assess system metrics?
–How does climate change influence each modeling step?
–Which climate change influences are practical to represent?

17 FY13-14 Project: Evaluating the Relevance, Reliability, and Applicability of CMIP5 Climate Projections for Water Resources and Environmental Planning
Goal
–develop & demonstrate a framework for evaluating information relevance & reliability to guide judgment of applicability
Approach
–Broadband quality evaluation of CMIP5 (what's reliable?); serve results on the web
–System sensitivity analyses (what's relevant?)
–Applicability pilots (observe the process, characterize the framework for use elsewhere)
Collaborators (& POCs)
–Reclamation (Ian Ferguson, iferguson@usbr.gov)
–USACE (Jeffrey Arnold)
–NOAA ESRL (Mike Alexander)
–NOAA CIRES (Jamie Scott)

18 Extras

19 Two Method Classes have emerged…
Period-Change
–prevalent among impacts studies
–"perturbed historical" at some milestone future
Transient
–time-evolving view, from past to future
–prevalent in the climate science community ("climate projections")

20 Period-Change: Overview
The historical climate variability sequence is retained (space and time).
"Climate Change" scenarios are defined for perturbing the historical record, where a change is diagnosed from a historical period to a future period.
Studies typically feature a Historical scenario and multiple climate change scenarios in order to reveal impacts uncertainty.
Several methods are available to define scenarios, differing by:
–time (e.g., change in means, change in distributions)
–space (e.g., change in regional condition, or change in spatially disaggregated conditions)
–amount of information (e.g., single climate projection, or many projections)
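The perturbation step above can be sketched in code. A minimal delta-method example, assuming the simplest scenario definition on the slide (a change in period means), applied additively for temperature and multiplicatively for precipitation as is common practice; the data are illustrative:

```python
# Minimal Period-Change sketch: diagnose a change between a projection's
# historical and future periods, then apply it to the observed record so
# the historical variability sequence is retained. Data are illustrative.
import statistics

def diagnose_change(sim_hist, sim_fut, multiplicative=False):
    """Diagnose a period change: future-period mean minus (or divided
    by) historical-period mean, from the same projection."""
    h, f = statistics.mean(sim_hist), statistics.mean(sim_fut)
    return f / h if multiplicative else f - h

def perturb_observed(observed, change, multiplicative=False):
    """Apply the diagnosed change to the observed record; the familiar
    historical sequence (space and time) is preserved."""
    if multiplicative:
        return [x * change for x in observed]
    return [x + change for x in observed]

# e.g., simulated historical vs. future temperatures (deg C)
delta_t = diagnose_change([10.0, 12.0], [13.0, 15.0])
print(perturb_observed([11.0, 9.5], delta_t))
```

Changing means is the simplest variant; the slide's other options (change in distributions, spatially disaggregated changes) would replace the mean shift with quantile- or cell-wise changes.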

21 Period-Change: Pros and Cons
Pros:
–Retains familiar historical variability patterns
–Simple frame for exploring system sensitivity
–Permits "cautious" sampling of temporal aspects from climate projections (e.g., can be simple like change in annual mean, or complex like change in monthly distribution)
Cons:
–Less ideal for adaptation planning; climate change timing matters
–Diagnosing period "Climate Change" is not obvious (more of a problem for ΔP than for ΔT)
–(when single projections inform climate change scenarios) month-to-month changes may seem disorderly or noisy

22 Transient: Overview
The historical climate variability sequence is not retained (but the distribution may be retained through climate projection bias-correction…).
"Climate" projections are selected to define an evolving envelope of climate possibility, representing the simulated past through the projected future.
–Monthly or daily time series projections are typically used
Climate projections may be developed using various methods, e.g.:
–Time series outputs from a GCM simulation (or a GCM-RCM simulation)
–…bias-corrected and spatially downscaled translations of these outputs
–…stochastically resampled (resequenced) versions of these outputs, reflecting a different frequency reference (observations, paleoproxies)
Studies need to feature a large ensemble of climate projections to adequately portray an envelope of climate possibility through time.
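The bias-correction step mentioned above commonly uses empirical quantile mapping. A minimal sketch under simplifying assumptions (equal-length, sorted historical records; illustrative data), not the operational BCSD procedure: replace a simulated value with the observed value at the same empirical quantile of the simulated historical record.

```python
# Minimal empirical quantile-mapping sketch: map a simulated value
# through its quantile in the simulated historical record onto the
# observed historical record. Illustrative only, not operational BCSD.
def quantile_map(value, sim_hist_sorted, obs_hist_sorted):
    """Return the observed value at the same empirical rank that
    `value` holds in the simulated historical record.

    Assumes both records are sorted ascending and equal length.
    """
    n = len(sim_hist_sorted)
    # empirical rank of `value` within the simulated historical record
    rank = sum(1 for s in sim_hist_sorted if s <= value)
    idx = max(0, min(n - 1, rank - 1))  # clamp to a valid index
    return obs_hist_sorted[idx]

sim_hist = [1.0, 2.0, 3.0, 4.0]   # simulated historical (sorted)
obs_hist = [10.0, 20.0, 30.0, 40.0]  # observed historical (sorted)
print(quantile_map(2.0, sim_hist, obs_hist))
```

This is why the distribution "may be retained" even though the sequence is not: each corrected value inherits the observed distribution, while the projection supplies the time ordering.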

23 Transient: Pros and Cons
Pros:
–Avoids challenges of "Climate Change" diagnosis (not discussed here, but a key issue is "multi-decadal variability" in projections)
–Supports "master planning" for climate change adaptation: a schedule of adaptations through time, including project triggers
Cons:
–Projection historical sequences differ from experience
–Requires "aggressive" sampling of temporal information from climate projections (frequencies vary by member, and may be questionable)
–Information is more complex: requires many projections, challenging analytical capacities, and probabilistic discussion of results evolving through time… requires a learning phase

24 Legacy climate context for planning assumptions in water resources studies
I. Choose Climate Context: Instrumental Records of observed weather (T and P) and runoff (Q)
II. Relate to Planning Assumptions: Supply Variability, Demand Variability, Operating Constraints
III. Conduct Planning Evaluations: System Analysis; evaluate study questions related to Resource Management Objectives

25 We've developed ways to blend climate change information into this context (e.g., Reclamation 2008, Mid-Pacific Region's Central Valley Project – Operations Criteria and Plan, Biological Assessment).
I. Choose Climate Context: Instrumental Records (observed T, P, and Q) plus Global Climate Projections representing various GCMs and forcings → bias-correction, spatial downscaling → regional T and P → watershed simulation of runoff; global T and P → sea level rise → Delta flow-salinity relationship (constraint on upstream operations); regional T → stream water temperature analyses
II. Relate to Planning Assumptions: Supply Variability, Demand Variability, Operating Constraints, Reservoir Operations
III. Conduct Planning Evaluations: future operations portrayal for the OCAP BA (flows, storage, deliveries, etc.)

26 When using projected climate, future climate & hydrology assumptions typically reflect a blend of observed and projected information (Reclamation 2010; info: Levi Brekke, lbrekke@usbr.gov; Tom Pruitt, tpruitt@usbr.gov).
I. Choose Climate Context: Instrumental Records (observed T, P, and Q) plus Global Climate Projections representing various GCMs and emissions → bias-correction, spatial downscaling → regional T and P → watershed simulation → runoff magnitudes
II. Relate to Planning Assumptions: Supply Variability, Demand Variability, Operating Constraints
III. Conduct Planning Evaluations: System Analysis; evaluate study questions related to Resource Management Objectives

27 … future climate & hydrology assumptions can also be based on a blend of observed and paleoclimate information (http://www.usbr.gov/lc/region/programs/strategies.html).
I. Choose Climate Context: Instrumental Records (observed T, P, and Q) plus Paleoclimate Proxies (reconstructed runoff) → statistical modeling → runoff magnitudes, yeartype spells
II. Relate to Planning Assumptions: Supply Variability, Demand Variability, Operating Constraints
III. Conduct Planning Evaluations: System Analysis; evaluate study questions related to Resource Management Objectives

28 … we can also blend all three (Reclamation 2009, CRWAS 2011, others; http://www.usbr.gov/research/docs/2009_Hydrology-DiffClimateBlends.pdf).
I. Choose Climate Context: Instrumental Records (observed T, P, and Q); Paleoclimate Proxies (reconstructed runoff → statistical modeling → yeartype spells); and Global Climate Projections representing various GCMs and emissions → bias-correction, spatial downscaling → regional T and P → watershed simulation → runoff magnitudes
II. Relate to Planning Assumptions: Supply Variability, Demand Variability, Operating Constraints
III. Conduct Planning Evaluations: System Analysis; evaluate study questions related to Resource Management Objectives

29 Flow of Information: UW CIG HB2860 Data (http://warm.atmos.washington.edu/2860/)
1) Survey Future Climate Information over the study region: considered 100+ current projections…
2.a) Decide whether to cull information, and how: decided to focus on 19 projections…
2.b) Decide how to use retained information: made two types of Columbia Basin weather and hydrology
3) Assess climate change impacts on planning assumptions related to water supply and power demands (hydrologic simulation, electricity demand modeling)
4) Assess operations response (Reclamation, USACE, and BPA systems & models)

30 Flow of Information: … used by Fed PNW agencies (http://www.usbr.gov/pn/programs/climatechange/reports/index.html)
1) Survey Future Climate Information over the study region: considered CIG's 19 projections…
2.a) Decide whether to cull information, and how: decided to focus on a smaller set…
2.b) Decide how to use retained information: decided to use both types of Columbia Basin weather and hydrology…
3) Assess climate change impacts on planning assumptions related to water supply and power demands (hydrologic response and local power demand response)
4) Assess operations and dependent resource responses; characterize uncertainties: assessing operations under both types of information… insights for planning applications

