Dynamical Climate Reconstruction
Greg Hakim, University of Washington
with Sebastien Dirren, Helga Huntley, Angie Pendergrass, David Battisti, Gerard Roe
Plan
– Motivation: fusing observations & models
– State estimation theory
– Results for a simple model
– Results for a less simple model
– Optimal networks
– Plans for the future
Motivation
Range of approaches to climate reconstruction.
Observations:
– time-series analysis; multivariate regression
– no link to dynamics
Models:
– spatial and temporal consistency
– no link to observations
State estimation (this talk):
– few attempts thus far
– stationary statistics
Goals
Test new method.
Reconstruct last 1-2K years:
– unique dataset for climate variability?
– e.g. hurricane variability
– e.g. rational regional downscaling (hydro)
Test network design ideas:
– where to take highest-impact new obs?
Medieval warm period temperature anomalies IPCC Chapter 6
Climate variability: a qualitative approach
– North GRIP δ18O (temperature)
– GISP2 K+ (Siberian High)
– Swedish tree-line limit shift
– Sea surface temperature from planktonic foraminifera
– Hematite-stained grains in sediment cores (ice rafting)
– Varve thickness (westerlies)
– Cave speleothem isotopes (precipitation)
Mayewski et al., 2004
Statistical reconstructions
"Multivariate statistical calibration of multiproxy network" (Mann et al. 1998)
Requires stationary spatial patterns of variability.
Paleoclimate modeling IPCC Chapter 6
An attempt at fusion
Data Assimilation Through Upscaling and Nudging (DATUN), Jones and Widmann 2003
– multivariate regression
Fusion Hierarchy
– Nudging: no error estimates
– Statistical interpolation (fixed statistics)
– 3DVAR, 4DVAR (operational NWP)
– Kalman filters (today's talk)
– Kalman smoothers
The curse of dimensionality looms large in geoscience.
State Estimation Primer
Gaussian Update
analysis = background + weighted new observational information:
$x^a = x^b + K(y - Hx^b)$
Kalman gain matrix: $K = BH^T(HBH^T + R)^{-1}$
analysis error covariance "smaller than" background: $A = (I - KH)B$
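As a concrete illustration (not part of the talk), here is a minimal Python sketch of this update for a toy three-variable state; the matrices B, H, R and the observation are invented for the example.

```python
# A minimal sketch of the Gaussian (Kalman) update above, assuming a linear
# observation operator H and Gaussian errors; the numbers are made up.
import numpy as np

def kalman_update(xb, B, y, H, R):
    """Return the analysis state and analysis error covariance."""
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # Kalman gain
    xa = xb + K @ (y - H @ xb)                     # background + weighted innovation
    A = (np.eye(len(xb)) - K @ H) @ B              # analysis covariance ("<" background)
    return xa, A

# Toy example: three state variables, one observation of the first variable.
xb = np.zeros(3)
B = np.array([[1.0, 0.5, 0.2],
              [0.5, 1.0, 0.5],
              [0.2, 0.5, 1.0]])
H = np.array([[1.0, 0.0, 0.0]])
R = np.array([[0.25]])
y = np.array([1.0])
xa, A = kalman_update(xb, B, y, H, R)
print(xa)          # the observed variable (and its correlated neighbors) move toward y
print(np.diag(A))  # analysis variances are smaller than the background's
```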
Gaussian PDFs
Ensemble Kalman Filter
Crux: use an ensemble of fully non-linear forecasts to model the statistics of the background (expected value and covariance matrix).
Advantages:
– No a priori assumption about covariance; state-dependent corrections.
– Ensemble forecasts proceed immediately without perturbations.
Summary of Ensemble Kalman Filter (EnKF) Algorithm
(1) Ensemble forecast provides background estimate & statistics (B) for new analyses.
(2) Ensemble analysis with new observations.
(3) Ensemble forecast to arbitrary future time.
A sketch of the analysis step (2) follows.
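A hedged sketch of step (2), using a perturbed-observation EnKF in which the background ensemble supplies the sample covariance B; the state size, ensemble size, and observation network below are illustrative, not those of the talk.

```python
# Sketch of an ensemble analysis step: the sample covariance of the background
# ensemble plays the role of B in the Gaussian update, and each member is
# updated against a perturbed copy of the observations.
import numpy as np

def enkf_analysis(Xb, y, H, R, rng):
    """Xb: (n_state, n_ens) background ensemble; y: obs vector; H: obs operator."""
    n_state, n_ens = Xb.shape
    Xp = Xb - Xb.mean(axis=1, keepdims=True)           # ensemble perturbations
    B = Xp @ Xp.T / (n_ens - 1)                        # sample background covariance
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)       # ensemble-estimated Kalman gain
    Yp = y[:, None] + rng.multivariate_normal(np.zeros(len(y)), R, size=n_ens).T
    return Xb + K @ (Yp - H @ Xb)                      # analysis ensemble

# Toy usage: 10 state variables, 20 members, observations of variables 0 and 5.
rng = np.random.default_rng(1)
Xb = rng.standard_normal((10, 20))
H = np.zeros((2, 10)); H[0, 0] = 1.0; H[1, 5] = 1.0
R = 0.1 * np.eye(2)
y = np.array([0.5, -0.3])
Xa = enkf_analysis(Xb, y, H, R, rng)
```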
Paleo-assimilation: dynamical climate reconstruction
Observations often time-averaged:
– e.g. gauge precipitation; wind; ice cores.
Sparse networks.
Issue:
– How to combine averaged observations with instantaneous model states?
Issue with Traditional Approach Problem: Conventional Kalman filtering requires covariance relationships between time-averaged observations and instantaneous states. High-frequency noise in the instantaneous states contaminates the update. Solution: Only update the time-averaged state.
Algorithm
1. Time-average the background.
2. Compute the model estimate of the time-averaged observations.
3. Compute perturbations from the time mean.
4. Update the time mean with the existing EnKF.
5. Add the updated mean and the unmodified perturbations.
6. Propagate the model states.
7. Recycle with the new background states.
A sketch of steps 1-5 follows.
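A minimal Python sketch of steps 1-5, assuming a linear observation operator H acting on the time-mean state and a perturbed-observation update of the mean; array names and sizes are invented for illustration.

```python
# Sketch of the time-average update: only the time mean of each ensemble
# trajectory is adjusted; the deviations from the time mean pass through
# unchanged, then the model is integrated onward (steps 6-7, not shown).
import numpy as np

def time_average_enkf(Xb_traj, y_avg, H, R, rng):
    """Xb_traj: (n_state, n_ens, n_time) ensemble of trajectories over the window."""
    Xbar = Xb_traj.mean(axis=2)                        # 1. time mean of each member
    Xprime = Xb_traj - Xbar[:, :, None]                # 3. deviations from the time mean
    n_state, n_ens = Xbar.shape
    Zb = Xbar - Xbar.mean(axis=1, keepdims=True)       # ensemble anomalies of the time mean
    B = Zb @ Zb.T / (n_ens - 1)                        # covariance of the time-averaged state
    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)
    Yp = y_avg[:, None] + rng.multivariate_normal(np.zeros(len(y_avg)), R, size=n_ens).T
    Xbar_a = Xbar + K @ (Yp - H @ Xbar)                # 2, 4. update the mean with time-avg obs
    return Xbar_a[:, :, None] + Xprime                 # 5. updated mean + unmodified perturbations

# Toy usage: 8 variables, 10 members, a 5-step averaging window, one observation.
rng = np.random.default_rng(2)
Xb_traj = rng.standard_normal((8, 10, 5))
H = np.zeros((1, 8)); H[0, 3] = 1.0
R = np.array([[0.2]])
y_avg = np.array([0.4])
Xa_traj = time_average_enkf(Xb_traj, y_avg, H, R, rng)
```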
Illustrative Example (Dirren & Hakim 2005)
Model (adapted from Lorenz & Emanuel 1998): linear combination of fast ("high-freq.") and slow ("low-freq.") processes.
– LE ~ a scalar discretized around a latitude circle.
– LE has elements of atmospheric dynamics: chaotic behavior, linear waves, damping, forcing.
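For reference, a sketch (under assumptions, not the exact configuration used in the talk) of the basic Lorenz & Emanuel (1998) dynamics and a standard RK4 step; the talk's model additionally combines fast and slow versions of this system.

```python
# Lorenz & Emanuel (1998): a scalar advected around a latitude circle with
# damping and constant forcing, dx_j/dt = (x_{j+1} - x_{j-2}) x_{j-1} - x_j + F.
import numpy as np

def le_tendency(x, F=8.0):
    return (np.roll(x, -1) - np.roll(x, 2)) * np.roll(x, 1) - x + F

def rk4_step(x, dt=0.05, F=8.0):
    k1 = le_tendency(x, F)
    k2 = le_tendency(x + 0.5 * dt * k1, F)
    k3 = le_tendency(x + 0.5 * dt * k2, F)
    k4 = le_tendency(x + dt * k3, F)
    return x + dt * (k1 + 2.0 * k2 + 2.0 * k3 + k4) / 6.0

# Spin up a 40-point "latitude circle" from a slightly perturbed rest state.
x = 8.0 * np.ones(40)
x[0] += 0.01
for _ in range(1000):
    x = rk4_step(x)
```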
RMS error, instantaneous states
Instantaneous states have large errors (comparable to climatology), due to the lack of observational constraint (dashed: climatology).
RMS error, all averaging times
(Figure: percentage of RMS error vs. averaging time of the state variable, with observation and climatology uncertainty for reference.)
Key point: the analysis constrains the signal at higher frequencies than the observations themselves!
Potential Problems
1. Possible covariance contamination
2. Extra observation noise
A less simple model (Helga Huntley, U. Delaware)
QG "climate model":
– radiative relaxation to an assumed temperature field
– mountain in center of domain
Truth simulation:
– rigorous error calculations
– 100 observations (50 surface & 50 tropopause)
– Gaussian errors
– range of time averages
(A sketch of how such synthetic observations can be generated follows.)
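A hedged sketch of producing synthetic observations from a truth simulation: sample the truth at fixed locations, average over a window of length tau, and add Gaussian noise. The array names, sizes, and the random "truth" below are placeholders, not the QG model output.

```python
# Build time-averaged synthetic observations from a truth trajectory.
import numpy as np

def synthetic_obs(truth_traj, obs_idx, tau, sigma_o, rng):
    """truth_traj: (n_time, n_state); obs_idx: indices of observed grid points."""
    n_windows = truth_traj.shape[0] // tau
    obs = np.empty((n_windows, len(obs_idx)))
    for w in range(n_windows):
        window = truth_traj[w * tau:(w + 1) * tau, obs_idx]   # truth at obs locations
        obs[w] = window.mean(axis=0) + sigma_o * rng.standard_normal(len(obs_idx))
    return obs

# Toy usage: a random-walk "truth", 100 observation points, tau = 20.
rng = np.random.default_rng(3)
truth = 0.01 * rng.standard_normal((1000, 500)).cumsum(axis=0)
obs_idx = rng.choice(500, size=100, replace=False)
y_obs = synthetic_obs(truth, obs_idx, tau=20, sigma_o=0.27)
```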
Snapshot
Correlation patterns as a function of averaging time (tau)
Observation Locations
Average Spatial RMS Error
Ensemble used for control
Implications
– The state is well constrained by a few noisy obs.
– Forecast error saturates at climatology for tau ~ 30; for longer averaging times, the model adds little.
– Equally good results can be obtained by assimilating the observations with an ensemble drawn from climatology (no model runs required)!
Observation Error Experiments
Changing σ_o (observation error)
Previously: σ_o = 0.27 for all averaging times.
Now: σ_o ≈ σ_c/3 (a third of the control error).
Observing Network Design Helga Huntley (U. Delaware)
Optimal Observation Locations
Rather than use random networks, can we devise a strategy to optimally site new observations?
– Yes: choose locations with the largest impact on a metric of interest.
– New theory based on ensemble sensitivity (Hakim & Torn 2005; Ancell & Hakim 2007; Torn & Hakim 2007).
– Here, metric = projection coefficient for the first EOF.
Ensemble Sensitivity
Given a metric J, find the observation that most reduces its uncertainty (ensemble variance); then find a second observation conditional on the first.
Sketch of theory (let x denote the state):
– Analysis covariance: $A = (I - KH)B$
– Changes in the metric given changes in the state: $\delta J = (\partial J/\partial x)^T \delta x + O(\delta x^2)$
– Metric variance: $\sigma_J^2 = (\partial J/\partial x)^T \, P \, (\partial J/\partial x)$, with P the background (B) or analysis (A) covariance
Sensitivity + State Estimation
Estimate the change in metric variance from assimilating the i'th observation:
$\delta\sigma_J^2 = (\partial J/\partial x)^T (A_i - B)(\partial J/\partial x)$
Kalman filter theory gives $A_i = (I - K_i H_i)B$, where $K_i = B H_i^T (H_i B H_i^T + \sigma_o^2)^{-1}$.
Given $\delta\sigma_J^2$ at each point, find the largest value.
Ensemble Sensitivity (cont'd)
If H picks out a specific location $x_i$, this all simplifies very nicely:
– For the first observation: $\delta\sigma_J^2 = \mathrm{cov}(J, x_i)^2 / (\mathrm{var}(x_i) + \sigma_o^2)$
– For the second observation, given assimilation of the first: the same expression, evaluated with the ensemble statistics updated as if the first observation had been assimilated.
– Etc.
Ensemble Sensitivity (cont'd)
In fact, with some more calculation one can derive a recursive formula that, for the k'th point, requires evaluating just k+3 lines: one covariance vector plus (k+6) entry-wise multiplications, divisions, additions, and subtractions. (A greedy version of the selection, without this shortcut, is sketched below.)
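A hedged Python sketch of the greedy selection, not the recursive shortcut above: at each step the candidate point with the largest expected reduction in metric variance, cov(J, x_i)^2 / (var(x_i) + sigma_o^2), is chosen, and the ensemble anomalies are updated with a serial square-root step as if that observation had been assimilated, so the next pick is conditional on the previous ones. All names and the toy metric are illustrative.

```python
# Greedy observation selection by ensemble sensitivity.
import numpy as np

def select_obs(Xp, Jp, sigma_o, n_select):
    """Xp: (n_state, n_ens) state anomalies; Jp: (n_ens,) metric anomalies."""
    Xp, Jp = Xp.copy(), Jp.copy()
    n_ens = Xp.shape[1]
    chosen = []
    for _ in range(n_select):
        var_x = (Xp ** 2).sum(axis=1) / (n_ens - 1)
        cov_Jx = Xp @ Jp / (n_ens - 1)
        dvar_J = cov_Jx ** 2 / (var_x + sigma_o ** 2)   # expected metric-variance reduction
        i = int(np.argmax(dvar_J))
        chosen.append(i)
        # Update anomalies as if x_i were observed with error variance sigma_o**2
        # (serial ensemble square-root step), making later picks conditional on this one.
        xi = Xp[i].copy()
        denom = var_x[i] + sigma_o ** 2
        alpha = 1.0 / (1.0 + np.sqrt(sigma_o ** 2 / denom))
        Xp -= alpha * np.outer((Xp @ xi / (n_ens - 1)) / denom, xi)
        Jp -= alpha * (cov_Jx[i] / denom) * xi
    return chosen

# Toy usage: 200 grid points, 50 members; metric J = projection onto a pattern.
rng = np.random.default_rng(4)
X = rng.standard_normal((200, 50))
Xp = X - X.mean(axis=1, keepdims=True)
pattern = np.sin(np.linspace(0.0, np.pi, 200))
J = pattern @ X
best = select_obs(Xp, J - J.mean(), sigma_o=0.10, n_select=4)
```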
Results for tau = 20 First EOF
Results for tau = 20: the ten most sensitive locations (without accounting for prior assimilations); σ_o = 0.10.
Results for tau = 20 The four most sensitive locations, accounting for previously found pts.
Results for tau = 20; σ_o = 0.10. Note the decreasing effect on the variance.
Control Case: No Assimilation Avg error =
100 Random Observation Locations Avg Error - Anal = Fcst =
4 Random Observation Locations Avg Error - Anal = Fcst =
4 Optimal Observation Locations Avg Error - Anal = Fcst =
Summary
(Table: forecast and analysis average errors, and percent of control error, for the control, 100 random obs, 4 random obs, and 4 chosen obs cases.)
Assimilating just the 4 chosen locations yields a significant portion of the reduction in J-error achieved with 100 obs.
Experiment: 15 Chosen Observations
For this experiment, take:
– 4 best obs to reduce variability in the 1st EOF
– 4 best obs to reduce variability in the 2nd EOF
– 2 best obs to reduce variability in the 3rd EOF
– 2 best obs to reduce variability in the 4th EOF
– 3 best obs to reduce variability in the 5th EOF
The cut-off for each EOF was chosen as the observation with …
All obs conditional on assimilation of the previous obs.
15 Obs: Error in 1st EOF coefficient (table: forecast and analysis errors for the Control, 100R, 4R, 4O, 8O, and 15O cases).
15 Obs: Error in 2nd EOF coefficient (table: forecast and analysis errors for the same cases).
15 Observations: RMS error (table: forecast and analysis RMS errors for the same cases).
Current & Future Plans (Angie Pendergrass, UW)
Modeling on the sphere: SPEEDY
– simplified physics
– slab ocean
Simulated precipitation observations.
Ice-core assimilation:
– annual accumulation
– oxygen isotopes
Summary
State estimation = fusion of obs & models:
– lower errors than obs or model alone
– provides both a best estimate and an error estimate
Idealized experiments: proof of concept:
– averaged obs constrain state estimates
– optimal observing networks
Moving toward real observations:
– opportunity for regional downscaling