Object-based Spatial Verification for Multiple Purposes
Beth Ebert (1), Lawrie Rikus (1), Aurel Moise (1), Jun Chen (1,2), and Raghavendra Ashrit (3)
(1) CAWCR, Melbourne, Australia; (2) University of Melbourne, Australia; (3) NCMRWF, India
The Centre for Australian Weather and Climate Research: a partnership between CSIRO and the Bureau of Meteorology
www.cawcr.gov.au
Object-based spatial verification
[Figure: side-by-side forecast and observation rain fields with matched objects]
Verifying attributes of objects
Other examples
- HIRLAM cloud vs. AVHRR satellite cloud
- Climate features (SPCZ)
- Jets in the vertical plane
- Convective initiation
- Vertical cloud comparison
What does an object approach tell us?
Errors in:
- Location
- Size
- Intensity
- Orientation
Results can:
- Characterize errors for individual forecasts
- Show systematic errors
- Give hints as to the source(s) of errors
I will discuss CRA, MODE, and "Blob", but not SAL, Procrustes, Composite (Nachamkin), or others.
Contiguous Rain Area (CRA) verification (Ebert & McBride, J. Hydrol., 2000)
Find Contiguous Rain Areas (CRAs) in the fields to be verified:
- Choose a threshold
- Take the union of forecast and observed rain
- Use a minimum number of points and/or total volume of the parameter to filter out insignificant CRAs
Define a rectangular search box around the CRA in which to look for the best match between forecast and observations. The displacement is determined by shifting the forecast within the box until the MSE is minimized or the correlation coefficient is maximized; a sketch follows.
Error decomposition: MSE_total = MSE_displacement + MSE_intensity + MSE_pattern
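The shift-and-decompose idea can be written out in a few lines. The following is a minimal sketch, not the operational CAWCR code: the function name cra_verify, the brute-force shift search, and the use of np.roll (which wraps at the domain edges) are all simplifying assumptions.

```python
import numpy as np

def cra_verify(fcst, obs, threshold=10.0, max_shift=10):
    """Find the forecast displacement that minimizes MSE for one CRA,
    then decompose the total error (Ebert & McBride 2000 scheme)."""
    # The CRA is the union of forecast and observed rain above the threshold
    mask = (fcst >= threshold) | (obs >= threshold)

    def mse(f):
        return np.mean((f[mask] - obs[mask]) ** 2)

    # Shift the forecast within the search box until MSE is minimized
    mse_total = mse(fcst)
    best_mse, best_shift = mse_total, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            m = mse(np.roll(fcst, (dy, dx), axis=(0, 1)))
            if m < best_mse:
                best_mse, best_shift = m, (dy, dx)
    shifted = np.roll(fcst, best_shift, axis=(0, 1))

    # MSE_total = MSE_displacement + MSE_intensity + MSE_pattern
    mse_displacement = mse_total - best_mse
    mse_intensity = (shifted[mask].mean() - obs[mask].mean()) ** 2
    mse_pattern = best_mse - mse_intensity
    return best_shift, {"displacement": mse_displacement,
                        "intensity": mse_intensity,
                        "pattern": mse_pattern}
```

By construction the three components sum to the total MSE, so each can be reported as a percentage of the total, as in the India error-decomposition slide below.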
Heavy rain over India
Met Office global NWP model forecasts for monsoon rainfall, 2007-2012 (Ashrit et al., WAF, in revision)
Heavy rain over India
[Figure: errors in Day 1 rainfall forecasts for CRA thresholds of 10, 20, and 40 mm/d]
Heavy rain over India
[Figure: error decomposition (%) of Day 1 rainfall forecasts]
Climate model evaluation
Can global climate models reproduce features such as the South Pacific Convergence Zone?
Delage and Moise (JGR, 2011) added a rotation component to the CRA decomposition.
Climate model evaluation
"Location error" = MSE_displacement + MSE_rotation
"Shape error" = MSE_volume + MSE_pattern
Applied to 26 CMIP3 models. A sketch of the rotation component follows.
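To illustrate how a rotation term can enter the decomposition, here is a hedged sketch. It assumes that, after the optimal shift, the forecast is rotated about the domain centre until the MSE is further minimized; the angle range, the step size, and the use of scipy.ndimage.rotate are illustrative choices rather than Delage and Moise's actual implementation.

```python
import numpy as np
from scipy.ndimage import rotate

def rotation_error(shifted_fcst, obs, max_angle=30, step=5):
    """Further MSE reduction obtained by rotating the already-shifted
    forecast; reported as the MSE_rotation component."""
    mse_shifted = np.mean((shifted_fcst - obs) ** 2)
    best_mse = mse_shifted
    for angle in range(-max_angle, max_angle + 1, step):
        rotated = rotate(shifted_fcst, angle, reshape=False, order=1)
        best_mse = min(best_mse, np.mean((rotated - obs) ** 2))
    return mse_shifted - best_mse  # >= 0 by construction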
Climate model evaluation
Correcting the position of ENSO EOF1 strengthens model agreement on projected changes in the spatial patterns of ENSO-driven variability in temperature and precipitation (Power et al., Nature, 2013).
Method for Object-based Diagnostic Evaluation (MODE) (Davis et al., MWR, 2006)
- Identification: convolution-threshold process (a sketch of this step follows the list)
- Measure attributes
- Merging: merge single objects into clusters (fuzzy logic approach)
- Matching: compute interest values* and identify matched pairs
- Comparison: compare forecast and observed attributes
- Summarize: accumulate and examine comparisons across many cases
*interest value = weighted combination of attribute matching
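A minimal sketch of the identification step, assuming a simple boxcar smoother in place of MODE's circular convolution kernel; the function name mode_identify and the default radius and threshold are illustrative.

```python
import numpy as np
from scipy.ndimage import uniform_filter, label

def mode_identify(field, radius=3, threshold=5.0):
    """MODE-style object identification: convolve, threshold, then
    restore the raw field values inside the object masks."""
    # Convolution step (boxcar smoother standing in for MODE's circular kernel)
    smoothed = uniform_filter(field, size=2 * radius + 1)
    # Threshold step: objects are contiguous regions of the smoothed field
    mask = smoothed >= threshold
    labels, n_objects = label(mask)
    # Attributes are later measured on the raw values within each object
    objects = np.where(mask, field, 0.0)
    return labels, n_objects, objects
```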
CRA & MODE - what's the difference?

                      CRA                             MODE
Convolution filter    N                               Y
Object definition     Rain threshold                  Rain threshold
Object merging        N                               Y
Matching criterion    MSE or correlation coefficient  Total interest of weighted attributes
Location error        X- and Y-error                  Centroid distance
Orientation error     Y                               Y
Rain area             Y                               Y, incl. intersection, union, symmetric area
Rain volume           Y                               Y
Error decomposition   Y                               N
Comparison for tropical cyclone rainfall
[Figure: the same tropical cyclone rainfall case analysed with CRA and with MODE]
Chen, Ebert, Brown (2014), work in progress
Westerly jets
"Blob" defined by a percentile of the local maximum of zonal-mean U in the reanalysis Y-Z plane (Rikus, Clim. Dyn., submitted).
[Figure: blobs at the 5th, 10th, and 15th percentiles]
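One plausible reading of the blob definition, written as a sketch: the blob is taken to be the contiguous region, containing the wind maximum, where U exceeds the maximum reduced by the stated percentile. That threshold interpretation, and the function name jet_blob, are assumptions rather than Rikus's published definition.

```python
import numpy as np
from scipy.ndimage import label

def jet_blob(u_yz, p=10.0):
    """Percentile 'blob' for a westerly jet in a zonal-mean U section
    (latitude x height), under the threshold assumption noted above."""
    mask = u_yz >= (1.0 - p / 100.0) * u_yz.max()
    labels, _ = label(mask)
    # Keep only the connected region that contains the wind maximum itself
    jmax = np.unravel_index(np.argmax(u_yz), u_yz.shape)
    return labels == labels[jmax]
```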
Westerly jets
Westerly jets
Global reanalyses show consistent behaviour, except 20CR. The method can be used to evaluate global climate models.
Future of object-based verification
- Routine application in the operational verification suite
- Other variables
- Climate applications
Future of object-based verification
Ensemble prediction: match individual ensemble members (Johnson & Wang, MWR, 2012, 2013)
- Object probability from member matches, e.g. Prob(object) = 7/8 when 7 of 8 ensemble members contain a matched object
- Brier skill score
- Ensemble calibration approaches
A sketch of the object-probability and Brier score calculation follows.
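A minimal sketch of how an object probability and its Brier score might be computed from member matches; the function names and the example values (beyond the 7-of-8 case on the slide) are illustrative, not Johnson and Wang's code.

```python
import numpy as np

def object_probability(member_matches):
    """Fraction of ensemble members whose forecast object matches the
    observed object."""
    return float(np.mean(member_matches))

def brier_score(probs, outcomes):
    """Brier score for forecast probabilities against binary outcomes
    (1 if the observed object occurred, else 0)."""
    probs, outcomes = np.asarray(probs, float), np.asarray(outcomes, float)
    return float(np.mean((probs - outcomes) ** 2))

# The slide's example: 7 of 8 members contain a matched object
p = object_probability([True] * 7 + [False])  # 0.875
print(p, brier_score([p], [1]))               # scored against an observed object
```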
Future of object-based verification
Weather hazards (WWRP High Impact Weather Project):
- Tropical cyclone structure
- Pollution cloud, heat anomaly
- Blizzard extent and intensity
- Flood inundation
- Fire spread
Thank you
The Centre for Australian Weather and Climate Research: a partnership between CSIRO and the Bureau of Meteorology
www.cawcr.gov.au
Extra slides
Abstract
Recent years have seen the development of methods for verifying spatially coherent weather "objects" such as rainfall or cloud systems. A strong motivation has been to better assess the performance of high-resolution NWP using intuitive approaches that mimic a human's evaluation, in situations where traditional grid-scale verification metrics may give misleading results. Two of the earliest object-based techniques to be developed were the Contiguous Rain Area (CRA) method and the Method for Object-based Diagnostic Evaluation (MODE). Given a pair of matched forecast and observation grids, these schemes search for contiguous areas of a variable exceeding a threshold (for example, rain greater than 5 mm/d), perform a matching step to associate forecast objects with observed objects, and then compare several attributes of the objects, including location, size, intensity, and orientation. This approach quantifies how well a forecast "looks like" the observations and provides hints as to the causes of error. When applied over many cases, object-based verification methods are useful for diagnosing systematic errors. Both the CRA and MODE techniques are now fairly mature and are being applied in a variety of applications. This talk describes the object-based verification approach, focusing on the CRA method, and demonstrates its use in verifying mid-latitude rain systems, tropical cyclone rainfall, subtropical jets, and climate features such as the South Pacific Convergence Zone. Results from these studies are being used both to guide model improvements and to aid users' interpretation of model forecasts and climate projections.
Spatial Verification Intercomparison Project
Phase 1 - understanding the methods
Phase 2 - testing the methods
"MesoVICT" - precipitation and wind in complex terrain
- Deterministic & ensemble forecasts
- Point and gridded observations, including ensemble observations
- MAP D-PHASE / COPS dataset
Core:    Determ. precip + VERA anal + JDC obs
Tier 1:  Determ. wind + VERA anal + JDC obs
         Ensemble precip + VERA anal + JDC obs
         Ensemble wind + VERA anal + JDC obs
Tiers 2a and 2b:
         Determ. precip + VERA ensemble + JDC obs
         Determ. wind + VERA ensemble + JDC obs
         Ensemble precip + VERA ensemble + JDC obs
         Ensemble wind + VERA ensemble + JDC obs
Tier 3:  Other variables ensemble + VERA ensemble + JDC obs
Plus sensitivity tests to method parameters.
MODE - total interest
For a forecast/observed object pair j, the total interest is the weighted average

T_j = sum_{i=1..M} ( w_i,j * c_i,j * F_i,j ) / sum_{i=1..M} ( w_i,j * c_i,j )

where
M     = number of attributes
F_i,j = value of the object match for attribute i (0-1)
c_i,j = confidence, i.e. how well a given attribute describes the forecast error
w_i,j = weight given to the attribute
Attributes: centroid distance separation; minimum separation distance of object boundaries; orientation angle difference; area ratio; intersection area.
A sketch of the calculation follows.
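The formula is straightforward to state in code. This is a minimal sketch; the function name total_interest and the example attribute values are illustrative, not values from the talk.

```python
import numpy as np

def total_interest(F, c, w):
    """Weighted total interest for one object pair: a confidence- and
    weight-adjusted average of attribute interest values F (each 0-1)."""
    F, c, w = map(np.asarray, (F, c, w))
    return float(np.sum(w * c * F) / np.sum(w * c))

# Illustrative values for the five attributes listed on the slide
F = [0.9, 0.7, 0.8, 0.95, 0.6]  # attribute interest values
c = [1.0, 1.0, 0.5, 1.0, 1.0]   # confidences
w = [2.0, 1.0, 1.0, 1.0, 1.5]   # weights
print(total_interest(F, c, w))  # pairs above an interest threshold are matched
```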
Tropical cyclone rainfall
CRA:
- Displacement & rotation error
- Correlation coefficient
- Volume
- Median and extreme rain
- Rain area
- Error decomposition
MODE:
- Centroid distance & angle difference
- Total interest
- Volume
- Median and extreme rain
- Intersection / union / symmetric area