Automated Quality Control of Meteorological Observations for MM5 Model Verification
Jeff Baars, Cliff Mass
University of Washington, Seattle, Washington
Why Perform Quality Control (QC) on Observations?
The integrity and value of all verification activities depend on the quality of the data. Similarly, as we move to mesoscale initialization for the MM5/WRF, quality control of the data is critical. Future bias correction will depend on the quality of the observations. Currently, only a simple range check is performed on incoming obs.
And Why Do Automated QC?
We can't possibly do manual QC on all of the obs we ingest. No automated QC program will be perfect: some bad values may slip through, and some good values may get flagged as well.
Description of the Obs
There are many unknowns about individual networks, and generally a lack of documentation for them. Networks vary in quality: instrumentation varies, as does the accuracy of measurements. Siting varies between networks, including instrument height, surroundings, etc. Calibration standards also vary between networks; how many networks perform annual calibration?
The Flags
No data is thrown out; it is only flagged. Each test assigns a flag, which can be "pass," "suspect," "warning," or "failure." Only a range failure can cause a "failure" flag.
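The flag hierarchy above can be sketched as a small severity ordering. This is a minimal sketch; the flag names come from the slides, but the identifiers and the most-severe-wins combination rule are illustrative assumptions, not the actual implementation.

```python
# Flag scale from least to most severe, per the slide text.
# Identifiers and the combination rule are assumptions for illustration.
FLAGS = ["pass", "suspect", "warning", "failure"]

def combine(current, new):
    """Keep the most severe flag seen so far; data is never discarded."""
    return max(current, new, key=FLAGS.index)
```

For example, an ob already marked "suspect" by the step test that then fails the range check would end up as "failure".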
The Four QC Tests
Modeled after the Oklahoma Mesonet automated QC procedures (Shafer et al. 2000, Journal of Atmospheric and Oceanic Technology), with some differences due to differences between the Northwest and Oklahoma (like lots of topography!). Performed on six variables: temperature, relative humidity, wind speed and direction, sea level pressure, and 6- and 24-hr precipitation.
Range check: the simplest of them all; just a sanity check on the value, similar to what we're already doing.
Step check: looks for unreasonably large jumps between hourly time steps in the data.
Persistence check: looks for a "flat-lined" time series.
Spatial check: checks the value in question against those at surrounding stations. The trickiest one to get right!
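The range check above might look like the following sketch. The bound values here are illustrative assumptions; the slides do not give the operational limits.

```python
# Illustrative climatological bounds per variable; the operational
# limits used by the actual system are not given in the slides.
RANGE_LIMITS = {
    "temp": (200.0, 330.0),   # K
    "rh":   (0.0, 100.0),     # %
    "slp":  (850.0, 1090.0),  # mb
    "spd":  (0.0, 105.0),     # m/s
}

def range_check(var, value):
    """Sanity check: only this test can produce a 'failure' flag."""
    lo, hi = RANGE_LIMITS[var]
    return "pass" if lo <= value <= hi else "failure"
```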
Step Test: Flag Thresholds Between Successive Obs

Variable                  "Suspect"   "Warning"
Temperature (K)               10          15
Relative Humidity (%)         60          80
Sea Level Pressure (mb)       10          15
Wind Speed (m/s)              40          50
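Using the thresholds from this table, the step check can be sketched as below. The function name and return convention are assumptions for illustration.

```python
# Suspect/warning jump thresholds between successive hourly obs,
# taken from the table above: variable -> (suspect, warning).
STEP_THRESHOLDS = {
    "temp": (10, 15),  # K
    "rh":   (60, 80),  # %
    "slp":  (10, 15),  # mb
    "spd":  (40, 50),  # m/s
}

def step_check(var, prev_ob, curr_ob):
    """Flag unreasonably large jumps between successive hourly obs."""
    suspect, warning = STEP_THRESHOLDS[var]
    jump = abs(curr_ob - prev_ob)
    if jump >= warning:
        return "warning"
    if jump >= suspect:
        return "suspect"
    return "pass"
```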
Persistence Test
Looks at 24 hours (18 hours for temperature) of consecutive data for a variable. Calculates the standard deviation over the period, and if it is below a set threshold, the entire day is flagged. Also checks the largest delta between successive obs; if it is below a set threshold, the entire day is flagged.
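The persistence test can be sketched as follows. The standard-deviation and delta thresholds here are placeholder assumptions, since the slides do not state the operational values.

```python
import statistics

# Placeholder thresholds for illustration; the operational values
# are not given in the slides.
MIN_STDDEV = 0.1     # flag the day if the std dev falls below this
MIN_MAX_DELTA = 0.1  # flag the day if the largest hourly change is below this

def persistence_check(obs):
    """obs: 24 consecutive hourly values (18 for temperature).
    Returns a flag for the entire day."""
    deltas = [abs(b - a) for a, b in zip(obs, obs[1:])]
    if statistics.stdev(obs) < MIN_STDDEV or max(deltas) < MIN_MAX_DELTA:
        return "suspect"  # "flat-lined" series: flag the whole day
    return "pass"
```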
Spatial Test
Tested multiple spatial tests; settled on the simplest. For each station, looks at all stations within a 100-km horizontal distance and within an elevation band (which varies by variable). A minimum of five stations is required for the test to be performed. If no station has a value reasonably close to the value in question, the value is flagged. If not enough stations are found within the 100-km radius, the test is performed again with a 150-km radius, but with eased constraints.
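A minimal sketch of this spatial test follows, assuming a flat-earth distance approximation and an illustrative closeness tolerance and elevation band; none of those specifics are given in the slides, and the 150-km retry with eased constraints is noted but not implemented here.

```python
import math

def distance_km(a, b):
    """Approximate distance between (lat, lon) pairs in degrees,
    using a flat-earth (equirectangular) approximation."""
    dlat = math.radians(b[0] - a[0])
    dlon = math.radians(b[1] - a[1]) * math.cos(math.radians((a[0] + b[0]) / 2))
    return 6371.0 * math.hypot(dlat, dlon)

def spatial_check(station, neighbors, radius_km=100.0, elev_band_m=150.0,
                  tolerance=3.0, min_stations=5):
    """station/neighbors: dicts with 'latlon', 'elev', and 'value' keys.
    Tolerance and elevation band are illustrative assumptions."""
    near = [n for n in neighbors
            if distance_km(station["latlon"], n["latlon"]) <= radius_km
            and abs(n["elev"] - station["elev"]) <= elev_band_m]
    if len(near) < min_stations:
        return "pass"  # too few neighbors; in practice, retried at 150 km
    if any(abs(n["value"] - station["value"]) <= tolerance for n in near):
        return "pass"
    return "suspect"   # no neighbor reasonably close to this value
```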
Flagged Observations by Variable

Variable   n flagged   Percent flagged (n/n_all_flagged)
dir           102632       26.54
pcp24            344        0.09
pcp6             426        0.11
rh            105602       27.31
slp              625        0.16
spd           138611       35.84
temp           38479        9.95
Flagged Observations by Test

Test          n flagged   Percent flagged (n/n_all_flagged)
persistence      352369       91.12
range              5928        1.53
spatial           22473        5.81
step               5949        1.54
Flagged Observations by Network

Network   n flagged   Percent flagged (n/n_all_flagged)
AV             5890        1.52
AW            73636       19.04
CC             1330        0.34
CM              818        0.21
CU             1498        0.39
DT            30104        7.78
EC              865        0.22
GS             1952        0.50
HN              724        0.19
RW           230457       59.59
SA            21396        5.53
SN            10796        2.79
UW              398        0.10
Future Work
Further testing.
Implement the QC procedures.
http://www.atmos.washington.edu/~jbaars/
Extras
Flagged Observations by Network and Test

Network   Test          n flagged   Percent flagged (n/n_flagged_this_network)
AW        persistence      72544       98.52
AW        range                2        0.00
AW        spatial            161        0.22
AW        step               929        1.26
RW        persistence     212885       92.38
RW        range              134        0.06
RW        spatial          15522        6.74
RW        step              1916        0.83
SA        persistence      19310       90.25
SA        range               40        0.19
SA        spatial           1453        6.79
SA        step               593        2.77
HN        persistence         74       10.22
HN        range               79       10.91
HN        spatial            314       43.37
HN        step               257       35.50
EC        persistence        323       37.34
EC        range               16        1.85
EC        spatial            384       44.39
EC        step               142       16.42
DT        persistence      24436       81.17
DT        range             5481       18.21
DT        spatial            116        0.39
DT        step                71        0.24