1
CICS Science Meeting – College Park, MD – 8–9 September 2010
Validation of the AMSU Snowfall Algorithm (over the US)
Ralph Ferraro, Matt Sapiano, Meredith Nichols, Huan Meng
National Oceanic and Atmospheric Administration (NOAA)
National Environmental Satellite, Data & Information Service (NESDIS)
and
The Cooperative Institute for Climate and Satellites (CICS), College Park, Maryland
2
Objectives and Science Questions
NOAA has an operational precipitation product from AMSU
– Includes a falling-snow identification flag over land (Kongoli et al., 2003, GRL)
– Snowfall rates are being developed by H. Meng
We know that it works in some cases and not in others
– How do we quantify the accuracy of the detection algorithm? Some of this was done during algorithm development…
– Under what meteorological conditions does it work, and when does it fail? This is what we are really after!
The answer is crucial as we enter the GPM era
– Snowfall is an important component of the hydrological cycle
– In some places, snowfall is the primary form of precipitation
What I plan on showing
– Several attempts at validation
– What we hope to accomplish (work in progress)
3
East Coast Snow/Ice Storm – 14 February 2007
Panels: NOAA-16 Precipitation Type/Rainfall Rate; NOAA-16 Snowfall Rate; NEXRAD Reflectivity
– Detected snowfall corresponds with reflectivity > 20 dBZ
– Underestimates in heavy snow
5
February 5–6, 2010 Snow Event – SFR (mm/hr) (courtesy H. Meng)
6
February 5–6, 2010 Snow Event (courtesy H. Meng)
7
Verification Issues
Hourly surface reports of snowfall vary widely
– "S-" can mean just about anything; visibility and the T–dewpoint spread (RH) are better indicators of intensity (see the sketch after this slide)
– Hourly water-equivalent measurements are scarce and unreliable (ASOS, wind, etc.)
Radar does not distinguish rain from snow without human interpretation
– Virga and surface temperature are issues
Wide variety of conditions within the microwave satellite FOV
Previous work we have done shows a "lag" between the surface and satellite signals
– Snow falls more slowly than rain
Other issues
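As a rough illustration of how visibility and the temperature–dewpoint spread could be turned into an intensity class, here is a minimal sketch. The thresholds and function name are illustrative assumptions only, not the Rasmussen & Cole (2002) values.

```python
# Minimal sketch: classify snowfall intensity from hourly surface observations.
# The visibility thresholds below are illustrative placeholders, NOT the
# Rasmussen & Cole (2002) relationships; substitute the published look-up.

def snow_intensity(visibility_km: float, t_dewpoint_spread_c: float) -> str:
    """Rough intensity class from visibility, with a dryness sanity check."""
    if t_dewpoint_spread_c > 5.0:   # very dry near-surface layer: likely virga/flurries
        return "none_or_trace"
    if visibility_km >= 1.0:        # assumed threshold
        return "light"
    if visibility_km >= 0.5:        # assumed threshold
        return "moderate"
    return "heavy"

print(snow_intensity(visibility_km=0.4, t_dewpoint_spread_c=1.0))  # -> heavy
```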
8
First Attempt – Climatology
We generated an AMSU snowfall "climatology" (a gridding sketch follows this slide)
– 7 years, 5-deg grids
– NOAA-15 and -16
Some assessments
– Heaviest occurrences in "transition zones", but values seem low
– Large areas where retrievals don't occur (too cold and dry)
– Other features: E. Canada, Pacific NW/AK, Rocky Mountains, Himalayas
Seasonal maps: SON, DJF, MAM
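The sketch below shows one way such a seasonal detection climatology could be gridded: bin per-overpass snow flags onto a 5-degree grid and compute a detection frequency. The function name, input record layout, and grid handling are assumptions for illustration, not the code used for these maps.

```python
import numpy as np

def seasonal_detection_frequency(lats, lons, months, snow_flags, season=(12, 1, 2)):
    """Fraction of AMSU observations flagged as snowfall, on a 5-deg grid (DJF by default)."""
    lats, lons = np.asarray(lats, float), np.asarray(lons, float)
    months, flags = np.asarray(months), np.asarray(snow_flags, bool)

    keep = np.isin(months, season)                                   # e.g. DJF
    iy = np.clip(((lats[keep] + 90.0) // 5.0).astype(int), 0, 35)    # 36 latitude bins
    ix = np.clip(((lons[keep] % 360.0) // 5.0).astype(int), 0, 71)   # 72 longitude bins

    n_obs = np.zeros((36, 72))
    n_snow = np.zeros((36, 72))
    np.add.at(n_obs, (iy, ix), 1)
    np.add.at(n_snow, (iy, ix), flags[keep].astype(float))

    with np.errstate(invalid="ignore", divide="ignore"):
        return np.where(n_obs > 0, n_snow / n_obs, np.nan)
```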
9
Comparison with Snow Cover
Panels: AMSU – Jan 2006; Rutgers Snow Lab – Jan 2006
10
Verification
A. Dai (NCAR)
– J. Climate, 2001; COADS climatology; 15,000 surface station reports
– Can stratify by WMO weather codes (a filtering sketch follows this slide)
Grouping by all snow reports
– Huge discrepancies. Why? SW-, non-accumulating snow
Filtered by S/SW, SW+/S+, and temperature/visibility information from Rasmussen & Cole (2002)
– Better qualitative agreement
– Still not an apples-to-apples comparison: AMSU 4 times/day; COADS 24 times/day
– Does imply that AMSU has skill for these types of events
Some recent work by Liu with CloudSat
– Frequency of snow values comparable to these filtered data
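A minimal sketch of the kind of present-weather filtering described above, keeping only moderate-or-heavy snow reports. The specific WMO ww code sets below are assumptions for illustration; verify against the WMO present-weather table before use.

```python
# Assumed code sets (illustrative only): continuous/heavy snow and heavy snow showers
MODERATE_OR_HEAVY_SNOW = {72, 73, 74, 75, 86}
LIGHT_SNOW = {70, 71, 85}                      # assumed slight snow / slight showers

def keep_report(ww_code: int, light_ok: bool = False) -> bool:
    """True if this hourly report should count as a snowfall occurrence."""
    if ww_code in MODERATE_OR_HEAVY_SNOW:
        return True
    return light_ok and ww_code in LIGHT_SNOW

reports = [{"station": "KBWI", "ww": 73}, {"station": "KDCA", "ww": 71}]
snow_hits = [r for r in reports if keep_report(r["ww"])]   # keeps only the KBWI report
```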
11
AMSU (left) vs. COADS (right) – DJF and SON panels
12
Second Attempt – Daily Snow Reports
Are there denser surface networks that can be used?
Is there a better way to validate the 'spatial' patterns of the AMSU retrievals?
– Storm cases indicate 'skill'; how best to quantify it?
WMO/NCDC – "Global Surface Summary of the Day Data V7" (a parsing sketch follows this slide)
– 9,000 stations
– Gives weather occurrences, max/min temperature, precipitation totals
Effort led by Matt Sapiano (now at CIRA)
– 2000–08; N15, N16, N18 data
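For illustration, here is a minimal sketch of flagging GSOD station-days with reported snow. GSOD carries a six-character FRSHTT indicator (Fog, Rain, Snow, Hail, Thunder, Tornado), with snow/ice pellets as the third character; the record dictionary layout below is an assumption.

```python
def gsod_day_has_snow(frshtt: str) -> bool:
    """True when the daily FRSHTT indicator reports snow or ice pellets."""
    return len(frshtt) == 6 and frshtt[2] == "1"

daily_records = [
    {"station": "725030", "date": "2007-02-14", "frshtt": "001000"},  # snow day
    {"station": "722950", "date": "2007-02-14", "frshtt": "010000"},  # rain only
]
snow_days = [r for r in daily_records if gsod_day_has_snow(r["frshtt"])]
```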
13
Nine Years of Comparisons
High-resolution data still fairly sparse
– 1 or 2.5-deg grids better for comparison
GSOD > AMSU just about everywhere
– Probably due to very light snow
Let's look closer…
14
Example – GSOD vs. AMSU
Extrapolate GSOD onto a 1-deg grid
Color coding (a categorization sketch follows this slide)
– Green (Hit)
– Blue (False alarm on AMSU)
– Red (Miss) – could be rain
– Gray (Hit – no snow)
Qualitative assessment
– Overrunning snow – good
– Upper-low/backside snow – missed
Attempted quantitative assessment at different locations
– M. Nichols, HS student/intern
– Different locations/regimes
– Results inconclusive… time coincidence is a limitation
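The color coding maps directly to standard contingency-table categories. A minimal sketch, assuming boolean snow flags per 1-deg grid box per day (names are illustrative):

```python
def categorize(amsu_snow: bool, gsod_snow: bool) -> str:
    """Contingency category for one grid box on one day."""
    if amsu_snow and gsod_snow:
        return "hit"               # green
    if amsu_snow and not gsod_snow:
        return "false_alarm"       # blue: AMSU only (could be rain)
    if not amsu_snow and gsod_snow:
        return "miss"              # red
    return "correct_negative"      # gray: neither reports snow
```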
15
Current Attempt – Hourly Snow Reports
Although we tried to avoid this, it is really the only way to go…
H. Meng's effort: a huge database of hourly synoptic reports collocated with AMSU for several years and all satellites (a matching sketch follows this slide)
Data stratified by location, weather conditions, precipitation totals, etc.
– Still evaluating the data…
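A minimal sketch of the collocation step: keep satellite/station pairs within a distance and time window. The 50 km / 30 min windows and the record layout are illustrative assumptions, not the values used to build the actual database.

```python
import math
from datetime import timedelta

def great_circle_km(lat1, lon1, lat2, lon2):
    """Haversine distance in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = p2 - p1
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 6371.0 * 2 * math.asin(math.sqrt(a))

def collocate(amsu_obs, station_obs, max_km=50.0, max_dt=timedelta(minutes=30)):
    """Return (satellite, station) pairs that match in space and time."""
    pairs = []
    for sat in amsu_obs:
        for stn in station_obs:
            close = great_circle_km(sat["lat"], sat["lon"], stn["lat"], stn["lon"]) <= max_km
            near = abs(sat["time"] - stn["time"]) <= max_dt
            if close and near:
                pairs.append((sat, stn))
    return pairs
```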
16
Some Results – Jan 2007

Satellite   POD (1 mm/hr or greater)   POD (SN or SN+)
MOA         0.75                       0.70
N15         0.40                       0.25
N16         0.52                       0.42
N18         0.64                       0.58
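For reference, the probability of detection reported above is hits divided by hits plus misses over the collocated pairs. A minimal sketch (variable names are assumptions):

```python
def probability_of_detection(pairs) -> float:
    """POD = hits / (hits + misses) over (amsu_detected, surface_snow) pairs."""
    hits = sum(1 for amsu, surface in pairs if surface and amsu)
    misses = sum(1 for amsu, surface in pairs if surface and not amsu)
    return hits / (hits + misses) if (hits + misses) else float("nan")

# Example: 3 surface snow events, 2 detected by AMSU -> POD = 0.67
print(round(probability_of_detection([(True, True), (False, True), (True, True), (True, False)]), 2))
```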
17
Summary
Validation of the AMSU snowfall algorithm is a difficult task
The algorithm has known limitations; we are trying to couple physical phenomena to them
– Temperature/moisture profiles, surface conditions, precipitation intensity, etc.
We know that the algorithm has 'skill', but illustrating this has been a challenge. Why?
– Incompatibility between satellite and ground data – more severe than for rainfall
– Ground data are fairly scarce, and their quality is in question
The current method that should help answer this is direct matching between satellite and surface reports
Emerging work with CloudSat (and GV) should also be pursued