SFMR Performance Assessment

Paul Chang, NOAA/NESDIS
Jim Carswell, Remote Sensing Solutions

2009 Hurricane Conference, Tropical Prediction Center, Miami, FL
Objective

– Develop a tool to assess SFMR retrieval performance (post-mission and in real time).
– The tool needs to be self-contained: use only SFMR measurements and the data provided to SFMR in real time.
– Identify error sources (i.e., individual channels).
– Provide a quality assessment of the retrieved winds and rain.
– Potentially provide real-time corrections to address calibration issues.
– A plausible solution for the Air Force systems as well as NOAA's.

Note: Even small calibration errors can be significant and will lead to inconsistency between aircraft. Wind speed and rain errors are coupled, so both must be accounted for.

1 December 2009, NOAA Hurricane Conference, Tropical Prediction Center, Miami, FL
Inputs and Assumptions

Available information:
– Brightness temperature measurements.
– SFMR retrievals.
– Observed SFMR model-measurement difference (available through the retrieval process).
– SFMR ancillary data (e.g., altitude, attitude, SST).

Assumptions:
– The SFMR model function is valid.
SFMR Retrieval Process

Simplified retrieval block diagram (figure). SFMR retrieval steps:
1) Determine the initial / next guess (U10 and rain rate RR).
2) Calculate Tb using the SFMR model function.
3) Determine the difference between the measured and modeled Tb.
4) If the model and measurements agree within a specified threshold, the retrieval is complete; if not, determine the next guess and repeat.

SFMR retrieval products:
1) U10 and RR.
2) Error calculations.

The difference between the modeled and measured Tb is an assessment of retrieval-process accuracy and can provide insight into SFMR performance for each channel.
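The iterative loop in the block diagram can be sketched as below. This is a minimal illustration, not the operational processor: the forward model is a toy stand-in for the SFMR model function, and the initial guess, convergence threshold, and Gauss-Newton update are assumptions.

```python
import numpy as np

FREQS = np.array([4.74, 5.31, 5.57, 6.02, 6.69, 7.09])  # GHz, C-band channels

def forward_model(u10, rr):
    """Toy stand-in for the SFMR model function: Tb grows with wind
    speed and with rain rate (frequency-dependent rain term)."""
    return 120.0 + 0.6 * u10 + 1.5 * rr * (FREQS / 6.0) ** 1.5

def retrieve(tb_meas, tol=0.05, max_iter=50):
    """Steps 1-4 of the slide: guess (U10, RR), model Tb, compare with
    the measurements, and repeat until they agree within a threshold."""
    x = np.array([20.0, 5.0])                  # step 1: initial guess (U10, RR)
    for _ in range(max_iter):
        tb_mod = forward_model(*x)             # step 2: model function
        resid = tb_meas - tb_mod               # step 3: model-measurement diff
        if np.sqrt(np.mean(resid ** 2)) < tol: # step 4: agree within threshold?
            break
        J = np.empty((len(FREQS), 2))          # next guess via Gauss-Newton
        for j in range(2):
            xp = x.copy()
            xp[j] += 0.01
            J[:, j] = (forward_model(*xp) - tb_mod) / 0.01
        x = x + np.linalg.lstsq(J, resid, rcond=None)[0]
    return x, resid                            # products: (U10, RR) and errors
```

Note that the per-channel residual returned alongside the retrieval is exactly the model-measurement difference the calibration-assessment tool histograms.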
Histogram of Model-Measurement Tb Difference (Calibration Flight)

Attributes:
1) Should be zero-mean.
2) Insensitive to modeling error.
3) Each channel is shown separately.
4) Evaluates the fit to both wind speed (mean calibration) and rain rate (relative calibration).
5) The per-channel difference is already calculated by the retrieval process.

A small calibration error shifts the histogram peak away from zero and increases its width.
Interpretation / Tuning

Channel calibration error:
– The peak of the difference histogram will be non-zero.
– The width of the histogram increases with calibration error.
– Large errors in one channel will "leak" into the errors of adjacent channels.

Auto-calibration tuning:
– Apply a small bias correction to re-center the peak of each channel's error histogram at zero offset.
– The mean (across channels) bias correction can be set to zero to avoid introducing a mean bias.
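The tuning step above amounts to subtracting each channel's histogram offset while constraining the mean correction across channels to zero. A minimal sketch, assuming the residuals arrive as a samples-by-channels array and using the sample mean as the histogram-peak estimate:

```python
import numpy as np

def autocal_corrections(residuals):
    """Per-channel bias corrections from model-measurement Tb differences.

    residuals : (n_samples, n_channels) array of the differences already
    produced by the retrieval process. Each channel's histogram is
    re-centered at zero (sample mean as the peak estimate), and the mean
    correction across channels is forced to zero so no net bias is
    introduced."""
    per_channel_bias = residuals.mean(axis=0)
    corrections = -per_channel_bias
    corrections -= corrections.mean()   # zero-mean constraint across channels
    return corrections
```

With residuals offset by, say, (-0.1 K, +0.1 K, 0 K, 0 K), this yields corrections near (+0.1 K, -0.1 K, 0 K, 0 K), the form of correction applied on the next slide.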
Calibration Bias Correction

Applied bias correction: 0.1 K, -0.1 K, 0 K, 0 K.
– All channels now have zero mean.
– The width of the error histograms decreased (error from one channel "leaks" into adjacent channels).
– The histogram peak is now near 3000 points (it was ~1500).
ProSensing and AOC Retrievals on NOAA Aircraft Are Significantly Different

– ProSensing retrievals differ significantly.
– RFI was not present during this flight.
– RSS retrievals are at 1 second and thus noisier.
– RSS winds seem to follow the flight-level winds much better and show lower wind speeds.
– The RSS retrieval process is identical to NOAA's except that 10-second averaging has not been applied.
ProSensing and AOC Retrievals on NOAA Aircraft Are Significantly Different

– ProSensing retrievals are used to calculate the brightness temperature.
– The difference between the modeled and measured brightness temperature is shown.
– A significantly negative bias is present.
Calibration Drift During the Season
NOAA Calibration: 20080830 Flight
Corrected Calibration: 20080830 Flight

Correction: -0.3 K, -0.5 K, 0 K, 0 K
Other Calibration Issues
NOAA Calibration: 20080911 Flight

Correction: -0.3 K, -0.5 K, 0 K, 0 K
– Evidence of bi-modal behavior.
– The large negatively biased errors and the bi-modality indicate in-flight drift or a change.
Corrected Calibration: 20080911 Flight

Correction: -0.3 K, -0.5 K, 0 K, 0 K
– The bi-modal behavior could not be removed.
– Suspect the 5.57 GHz channel was having problems.
Retrievals: 20080911 Flight

Correction: -0.3 K, -0.5 K, 0 K, 0 K
Air Force 20090824 (SFMR 010)

Correction: -0.3 K, -0.5 K, 0 K, 0 K
Green: land contamination.
Air Force 20090901 (SFMR 010)

Correction: -0.3 K, -0.5 K, 0 K, 0 K
Summary

– A calibration-assessment tool has been created that could be deployed in real time to auto-correct calibration and detect instrument-drift problems on a per-channel basis.
– The AOC and Air Force SFMRs show significant calibration errors over the season, and these errors change.
– The current calibration process tunes to a fixed wind speed and neglects rain. A statistical approach must be used to achieve a more robust calibration.
– ProSensing retrievals differ from the NOAA retrievals; there must be some fundamental differences in the retrieval process. ProSensing retrievals also differ from the model function.
Thank You. Questions?
Effects of Calibration Errors

Calibration error introduced:
– 1 K total error (absolute sum of the individual channel errors).
– Thousands of different error realizations.

Mean retrieval error:
– Simulated Tb values created with the different calibration-error realizations.
– Retrieved wind and rain estimated using the SFMR retrieval processor.
– Mean error calculated over the realizations for each true wind speed / rain rate combination.

Warning thresholds:

Wind (kt)  Rain (mm/hr)  WS bias min/max (kt)  RR bias min/max (mm/hr)  Correlation (%)

Tropical Storm Force / Gale
   33         0            -12.6 /  8.9            0.0 / 4.6            -90.7
   33         5            -11.6 /  7.8           -3.7 / 2.4            -93.8
   33        10            -11.3 /  7.7           -1.9 / 1.7            -93.7
   33        20            -11.9 /  7.8           -1.2 / 1.2            -93.0
   33        30            -13.5 /  8.3            n/a / 1.1            -92.4
   33        40            -16.0 /  9.2           -1.1 / 1.1            -91.8
Storm
   50         0             -6.0 /  5.9            0.0 / 4.7            -89.3
   50         5             -5.5 /  5.2           -3.8 / 2.4            -94.4
   50        10             -5.6 /  4.9           -2.0 / 1.8            -94.2
   50        20             -5.8 /  5.0           -1.2 / 1.3            -93.4
   50        30             -6.3 /  5.4           -1.1 / 1.2            -92.8
   50        40             -7.2 /  6.0           -1.1 / 1.2            -92.4
Hurricane Category 1
   64         0             -4.3 /  4.6            0.0 / 4.9            -89.0
   64         5             -4.0 /  4.0           -4.3 / 2.6            -94.5
   64        10             -4.1 /  3.9           -2.1 / 1.9            -94.3
   64        20             -4.2 /  3.9           -1.3 / 1.4            -93.6
   64        30             -4.6 /  4.2           -1.2 / 1.2            -93.0
   64        40             -5.2 /  4.7           -1.2 / 1.3            -92.6
Hurricane Category 2
   83         0             -3.8 /  4.4            0.0 / 5.3            -88.6
   83         5             -3.5 /  3.9           -4.6 / 2.9            -94.7
   83        10             -3.5 /  3.6           -2.4 / 2.1            -94.6
   83        20             -3.6 /  3.8           -1.5 / 1.6            -93.8
   83        30             -4.0 /  4.2           -1.5 / 1.5            -97.3
   83        40             -4.5 /  4.6           -1.4 / 1.4            -92.9
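The Monte Carlo procedure behind these tables (random per-channel calibration errors normalized so their absolute values sum to 1 K, retrieval over many realizations, error statistics per true wind/rain combination) can be sketched as follows. The forward model, channel frequencies, and least-squares inversion here are illustrative stand-ins, not the actual SFMR model function or retrieval processor:

```python
import numpy as np

F_GHZ = np.array([4.74, 5.31, 6.02, 6.69])   # illustrative channel set

def forward_model(u10, rr):
    """Toy stand-in for the SFMR geophysical model function."""
    return 120.0 + 0.6 * u10 + 1.5 * rr * (F_GHZ / 6.0) ** 1.5

def retrieve(tb):
    """Invert the (linear) toy model for (U10, RR) by least squares."""
    A = np.column_stack([np.full(len(F_GHZ), 0.6),
                         1.5 * (F_GHZ / 6.0) ** 1.5])
    x, *_ = np.linalg.lstsq(A, tb - 120.0, rcond=None)
    return x

def error_stats(u10_true, rr_true, n_real=2000, seed=0):
    """Draw per-channel cal errors whose absolute values sum to 1 K,
    retrieve from the perturbed Tb, and collect the wind/rain errors:
    returns (min, max) wind bias, (min, max) rain bias, and the
    wind-rain error correlation, as in the warning-threshold table."""
    rng = np.random.default_rng(seed)
    tb_true = forward_model(u10_true, rr_true)
    w_err, r_err = [], []
    for _ in range(n_real):
        e = rng.normal(size=len(F_GHZ))
        e /= np.abs(e).sum()                  # |channel errors| sum to 1 K
        u, r = retrieve(tb_true + e)
        w_err.append(u - u10_true)
        r_err.append(r - rr_true)
    corr = np.corrcoef(w_err, r_err)[0, 1]
    return (min(w_err), max(w_err)), (min(r_err), max(r_err)), corr
```

Even in this toy setup the wind and rain errors come out strongly anti-correlated, consistent with the table: a positive Tb perturbation can be explained either by more wind or by more rain.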
Effects of Calibration Errors (continued)

Warning thresholds:

Wind (kt)  Rain (mm/hr)  WS bias min/max (kt)  RR bias min/max (mm/hr)  Correlation (%)

Hurricane Category 3
   96         0             -3.8 /  4.5            0.0 / 5.6            -88.6
   96         5             -3.7 /  4.1           -4.8 / 3.2            -94.8
   96        10             -3.6 /  3.8           -2.7 / 2.3            -94.8
   96        20             -3.7 /  3.8           -1.8 / 1.7            -94.0
   96        30             -4.1 /  4.1           -1.5 / 1.6            -93.5
   96        40             -4.5 /  4.6           -1.5 / 1.6            -93.2
Hurricane Category 4
  114         0             -3.9 /  4.8            0.0 / 6.2            -88.8
  114         5             -3.7 /  4.4           -4.9 / 3.6            -95.1
  114        10             -3.7 /  3.9           -3.3 / 2.7            -95.1
  114        20             -3.8 /  3.8           -2.1 / 2.0            -94.4
  114        30             -4.1 /  4.2           -1.8 / 1.8            -93.9
  114        40             -4.7 /  4.7           -1.8 / 1.9            -93.5
Hurricane Category 5
  135         0             -4.1 /  5.0            0.0 / 7.1            -88.8
  135         5             -3.9 /  4.7           -4.9 / 4.4            -95.7
  135        10             -3.9 /  4.1           -4.3 / 3.3            -95.5
  135        20             -4.0 /  4.1           -2.6 / 2.5            -94.9
  135        30             -4.3 /  4.4           -2.3 / 2.2            -94.4
  135        40             -4.5 /  4.7           -1.8 / 1.8            -92.9
Maximum Wind
  165         0             -4.5 /  6.2            0.0 / 9.2            -88.7
  165         5             -4.3 /  5.8           -4.9 / 6.2            -95.9
  165        10             -4.3 /  4.8           -8.1 / 4.8            -96.2
  165        20             -4.4 /  4.6           -4.2 / 3.6            -95.9
  165        30             -4.8 /  4.9           -3.6 / 3.3            -95.5
  165        40             -5.5 /  5.4           -3.5 / 3.4            -95.3