NUOPC National Unified Operational Prediction Capability: Update to COPC, 27 – 28 May 2015. Dave McCarren, NUOPC DPM.


1 NUOPC National Unified Operational Prediction Capability Update to COPC 27 – 28 May 2015 Dave McCarren, NUOPC DPM

2 NUOPC National Unified Operational Prediction Capability 2 Agenda
- National ESPC Update
- NUOPC UEO Committee Update
- NUOPC CMA Committee Update
- Questions and Discussion

3 NUOPC National Unified Operational Prediction Capability 3 The National Earth System Prediction Capability (National ESPC)

The National Earth System Prediction Capability ESPC 4 National ESPC Update
- National ESPC Strategy 2-pager created for appropriate distribution to get support for the project; sent to Liaisons for review and taken to the AMS meeting in Phoenix
- National ESPC staff drafting Strategy paper for submission to BAMS
- ESG meeting 26 Jan 2015
- ESG Principals +1 meeting hosted by Navy 5 May 2015

The National Earth System Prediction Capability ESPC 5 National ESPC Update
ESG meeting 26 Jan 2015 Action Items:
- Modify the proposed National ESPC Management Structure to show reporting relationships and communications. The Project Office will provide a proposal with a fully coordinated recommendation and options, with pros and cons.
- Coordinate a topical ESG meeting within 90 days for various topics; N2/6E to host. Semi-closed session: just staff and Principals + 1.
- Verify that the ESMF paper contains proper mention of National ESPC. Provide the ESG principals background on ESMF management, funding, and the requirements process.
- Coordinate specific sessions for National ESPC at upcoming conferences.
- Coordinate a response to the Presidential Executive Order on Coordination of National Efforts in the Arctic on the role of ESPC in Arctic prediction, including a potential National ESPC brief by the principals to OSTP. This response may include the possible development of an integrated Navy/NOAA operational Arctic Modeling Strategy.

The National Earth System Prediction Capability ESPC 6 National ESPC Update
ESG meeting 5 May 2015 draft Action Items:
- Revise the ESPC and NUOPC charters and create a new National ESPC Charter. Include recommendations from the Decision Brief on Management Structure/Plan and Personnel Issues.
- Work with the climate community to help define what "operational" means for products that support strategic decision making.
- Define components that will be part of National ESPC spiral 1.
- Coordinate AF and Navy collaboration to determine the DOD way ahead and converge on a common core.

National Unified Operational Prediction Capability NUOPC 7 UEO Committee Update

National Unified Operational Prediction Capability NUOPC 8 Unified Ensemble Operations
- NUOPC metrics from NCEP, FNMOC, and AFWA briefed at the 26 Jan ESG meeting
- ½ Degree Data Exchange Upgrade on schedule for Summer 2015 implementation at NCEP, CMC, and FNMOC
- NUOPC/NAEFS mini workshop in Feb: face-to-face meeting of the Co-Chairs of the UEO Committee; agenda focused on the development of NUOPC, including data exchange (of global and wave ensemble data), post-processing, ensemble week 3 & 4 forecasts, and future plans
- Research following a concern by FNMOC over the timing of the CMC raw ensemble data resulted in NCEP starting raw data processing 90 minutes earlier, so that raw and bias-corrected data are available much closer to real time. The AF moved up operational GEPS processing to take advantage of this; the GEPS suite had 5,000 unique users and about 4M product hits in the last 12 months

½ Degree Data Exchange Implementation Schedule (NCEP / CMC / FNMOC; earliest practical date for each stage at each center)
- Test hi-res data: Available
- Pre-production hi-res data: Jan 2015 / Sep-Oct 2014
- Production hi-res data: Apr 2015 / Nov-Dec 2014 / Oct 2014
- Ready to receive hi-res production data: Jan 2015
- Test hi-res NAEFS products: Mar 2015 / Apr 2015 (SFC-3HR)
- Pre-production hi-res NAEFS products: Apr 2015
- Production hi-res NAEFS products: May 2015 / Jul 2015 (SFC-3HR)

5-day forecast for surface wind (U) 10-day forecast for surface wind (U)

NH T2m

NA T2m

Northern Hemisphere Wind Speed 13 The spread ratio for wind speed was greater than one for the entire period (under-dispersive). GEM members are under-dispersive, while NAVGEM and GFS members are over-dispersive. There was a spike in May when NOGAPS was replaced by NAVGEM.

Ensemble Mean Northern Hemi – 2-Meter Temps 14 Ensemble mean in solid red; GFS members in dotted green, NAVGEM members in dotted blue, and GEM members in dotted purple. The solid green line is the GFS control member and the solid purple line is the CMC control member. We do not currently receive a NAVGEM control member. The ensemble mean has nearly no bias; GFS members have a cold bias, and NAVGEM members have a warm bias.

National Unified Operational Prediction Capability NUOPC 15 CMA Committee Update

National Unified Operational Prediction Capability NUOPC 16 NUOPC Common Model Architecture
- Draft manuscript for ESPS paper submitted to BAMS; CMA Committee revising first draft for 1 June deadline
- CMA Committee drafting new whitepaper on Using ESPS
- PI Group building a working Physics Driver prototype with a common physics interface to GFS physics, by June 2015, to address the NGGPS requirement to begin testing how it works and for use in the selection of a new Dynamic Core using a common set of physics
- PI Group presented modernized Kalnay Rules paper at AMS

National Unified Operational Prediction Capability NUOPC 17 Questions & Discussion

NUOPC Verification Metrics EMC/NCEP May 2015 Acknowledgement: Dr. Yan Luo For all three individual bias-corrected ensemble forecasts (NCEP/GEFS, CMC/GEFS and FNMOC/GEFS) and the combined (NUOPC) ensemble (equal weights), verified against UKMet analysis. Period: April 1st 2011 – May 2015
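As a rough illustration of the "equal weights" combination mentioned above, the sketch below simply pools bias-corrected members from the three centers into one ensemble; the member counts, grid size, and random stand-in fields are illustrative assumptions, not the operational NAEFS/NUOPC processing.

    import numpy as np

    # Hypothetical member counts and grid; random stand-in fields, not real forecasts.
    n_lat, n_lon = 181, 360
    gefs  = np.random.randn(20, n_lat, n_lon)   # bias-corrected NCEP/GEFS members
    cmc   = np.random.randn(20, n_lat, n_lon)   # bias-corrected CMC ensemble members
    fnmoc = np.random.randn(20, n_lat, n_lon)   # bias-corrected FNMOC ensemble members

    # "Equal weights": every member is pooled into one combined ensemble, so each
    # member (and, with equal counts, each center) contributes equally to any
    # probability or ensemble-mean product derived from it.
    nuopc = np.concatenate([gefs, cmc, fnmoc], axis=0)
    print(nuopc.shape)               # (60, 181, 360) combined NUOPC ensemble
    print(nuopc.mean(axis=0).shape)  # ensemble-mean field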

Northern Hemisphere 500hPa height: 30-day running mean scores for the day-5 forecast. Panels: NH 500hPa CRP scores, NH 500hPa RMS errors and ratio of RMS error over spread (under-dispersion vs. over-dispersion), and NH 500hPa anomaly correlation. All other regions could be seen from: aefs/VRFY_STATS/NUOPC_bc_COMB_spr2015_ts.html
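The CRP (continuous ranked probability) score plotted in these panels can be estimated directly from the ensemble members and the verifying analysis; below is a minimal sketch using the standard sample estimator CRPS = mean|x_i - y| - 0.5 * mean|x_i - x_j|. The function and example values are illustrative and are not the EMC verification code.

    import numpy as np

    def crps_sample(members, obs):
        """Sample-based CRPS for one forecast point: mean absolute error of the
        members about the observation minus half the mean absolute difference
        between member pairs. Lower is better."""
        x = np.asarray(members, dtype=float)
        term1 = np.mean(np.abs(x - obs))
        term2 = 0.5 * np.mean(np.abs(x[:, None] - x[None, :]))
        return term1 - term2

    # Example: a 5-member 500 hPa height forecast (m) verified against an analysis value
    print(crps_sample([5520.0, 5530.0, 5545.0, 5550.0, 5560.0], 5540.0))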

Northern Hemisphere 500hPa height: 30-day running mean scores for the day-10 forecast. Panels: NH 500hPa CRP scores, NH 500hPa RMS errors and ratio of RMS error over spread, and NH 500hPa anomaly correlation. All other regions could be seen from: fs/VRFY_STATS/NUOPC_bc_COMB_spr2015_ts.html

5-day forecast for surface temperature. Panels: NH CRP scores, NH RMS errors, NA CRP scores, NA RMS errors.

10-day forecast for surface temperature

5-day forecast for surface wind (U) 10-day forecast for surface wind (U)

5-day forecast for surface wind (V) 10-day forecast for surface wind (V)

Last winter statistics scores: Dec. 1st 2014 – Feb. 28th 2015

Northern Hemisphere 500hPa height: latest 3-month winter scores. Panels: NH CRPS skill scores, NH RMS errors and ratio of RMS error over spread, and NH anomaly correlation (500hPa height). All other regions/scores could be seen from: UOPC/NUOPC_bc_win1415.html
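For reference, the anomaly correlation shown on these slides measures how well the forecast's departures from climatology match the analysis's departures; here is a minimal sketch with made-up fields and a flat stand-in climatology (no area weighting, unlike a full operational computation).

    import numpy as np

    def anomaly_correlation(forecast, analysis, climatology):
        """Centered anomaly correlation of forecast vs. analysis anomalies
        (departures from climatology) over a grid."""
        f = forecast - climatology
        a = analysis - climatology
        f = f - f.mean()
        a = a - a.mean()
        return np.sum(f * a) / np.sqrt(np.sum(f**2) * np.sum(a**2))

    # Stand-in 500 hPa height fields (m) on a small grid
    clim = np.full((10, 10), 5500.0)
    fcst = clim + 50.0 * np.random.randn(10, 10)
    anal = clim + 50.0 * np.random.randn(10, 10)
    print(anomaly_correlation(fcst, anal, clim))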

Surface temperature. Panels: NH CRPS skill scores, NH RMS errors, NA CRPS skill scores, NA RMS errors.

Surface wind (U): NH CRPS skill scores, NH RMS/Spread scores. Surface wind (V): NH CRPS skill scores, NH RMS/Spread scores.

Comparison of NCEP/GEFS, NAEFS and NUOPC for all bias-corrected ensemble forecasts, against UKMet analysis. Period: April 1st 2011 – May 15th 2015

Day-5 NH 500hPa height Day-10 NH 500hPa height

NH T2m

NA T2m

Aim High…Fly, Fight, Win NUOPC Ensemble Performance 1 Jan 2015 – 1 Apr 2015 Mr. Bob Craig 16 WS/WXN

Aim High…Fly, Fight, Win Overview 34
- Method
- Brier Skill Scores
- Spread
- Ensemble Mean

Aim High…Fly, Fight, Win Method 35
- All models verified against the 25km UKMO analysis, except for total cloud cover, which uses WWMCA
- Data shown is mainly for the northern hemisphere domain, with the ensemble containing Navy NAVGEM, CMC GEM, and NCEP GFS members
- On ensemble mean slides, ALL refers to the GEPS ensemble
- For skill scores, climatology was used as the reference forecast

Aim High…Fly, Fight, Win Brier Skill Score 36
- The Brier Score (BS) for ensembles is analogous to the root mean square error for deterministic models
- Model probabilities for an event are compared to the relative frequency of the event
- To get the skill score, the BS is compared to climatology
- Brier skill scores greater than 0 indicate skill
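A minimal sketch of the calculation just described, with made-up probabilities: the Brier Score is the mean squared difference between the forecast probability and the 0/1 outcome, and the skill score compares it to a climatological reference forecast. Names and numbers are illustrative, not the 16 WS verification code.

    import numpy as np

    def brier_score(prob, outcome):
        """Brier Score: mean squared error of probability forecasts.
        prob: forecast probabilities in [0, 1]; outcome: 1 if the event occurred, else 0."""
        prob = np.asarray(prob, dtype=float)
        outcome = np.asarray(outcome, dtype=float)
        return np.mean((prob - outcome) ** 2)

    def brier_skill_score(prob, outcome, clim_prob):
        """BSS = 1 - BS / BS_climatology; values greater than 0 indicate skill over climatology."""
        bs = brier_score(prob, outcome)
        bs_clim = brier_score(np.full(len(prob), clim_prob), outcome)
        return 1.0 - bs / bs_clim

    # Example: probabilities that total cloud cover exceeds 80% at a handful of points
    p   = [0.9, 0.2, 0.7, 0.1, 0.6]
    obs = [1,   0,   1,   0,   0  ]
    print(brier_skill_score(p, obs, clim_prob=0.4))   # 0.4 is a hypothetical climatological frequency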

Aim High…Fly, Fight, Win Brier Skill Score: Total Cloud > 80% 37 Looking at the CONUS domain, significant skill is indicated out to about 108 hrs. For the global and hemisphere domains, little significant skill is indicated. Some reasons for this can be seen on the next slide.

Aim High…Fly, Fight, Win Typical Total Cloud Forecast: Total Cloud > 80% 38 Black squares are non-zero probabilities and green "1"s are observation hits. Over high latitudes there tend to be fewer hits compared to lower latitudes. Cloud coverage tends to be over-forecast.

Aim High…Fly, Fight, Win Brier Skill Score: Wind Speed > 25 kts 39 For wind speeds >= 25 kts, the global and northern hemisphere domains have significant skill out to 240 hours. Higher wind speeds (> 35 and > 50 kts) also have skill, but an insignificant number of events resulted in much larger error bars.

Aim High…Fly, Fight, Win Ensemble Spread 40
- Spread measures the difference between the ensemble mean and the individual ensemble forecasts
- Typically the RMSE of the ensemble mean is compared to the standard deviation of the members from the mean (RMSE/Stdev)
- An ideal ensemble has member deviations from the mean of the same size as the root mean square error of the ensemble mean
- A spread ratio greater than one means the ensemble does not contain enough spread in the members to account for the errors in the ensemble system
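To make the RMSE-versus-spread comparison above concrete, here is a minimal sketch with stand-in random data and hypothetical array shapes; operational verification adds details such as area weighting, so this only illustrates the ratio being described.

    import numpy as np

    # Stand-in ensemble: 60 members verifying at 500 grid points (made-up data)
    members  = 1.2 * np.random.randn(60, 500)
    analysis = np.random.randn(500)

    ens_mean     = members.mean(axis=0)
    rmse_of_mean = np.sqrt(np.mean((ens_mean - analysis) ** 2))           # error of the ensemble mean
    spread       = np.sqrt(np.mean(members.std(axis=0, ddof=1) ** 2))     # member std dev about the mean

    spread_ratio = rmse_of_mean / spread
    print(spread_ratio)   # > 1 suggests under-dispersion, < 1 suggests over-dispersion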

Aim High…Fly, Fight, Win Northern Hemisphere 500mb Heights 41 The first chart depicts the spread ratio of the ensemble members compared to the ensemble mean over the last year for the 120-hour forecast. Green represents the spread ratio of the combined ensemble (ALL), compared to the spread ratios for GFS (purple), Navy (red), and CMC (blue). Ideally, the spread ratio should be around one; a spread ratio greater than one indicates the ensemble is under-dispersive, and the opposite is true for a spread ratio less than one. The second chart depicts the ensemble spread by forecast hour: the ensemble starts off under-dispersive at early forecast hours and improves at later forecast hours, with the green line approaching one.

Aim High…Fly, Fight, Win Northern Hemisphere Wind Speed 42 The spread ratio for wind speed was greater than one for the entire period (under-dispersive). GEM members are under-dispersive, while NAVGEM and GFS members are over-dispersive. There was a spike in May when NOGAPS was replaced by NAVGEM.

Aim High…Fly, Fight, Win Northern Hemisphere Temperature 43 The spread ratio was above 1 (under-dispersive) for most of the winter but approached 1 as spring arrived. Over the last several months, the spread starts under-dispersive at early forecast hours but becomes neutral by the end of the period.

Aim High…Fly, Fight, Win Ensemble Mean 44
- The ensemble mean is the average value of the ensemble members
- For a good ensemble, the ensemble mean error should be less than the error of the individual ensemble members
- One ensemble member from each center is the control; it is configured like that center's deterministic model, though its resolution is degraded
- We do not currently get a control member from NAVGEM, but that will be added when available
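A small sketch of why the ensemble mean error is expected to be lower: averaging the members filters out the parts of their errors that are uncorrelated, so the RMSE of the mean is typically smaller than a typical member's RMSE. The data and shapes below are made up for illustration.

    import numpy as np

    truth = np.random.randn(1000)
    # 20 hypothetical members: truth plus independent errors
    members = truth + np.random.randn(20, 1000)

    member_rmse = np.sqrt(np.mean((members - truth) ** 2, axis=1))          # per-member error
    mean_rmse   = np.sqrt(np.mean((members.mean(axis=0) - truth) ** 2))     # error of the ensemble mean

    print(member_rmse.mean())   # ~1.0 for this toy setup
    print(mean_rmse)            # noticeably smaller, since independent errors average out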

Aim High…Fly, Fight, Win Ensemble Mean Northern Hemi – 2-Meter Temps 45 Ensemble mean in solid red; GFS members in dotted green, NAVGEM members in dotted blue, and GEM members in dotted purple. The solid green line is the GFS control member and the solid purple line is the CMC control member. We do not currently receive a NAVGEM control member. The ensemble mean has nearly no bias; GFS members have a cold bias, and NAVGEM members have a warm bias.

Aim High…Fly, Fight, Win Ensemble Mean Northern Hemi – 10-Meter Winds 46 GFS members have the lowest error and NAVGEM members have the highest. The ensemble mean has a positive bias, and the NAVGEM members seem to be pulling it that way.

Aim High…Fly, Fight, Win Ensemble Mean Northern Hemi – 500mb Hgts 47 NAVGEM members have the highest error out to 168 hrs; then all members are similar. Ensemble mean bias is near 0 out to about 168 hours, then starts to rise as GEM and NAVGEM members pull it positive.

Aim High…Fly, Fight, Win Ensemble Mean Northern Hemi – 250mb Winds 48 All three models' members are close for this field. The ensemble mean has a negative bias, with all model members contributing. GFS members show a discontinuity in their bias after 192 hrs.

Aim High…Fly, Fight, Win Ensemble Mean Northern Hemi – 850mb Temps 49 GFS members have slightly lower error in early forecast hours. The ensemble mean has a cold bias, with GEM and GFS members pulling it down.

Aim High…Fly, Fight, Win Summary 50
- The GEPS ensemble system is performing as expected
- Brier skill scores for wind speed indicated the GEPS ensemble has skill out to 240 hrs
- Precipitation forecasts weren't included here due to a recent problem with UM GRIB files
- The GEPS ensemble spread score tends to be greater than 1, with mean RMSE > member standard deviation, indicating the ensemble members are under-dispersed, which is a common ensemble issue

Aim High…Fly, Fight, Win Summary 51 Ensemble mean forecasts have lower errors than the members, which is expected for correctly designed ensembles.