1 IFPS Science Steering Team Forum November 8th, 2005

2 ISST Membership
Greg Mann – WFO Detroit (CR) – Lead
Jim Nelson – WFO Anchorage (AR) – Backup Lead
Brad Colman – WFO Seattle (WR)
Tom Salem – WFO Glasgow (WR)
Bill Ward – Pacific Region Headquarters
Steve Keighton – WFO Blacksburg (ER)
Dan St. Jean – WFO Gray (ER)
Jeffrey Medlin – WFO Mobile (SR)
Ken Falk – WFO Shreveport (SR)
Lee Anderson – NWSHQ / OS&T – Facilitator

3 Overview
ISST Roadmap
RTMA / AoR
Gridded MOS
Day 4-7 assessment
Digital Forecast Process
Open Discussion

4 ISST Roadmap

5 Roadmap
To maintain continuity and productivity, the team has devised a roadmap of team activities, categorized as follows:
–Immediate
–Influencing
–Monitoring
–Visionary
–Supporting
–Ongoing

6 Immediate
Days 4-7 assessment
–Given the wide variety of sources available for generating extended forecast grids, a comprehensive assessment of guidance performance is needed to support decision makers. The ISST will encourage and support a formal WFO/HPC/EMC Days 4-7 grid assessment through the development of a suite of objective metrics for the various sources of guidance/grid initializations.
RTMA field assessment
–This assessment has several overarching objectives: to expose forecasters to real-time objective NDFD-matching gridded analyses; to facilitate early development of WFO tools and approaches that will utilize the mature RTMA/AOR in the preparation and critique of forecasts; and to provide subjective field input to EMC to identify weaknesses and biases that are undoubtedly present in the early baseline versions of the RTMA.
Digital Forecast Process
–Provide both broad ideas and specific input on issues related to improving the scientific validity of methods across the entire digital forecast process. This item is a larger umbrella under which many other ISST roadmap items fall, and it is at the heart of our charter. The more immediate goal is to incorporate field input and ISST opinion on key questions aimed at better defining the digital forecast process the NWS should be aiming for, and to produce a white paper summarizing these ideas and offering suggestions on how to get there. The longer-term goal is to push for and support specific activities that would help the NWS achieve the desired DFP state, and to periodically revisit what this should look like as resources and needs change.

7 Immediate
Gridded MOS field assessment
–To assess whether gridded MOS fields are beneficial to forecast operations. The assessment would also look at how forecasters use the gridded MOS fields.
GFE system enhancements (real-time feedback, integrity, QC)
–Continue to support ESRL/GSD (FSL) efforts to develop the GFE infrastructure, creating enhancements that ultimately ease the forecaster workload and produce better, scientifically sound products.
Centralized bias correction and smart initialization of grids
–Influence the incorporation of an effective bias-correction scheme and enact smart initialization at a central location. This will reduce the overhead related to ifpInit in the field offices and ensure a consistent application of smart initialization across the entire domain. A sketch of one possible bias-correction approach follows.
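
The slides do not specify the central bias-correction scheme, so the following is only a minimal sketch of one common approach, a decaying average of forecast-minus-analysis errors; the weight, grid shape, and function names are illustrative assumptions, not the operational design.

```python
import numpy as np

def update_bias(bias, forecast, analysis, w=0.05):
    # Decaying-average bias estimate, updated once per forecast cycle.
    # ASSUMPTION: the actual scheme the ISST would influence is not
    # specified in the slides; this is one common choice.
    error = forecast - analysis           # gridded forecast error this cycle
    return (1.0 - w) * bias + w * error   # slowly evolving bias field

def correct(forecast, bias):
    # Subtract the running bias estimate from the raw model grid.
    return forecast - bias

# One bias grid would be maintained per element and lead time at the
# central site; illustrative 5-km CONUS-like grid dimensions:
bias = np.zeros((689, 1073))
```

Run centrally, this keeps every office's guidance corrected against the same error history, which is the consistency argument the bullet makes.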

8 Influencing
OCONUS AoR
–Monitor CONUS AoR actions and develop an OCONUS requirements white paper to implement an AoR in the OCONUS.
Short Term Forecast Methods/Tools
–Develop a loose framework of methods and tools resulting in more efficient and accurate short-term gridded forecasts.

9 Monitoring
NWS Concept of Operations changes (impacts on DFP)
–Facilitate discussion of evolving the gridded forecast paradigm within the proposed restructured NWS Concept of Operations.

10 Visionary
Probabilistic/uncertainty (including tropical fields)
–Transition the NDFD and digital services from a highly deterministic set of forecasts to one with a better balance of probabilistic and uncertainty information alongside the highly detailed geoclimatic information.
Future elements and required support guidance
–Recommend and encourage the creation of new grids (either forecast or guidance) that will make the forecast process more efficient, accurate, and scientifically correct. Also, increase the amount of information produced in the gridded database so it is useful to all partners and the public.
LDFD / RDFD / NDFD
–Explore the advantages of a multi-tiered digital forecast database within the framework of a scientifically sound digital forecast process, also considering what makes the most sense for users from the national to the local level. Explore answers to the following kinds of questions: 1) What would be the differences between a local, regional, and national database in terms of resolution, elements, forecast projections, update frequency, products generated from them, etc.? 2) What entities within the NWS would be responsible for contributing to each of them? 3) How would they be interconnected? In answering some of these questions, we have to consider how these multiple tiers can most efficiently be produced within the current structure of the NWS, as well as a potential future structure (this is therefore tied in heavily with “CONOPS Changes”, “DFP”, and several other ISST roadmap issues).
Role of local models
–Identify and explore the utility of local models in the augmentation and enhancement of the gridded forecast process.

11 Supporting
Input on Digital Services Directives Transition (10-23xx)
–Ensure scientific integrity for each weather element in the 10-23xx directive series.
Gridded MOS
–Support the efforts of MDL and ESRL/GSD in getting gridded MOS into GFE.
Gridded Verification
–Encourage the creation of gridded verification to allow forecasters to assess their skill at forecasting in the gridded world. Also, encourage forecasters to look at their verification scores on a grid and improve on their forecasts.
CONUS AoR
–Support development and execution of the RTMA/AoR, including a strong advocacy position.
Smart Tool/Init Team
–Support the role of the ST/SIT through direct correspondence and quarterly updates from the Team Chair.
ESRL/GSD & MDL projects (as requested)
–Act as a sounding board/testbed/advisory committee to ESRL/GSD & MDL on an as-needed, as-requested basis.
Climatology
–Maintain and improve the current climatology grids for GFE.

12 Ongoing
Input on directives
Periodic review of roadmap (quarterly)
Training

13 Delivery Vehicles
Forums
Briefings (Corp Board S&T Sub-Committee, DS-PAC, Regions, MICs, SOOs, etc.)
Input to DS-PAC
Field surveys
White papers
One-pagers or memos
Form new teams
Ex-officio membership on other teams
Dialogue with OS&T Director, regions, SOOs, etc.
Input on training requirements

14 RTMA / AOR

15 The RTMA/AOR Team
NWS AOR IWT:
–Lee Anderson (co-chair), OST
–Brad Colman (co-chair)
–Fred Branski, OCIO
–Geoff DiMego, NCEP EMC
–Brian Gockel, OST MDL
–Dave Kitzmiller, OHD
–Chuck Kluepfel, OCWWS
–Art Thomas, OCWWS
–Bill Ward, NWS PR
–Al Wissman, OOS
NCEP/EMC:
–Geoff DiMego
–Ying Lin
–Manuel Pondeca
John Horel, University of Utah
Bob Aune, NESDIS
Other partners and supporters:
–NCDC, NESDIS
–OAR, FSL, OST SEC
–AMS

16 Mesoscale Analysis Committee (MAC)
Robert Aune, NOAA/NESDIS, University of Wisconsin Space Science and Engineering Center
Stanley Benjamin, Forecast Systems Laboratory
Craig Bishop, Naval Research Laboratory
Keith A. Brewster, Center for Analysis and Prediction of Storms, The University of Oklahoma
Brad Colman (Committee Co-chair), NOAA/National Weather Service -- Seattle
Christopher Daly, Spatial Climate Analysis Service, Oregon State University
Geoff DiMego, NOAA/National Weather Service, National Centers for Environmental Prediction
Joshua P. Hacker, National Center for Atmospheric Research
John Horel (Committee Co-chair), Department of Meteorology, University of Utah
Dongsoo Kim, National Climatic Data Center
Steven Koch, Forecast Systems Laboratory
Steven Lazarus, Florida Institute of Technology
Jennifer Mahoney, Aviation Division, Forecast Systems Laboratory
Tim Owen, National Climatic Data Center
John Roads, Scripps Institution of Oceanography
David Sharp, NOAA/National Weather Service -- Melbourne
Ex Officio:
Andy Edman, Science & Technology Committee representative
LeRoy Spayd, Meteorological Services Division representative
Gary Carter, Office of Hydrology representative
Kenneth Crawford, COOP/ISOS representative

17 Goal: A comprehensive set of the best possible analyses of the atmosphere at high spatial and temporal resolution with particular attention placed on weather and climate conditions near the surface

18 Brief Background and Motivation
WR SOO/DOH IFPS White Paper recommendations:
–Develop a national real-time, gridded verification system
–Produce objective, bias-corrected model grids for WFO use
2003: S&T Committee endorsed the concept of NDFD-grid-matching analyses of forecast parameters
June 2004: Community summit to assess requirements and capabilities
August 2004: OST Director established the MAC (Mesoscale Analysis Committee)
October 2004: MAC recommended NOAA develop and implement a suite of consistent sensible-weather analysis products using current and future technologies:
–Develop a strategy for a prototype AOR, or proof of concept (the RTMA -- Real-Time Mesoscale Analysis)
–Provide mesoscale analyses hourly and within 30 minutes of valid time
–Pursue an archive-quality analysis (and a complete historical re-analysis)

19 Why does the ISST see this as a critical effort?
WR SOO/DOH IFPS White Paper recommendations:
–Develop a national real-time, gridded verification system
–Provide full-resolution NCEP model grids
–Produce objective, bias-corrected model grids for WFO use
–Implement methods to objectively downscale forecast grids
–Incorporate climatology grids into the GFE process
–Deliver short- and medium-range ensemble grids
–Produce NDFD-matching gridded MOS
–Modify the GFE software to ingest real-time data
–Optimize ways to tap forecaster expertise
Real-time, and delayed-mode, analyses will be used to:
–Verify NWS gridded forecasts
–Support model post-processing for bias removal
–Support forecast guidance development efforts (gridded MOS, etc.)
–Enhance forecasters’ situational awareness
–Serve as a logical starting point for short-term gridded forecasts
–Develop a more robust climatological database for various studies and applications

20 Program will be executed in three phases
Phase I – Real-Time Mesoscale Analysis (RTMA)
–Hourly within 30 minutes
–Prototype, or proof of concept, for the AOR
–NCEP, FSL, and NESDIS volunteered to build the first phase
–Matures into a quality real-time analysis component
Phase II – Analysis of Record
–State-of-the-science analysis (best possible)
–Delayed for late-arriving data assets
–Methodology to be determined (likely a community effort)
–Accepted ‘truth’ for use in studies and verification
Phase III – Reanalysis
–Apply the mature AOR methodology retrospectively
–30-year time history of AORs

21 RTMA logistics and timeline
Experimental grids are now complete
–Hourly, 5-km NDFD grid, GRIB2
EMC objective evaluation and comparison
Field assessment early CY2006
Operational at NCEP Q3 FY2006
Distribution of analyses and estimates of analysis error/uncertainty via the AWIPS SBN as part of the OB7 upgrade – end of FY2006
Archived at NCDC

22 Initial RTMA products
Products transmitted hourly via SBN to AWIPS for field offices:
–Temperature (2 m)
–Dew point temperature (2 m)
–Wind (10 m, direction and speed)
–Quantitative precipitation (Stage II)
–Sky cover
Also includes estimates of analysis error
–Spatially and temporally varying
–Reflects observation density, observation quality, and background quality
–Also reflects representativeness of observations
Additional products (e.g., max. temp.) developed and provided later
RTMA information archived at NCDC

23 RTMA Methodology
Temperature, dew point, and wind elements
–RUC forecast/analysis (13 km) is downscaled by FSL to the 5-km NDFD grid
–The downscaled RUC is then used as the first guess in NCEP’s 2DVar analysis of ALL surface observations (a toy sketch follows)
–Estimate of analysis error/uncertainty
Precipitation – NCEP Stage II analysis
Sky cover – NESDIS GOES sounder effective cloud amount
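
To make the first-guess-plus-observations step concrete, here is a deliberately minimal, single-pass sketch in the style of successive corrections. The operational GSI-2DVar minimizes a full variational cost function with anisotropic covariances; every name, weight, and length scale below is an illustrative assumption.

```python
import numpy as np

def analyze(background, obs, obs_ij, sigma_b=1.0, sigma_o=1.0, L=5.0):
    # Toy surface analysis: blend a first-guess grid with point obs.
    #   background : first-guess grid (e.g., RUC downscaled to 5-km points)
    #   obs        : observed values; obs_ij: their (row, col) locations
    # NOTE: single-pass successive-corrections sketch, NOT the GSI-2DVar.
    ny, nx = background.shape
    jj, ii = np.mgrid[0:ny, 0:nx]
    analysis = background.astype(float)
    for (oj, oi), y in zip(obs_ij, obs):
        innov = y - background[oj, oi]                # obs minus first guess
        corr = np.exp(-((jj - oj)**2 + (ii - oi)**2) / (2.0 * L**2))
        gain = sigma_b**2 * corr / (sigma_b**2 + sigma_o**2)
        analysis = analysis + gain * innov            # spread the innovation
    return analysis
```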

24 Downscaled 2-m Temperature (figure: original 13-km field vs. downscaled 5-km field)

25 NCEP obtains the full complement of observations (figure: ALL surface obs = 89,126 total)

26 NCEP GSI-2DVar: anisotropic covariance functions (figure)

27 Wind-following auto-correlation function for moisture for a test point located at (x=140, y=180) (figure; illustrative sketch below)
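
As an illustration of what “wind-following” means, here is a hedged sketch of an anisotropic Gaussian correlation elongated along the local wind; the functional form and length scales are assumptions, not the GSI formulation.

```python
import numpy as np

def wind_following_corr(dx, dy, u, v, L_along=8.0, L_cross=3.0):
    # Anisotropic auto-correlation, elongated along the local wind.
    #   dx, dy           : grid-point offsets from the test point
    #   u, v             : wind components defining the along-flow axis
    #   L_along, L_cross : correlation lengths along/across the flow
    # Illustrative only; GSI builds its anisotropic covariances differently.
    speed = np.hypot(u, v) or 1.0      # avoid division by zero in calm air
    s = (dx * u + dy * v) / speed      # along-flow component of the offset
    n = (-dx * v + dy * u) / speed     # cross-flow component of the offset
    return np.exp(-0.5 * ((s / L_along)**2 + (n / L_cross)**2))

# A point 5 grid lengths downwind stays well correlated with the test point:
print(wind_following_corr(5.0, 0.0, u=10.0, v=0.0))   # ~0.82
```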

28 Very first examples!

29 NCEP RTMA Precipitation Analysis
NCEP Stage II (real-time) and Stage IV (delayed) precipitation analyses are produced on the 4-km Hydrologic Rainfall Analysis Project (HRAP) grid
The existing multi-sensor (gauge and radar) Stage II precipitation analysis is available 35 minutes past the hour
For the RTMA it is mapped to the 5-km NDFD grid and converted to GRIB2 (a remapping sketch follows)
(figure: original grid vs. NDFD GRIB2 grid)
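
The remapping step could look like the hedged sketch below. The real conversion works in the grids’ native map projections (polar-stereographic HRAP to the Lambert conformal 5-km NDFD grid); here both grids are assumed to be already expressed in a common projected (x, y) space, and nearest-neighbor lookup is chosen so precipitation amounts are transferred without smoothing.

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

def regrid_nearest(src, src_x, src_y, dst_x, dst_y):
    # Map a precipitation grid onto another grid by nearest-neighbor lookup.
    # ASSUMPTION: src_x/src_y and dst_x/dst_y are 1-D, ascending coordinates
    # in a common projected space; the projection handling is omitted.
    interp = RegularGridInterpolator(
        (src_y, src_x), src, method="nearest",
        bounds_error=False, fill_value=0.0)
    X, Y = np.meshgrid(dst_x, dst_y)
    pts = np.stack([Y.ravel(), X.ravel()], axis=-1)
    return interp(pts).reshape(X.shape)
```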

30 Hourly Gages Available for Stage II Precipitation Analysis

31 Sky Cover: Effective Cloud Amount
Effective Cloud Amount (ECA, %) derived from the GOES sounder
Mapped onto the 5-km NDFD grid
Converted to GRIB2 for the NDGD
Robert Aune, NESDIS (Madison, WI)
(figures: GOES-12 IR image (11 µm); ECA derived from GOES-12; ECA from the GRIB2 file on the 5-km grid)

32 Evaluation plans (Nov-Jun)
NCEP will establish a webpage displaying the RTMA
NCEP will attempt to acquire gridded forms of:
–ADAS from the University of Utah / WR
–STMAS from FSL (Steve Koch)
–MatchOb from AWIPS
NCEP will conduct a cross-validation analysis (sketched below)
Steve Lazarus (ADAS expert) will help with intercomparison of the 2DVar with other analyses
NCEP and the ISST will conduct a field assessment
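
“Cross validation” here presumably means withholding observations and scoring the analysis against them. The following leave-one-out sketch is a generic illustration built around the toy analyze() function from the slide-23 sketch, not NCEP’s actual procedure.

```python
import numpy as np

def cross_validate(background, obs, obs_ij, analyze):
    # Leave-one-out cross validation of an analysis scheme: withhold each
    # observation in turn, re-run the analysis with the rest, and score
    # the analysis at the withheld location.
    errors = []
    for k in range(len(obs)):
        keep = [i for i in range(len(obs)) if i != k]
        ana = analyze(background,
                      [obs[i] for i in keep],
                      [obs_ij[i] for i in keep])
        oj, oi = obs_ij[k]
        errors.append(ana[oj, oi] - obs[k])   # analysis minus withheld ob
    errors = np.asarray(errors)
    return errors.mean(), np.sqrt((errors ** 2).mean())   # bias, RMSE
```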

33 RTMA Field Assessment
ISST will work with EMC and Kirby Cook (WR) to distribute initial fields to test offices
Grids will be displayable in D2D and ingested into IFPS/GFE
Offices will provide feedback on pluses/minuses – i.e., help define the gap between this proof of concept and the needed operational quality
Several offices from each CONUS region will participate
Up and running around the first of the year
Results will feed back into EMC development
Will provide test grids for developing gridded verification interfaces (BoiVer and FSL)

34 Phase II: Analysis of Record
The RTMA (improved version) will remain in place to support the near-real-time needs of forecasters
Considerable research, development, and operational infrastructure (computing) are required for a complete suite of analysis products of sufficient accuracy to verify the NDFD
The best possible analysis will require 3D- or 4D-Var (the standard cost function is sketched below)
–Model needed to move information through space and time – WRF
–Analysis capable of using ALL observations – GSI
–Later data cut-off to acquire all obs – 18-48 hours
–Should use observed precipitation to drive the evolution of the land-state lower boundary condition
The combination of 3D-Var and a full-physics model requires substantial additional computing resources
Initial steps started to address NWS AOR funding for FY2008 and beyond, emphasizing NCEP (OSIP, PPBES, etc.)
Additional funding sources need to be identified for broader community efforts to develop the next-generation AOR
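
For reference, the standard 3D-Var cost function such an analysis minimizes (textbook data assimilation, not an AOR-specific formulation):

```latex
J(\mathbf{x}) = \tfrac{1}{2}\,(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
              + \tfrac{1}{2}\,\bigl(H(\mathbf{x})-\mathbf{y}\bigr)^{\mathsf T}\mathbf{R}^{-1}\bigl(H(\mathbf{x})-\mathbf{y}\bigr)
```

Here x_b is the background (first guess), y the observations, H the observation operator, and B and R the background- and observation-error covariances; 4D-Var wraps a model integration inside H, which is what moving information through space and time with WRF refers to, and why the computing demands grow so sharply.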

35 Gridded MOS

36 Why do we need a gridded MOS?
When will it be available?
What will gridded MOS include?
How will we know if it is worth the effort?
How do I learn more about gridded MOS?

37 Gridded MOS
ISST Whitepaper (2003)
To provide model bias correction of grids
High-resolution guidance
Improve centralized guidance

38

39 Gridded MOS
Produced at 12Z and 00Z
5-km grid (CONUS) **
Initial development west of 110 degrees
Max T, Min T
3-hr Temp, Dew Point, & RH
12-hr PoP (6-hr PoP)
WRH will create IFPS SmartInits

40 Future Gridded MOS
More variables
–Wind direction, wind speed, 6-hr Prob. Thunder, 12-hr Prob. Thunder, Snowfall, Sky
–Anticipated full deployment of grids Fall 2006
Full CONUS (Summer 2006)
Alaska (Fall 2007)
Hawaii (Fall 2008)

41 Gridded MOS Assessment
ISST will conduct an assessment of gridded MOS
Will start with Medford, then the Western Region, and finally the entire CONUS (OCONUS)
Information on gridded MOS: http://www.nws.noaa.gov/mdl/synop/gmos.html

42 Day 4-7 Assessment

43 Day 4-7 Guidance Assessment
The increasing number of sources (including derived sources and ensembles) of days 4-7 guidance necessitates an objective evaluation of these sources in order to:
–Foster effective decision making surrounding a proven starting point for the extended portion of the gridded forecast (which could be fluid)
–Improve downscaling techniques (e.g., MatchGuidance, Regime PRISM)
–Improve HPC gridded guidance (e.g., alternate first guess)
–Facilitate bias-corrected output
–Facilitate assessment of new methods or models before implementation

44 Day 4-7 Guidance Assessment
Suggested procedure
–Start soon (software, tools, and training requirements can be developed/defined)
–Alternative verification schemes should be used to effectively account for geographical timeliness constraints and varying field practice (i.e., when day 4-7 grids are prepared versus which guidance is available)

45 Day 4-7 Guidance Assessment
Suggested procedure (continued)
–Guidance to be evaluated initially: HPC points, HPC grids, GFSX grids, GFSX MOS, DGEX grids
–Duration: 1 year with quarterly reports (possibly ongoing)
–Verification of grid sources will be completed at non-traditional verification points to capture skill at non-MOS forecast locations (a metric sketch follows)
–Conducted with grids when the RTMA is available
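
The metric suite was still to be defined; this hedged sketch shows the sort of point-wise scores (bias, MAE, RMSE) the assessment could compute for each guidance source against a verifying analysis, with all names and shapes illustrative.

```python
import numpy as np

def score_guidance(guidance, analysis, points):
    # Objective metrics for one guidance source at verification points.
    #   guidance, analysis : 2-D grids valid at the same day-4-7 time
    #   points             : (row, col) verification locations, which need
    #                        not be traditional MOS stations
    g = np.array([guidance[j, i] for j, i in points])
    a = np.array([analysis[j, i] for j, i in points])
    err = g - a
    return {"bias": err.mean(),
            "mae": np.abs(err).mean(),
            "rmse": np.sqrt((err ** 2).mean())}

# Run once per source (HPC grids, GFSX MOS, DGEX, ...) and lead time,
# then aggregate the scores into the quarterly report.
```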

46 Day 4-7 Guidance Assessment
The assessment could be expanded to include other sources such as gridded MOS, ensemble MOS, and other modeling-center grids (e.g., CMC, ECMWF, UKMET, etc.)
–Sources used by HPC for extended guidance production

47 Digital Forecast Process

48 Digital Forecast Process Field Evaluation
Well over a year ago, the ISST set out to solicit field input on key issues related to the DFP
We drafted a one-page document highlighting what we felt were the key issues
Established an on-line forum for discussion (helpful, but limited input)
Developed three overarching questions based on the original document and on-line forum input
~1 year ago, created regional teams of SOOs and forecasters (initiated by Dave Sharp – SR ISST member at the time)
Either one team per question, or in some cases one team addressed all three questions

49 Digital Forecast Process Field Evaluation
The three questions:
–“Within the limits of predictability, what are the optimal spatial and temporal resolutions needed to provide a useful and versatile digital service while maintaining scientific validity?”
–“What is the best way to minimize discrepancies and produce a near-seamless NDFD without sacrificing accuracy and consistency?”
–“How should each NCEP center support the WFOs’ contribution to the digital forecast process?”

50 Digital Forecast Process Field Evaluation
Input from the regional teams has now been compiled and summarized, and highlights are presented here
Over the next several months, the ISST will use the regional team input, add our own input to the questions, and develop a series of documents (one addressing each question) giving greater definition to the forecast process within the gridded paradigm
–Findings from the CONOPS Tiger Team will certainly have an influence
–These documents will hopefully be an important resource as various D.S. teams forge ahead

51 Digital Forecast Process Field Evaluation
Input from the regional teams was impressively thoughtful and thorough
These ideas and suggestions will weigh heavily in the final ISST documents, and we hope they will ultimately be considered seriously by the groups making decisions on future directions for D.S.
We thank all the members of these teams!

52 Digital Forecast Process Field Evaluation Question 1: “Within the limits of predictability, what are the optimal spatial and temporal resolutions needed to provide a useful and versatile digital service while maintaining scientific validity?”

53 Question 1 (continued)
Several teams argued for the need to give customers both high-resolution (spatial and temporal) deterministic grids (the LDFD concept) and coarser (in space and time) NDFD grids, with at least two teams suggesting that these resolutions change with increasing forecast hour (though AR and WR argued for spatial resolution that stays consistent with time, to depict the impacts of complex terrain)

54 Question 1 (continued)
The ER team made a compelling (very thorough and well-researched) argument for changing the culture from a product-centric (deterministic) one to a probabilistic system supporting decision-assistance data, with a sophisticated high-resolution EPS serving as the guidance backbone
Other teams supported introducing uncertainty grids, especially beyond 72 hrs
The idea of probability-of-exceedance thresholds for high-impact events, even in the short term, was also expressed (a sketch follows)
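
As a concrete illustration of a probability-of-exceedance grid, here is a minimal sketch assuming an ensemble of gridded members; the EPS, variable, and threshold are assumptions chosen only to show the computation.

```python
import numpy as np

def prob_exceed(members, threshold):
    # Probability-of-exceedance grid from an ensemble.
    #   members   : array of shape (n_members, ny, nx), e.g. EPS QPF grids
    #   threshold : high-impact event threshold in the members' units
    return (np.asarray(members) >= threshold).mean(axis=0)

# e.g., the chance of >= 1 inch of precipitation from a 20-member ensemble:
# pop_1in = prob_exceed(qpf_members, 1.0)
```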

55 Question 1 (continued)
Optimal spatial resolution summary
–2/5 teams felt that 5 km was optimal given current tools and the state of the applied science (and input grids should never be coarser than output grids)
–3/5 teams felt that there was either no limit, or that resolutions <5 km could be used in a separate ‘native WFO grid’ (i.e., different from the NDFD; perhaps a formalized LDFD) to resolve terrain and/or mesoscale processes vital to expressing any deterministic portion of the forecast
Optimal temporal resolution summary
–Of the teams that provided specific answers, the following conclusions were drawn:
–0-6 h: min -> hourly
–0-24 h: hourly
–25-72 h: 3-hourly
–73-184 h: 6 or 12 h

56 Digital Forecast Process Field Evaluation Question 2: “What is the best way to minimize discrepancies and produce a near-seamless NDFD without sacrificing accuracy and consistency?”

57 Question 2 (continued)
–Several common themes emerged, which will be the focus of this brief summary
–Many other good ideas expressed by only one team are also worth consideration; they are included in our written summary and will be addressed in the final reports

58 Question 2 (continued)
–Better leadership at all levels – DSPO/DS-PAC critical in providing a consistent, authoritative message about the importance of accuracy and seamlessness
–Standardization of methodologies and tools (including the starting point) – not easy, since some local flexibility is still needed
–Improved communication – more sub-regional workshops, inter-office team building, forecaster exchanges?

59 Question 2 (continued)
–Improved collaboration – graphical sharing, conference calls, video conferencing, earlier/better collaboration on the large scale before grid editing
–LDFD/RDFD/NDFD concept – robust smoothing required to go upscale, or “drilling” down
–Gridded verification!!! – must include real-time feedback; better accuracy will help result in better-collaborated grids; RTMA/AoR critical here

60 Question 2 (continued)
–Use of ISC grids – exchange more frequently, use ISC mode appropriately
–Internal/external grid integrity – enforce some consistency checks; better define collaboration thresholds (science based)
–Forecast accuracy – back to the need for gridded verification; emphasis on accuracy as a way of improving collaboration; accuracy incentives (in addition to collaboration “smiley faces”)

61 Question 2 (continued)
–Centrally provided meteorological fields, from which sensible weather elements are locally derived and adjusted – requires more thought to flesh out a specific recommendation; related to DFP Question 3

62 Digital Forecast Process Field Evaluation Question 3: “How should each NCEP center support the WFOs’ contribution to the digital forecast process?”

63 Question 3 (continued)
Common themes:
–All guidance in gridded format, matching NDFD resolution if possible
–Probabilistic grids for certain parameters
–More ensemble data, and in gridded format

64 Question 3 (continued)
Common requirements of each center:
–CPC: climatology and standard deviation grids
–EMC: AoR, model output at native resolution, local modeling support
–HPC: Day 4-7 grids
–OPC: offshore grids
–SPC: products in gridded format
–TPC: maintain TCM wind grids

65 Digital Forecast Process Field Evaluation Questions/Comments about the summary so far, or our process and plans for the DFP material?

66 Open Discussion

