Working group III report: Post-processing

Members participating: Arturo Quintanar, John Pace, John Gotway, Huiling Yuan, Paul Schultz, Evan Kuchera, Barb Brown, Keith Brill, Kathy Gilbert, Adrian Raftery, Mike Sestak, Zoltan Toth

Ensemble prediction + post-processing: a necessary marriage
In the past, post-processing was an afterthought in end-to-end design. There is an opportunity to change this with SREF:
– Need to plan supercomputer resources to conduct adequate reforecasts.
– Need to prep the observational / analysis data sets.
– Need to provide codes.
– Need modelers and statisticians to work together.

Ensemble post-processing proposal (already proposed within NOAA)
1. Requirements definition: define the problems to work on based on user feedback (e.g., post-process for icing), the format for output, and the rules for evaluation.
2. Collect data sets (reforecasts, obs, analyses) and existing algorithms to be used in testing and developing post-processing algorithms.
3. Develop algorithms (ESRL, MDL, EMC, academia, etc.).
4. DTC to verify, synthesize, and recommend candidates for tech transition.
5. MDL and EMC implement.
6. Monitor (an independent organization like DTC?).

Some general considerations
The SREF change management process needs to change. How do we allow non-NCEP personnel to weigh in on the configuration? How do we ensure that post-processing concerns are reflected in the overall SREF system design rather than being an afterthought?
– Multi-model may be preferable when considering raw ensemble guidance as an end in itself, but less desirable when post-processing is part of the equation: more models mean more reforecasts are needed.
– Changing SREF member characteristics is good for raw forecasts but complicates the post-processing. Need a standard method for handling this, and agreement to have training data sets in place before model changes.
The role of human involvement in modifying post-processed guidance is still up for debate.
– An operational proving ground in NOAA offers a way to sort this out.
– Roles may change with time; inheriting the deterministic forecaster's role would be a tough transition.
– The forecaster role may come to emphasize high-impact events.
Challenge: post-processed guidance is needed in regions with no observations. Existing techniques were developed for data-rich environments; the crucial problems are in data-poor ones.
Many users need access to the raw data, not just post-processed output, so that different users can massage the data to their own needs.

(1) Requirements definition
Need to understand carefully how users will interact with the data and what format allows them to make the best decisions.
– Expression of output info (PDFs, bias-corrected members); many users, many desired formats (see the sketch after this list).
– A good point at which to collaborate with social scientists.
Decide whether training and evaluation will be done with observations and/or analyses. Perhaps both, but the end product will include gridded output, so it is important to evaluate its skill in a gridded context.
– Obs need to be gathered from many new sources; e.g., NCAR has been collecting aircraft data.
Requirements will specify that inconsistencies among variables (e.g., temperature/dewpoint) must be handled. Preferably, the software should work well for rare as well as common events and should not require radical changes with changing model configurations.
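
To illustrate the "many desired formats" point, here is a minimal Python sketch of two expressions of the same ensemble output: bias-corrected members and a fitted Gaussian predictive PDF. The additive bias correction, the Gaussian fit, and all names and numbers are illustrative assumptions, not any agency's operational method.

    import numpy as np
    from scipy.stats import norm

    def bias_corrected_members(ens, train_fcst_mean, train_obs_mean):
        """Remove the mean additive bias estimated from a training period."""
        return ens - (train_fcst_mean - train_obs_mean)

    def gaussian_pdf_params(members):
        """Summarize the corrected ensemble as a Gaussian predictive PDF."""
        return members.mean(), members.std(ddof=1)

    # Hypothetical 2-m temperature ensemble (K) at one point
    ens = np.array([271.2, 272.8, 270.9, 273.5, 272.1])
    members = bias_corrected_members(ens, train_fcst_mean=272.0, train_obs_mean=271.4)
    mu, sigma = gaussian_pdf_params(members)
    print("bias-corrected members:", members)
    print("P(T < 273.15 K) = %.2f" % norm.cdf(273.15, loc=mu, scale=sigma))

The same underlying ensemble thus serves both users who want members they can feed into their own applications and users who want ready-made probabilities.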

(2) Collect data sets (reforecasts, obs, analyses) and existing algorithms
Need a common data set for inter-comparing post-processing methods.
Need reforecasts:
– The length of the training data set will depend on which variable is considered.
– Unfortunately, for parallel post-processing and SREF system development, the post-processing must be done with an earlier version of the model than the one that will be operational several years hence.
Need to determine which obs data sets to use, and what new ones we need to develop, in order to post-process the variables of interest.
Will collect existing codes (MDL's regression, NCEP's stepwise bias correction & downscaling, Adrian's BMA, etc.; a BMA sketch follows below).
Reanalyses ("analysis of record") needed.
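
Since BMA is one of the named candidate codes, a minimal sketch of the idea: the predictive distribution is a weighted mixture of kernels centered on the (bias-corrected) members. The weights and spread below are placeholders that would normally be fit by EM on a training set; this is a simplification of the published method, not Raftery's code.

    import numpy as np
    from scipy.stats import norm

    def bma_density(y, members, weights, sigma):
        """BMA predictive density: a mixture of normals, one per member."""
        return sum(w * norm.pdf(y, loc=f, scale=sigma)
                   for w, f in zip(weights, members))

    def bma_prob_exceed(thresh, members, weights, sigma):
        """Probability that the verifying value exceeds a threshold."""
        return sum(w * norm.sf(thresh, loc=f, scale=sigma)
                   for w, f in zip(weights, members))

    # Placeholder values; in practice weights and sigma come from EM fitting
    members = np.array([271.0, 272.5, 273.1])  # bias-corrected forecasts (K)
    weights = np.array([0.5, 0.3, 0.2])        # nonnegative, sum to 1
    print("P(T > 273.15 K) = %.2f" % bma_prob_exceed(273.15, members, weights, sigma=1.2))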

(3) Develop algorithms
Funding provided inside NOAA and to the academic community to develop advanced techniques according to the requirements above, using the data sets above.
There is still a lot to be learned from inter-comparing methods, especially for the unusual and discontinuous variables (see the regression sketch below).
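
For discontinuous variables such as precipitation, one standard family of techniques is regression of event probabilities on ensemble statistics. A minimal logistic-regression sketch follows; the predictors (ensemble mean and spread), the toy training data, and the 1 mm threshold are all hypothetical, and operational regression schemes such as MDL's are considerably more elaborate.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Toy training set: ensemble-mean precip (mm) and ensemble spread as
    # predictors; observed exceedance of 1 mm as the binary predictand.
    X_train = np.array([[0.2, 0.1], [3.0, 1.5], [0.8, 0.4], [5.1, 2.0],
                        [0.0, 0.1], [2.2, 1.1], [0.5, 0.3], [4.0, 1.2]])
    y_train = np.array([0, 1, 0, 1, 0, 1, 0, 1])

    model = LogisticRegression().fit(X_train, y_train)

    # Probability of exceeding 1 mm for a new forecast case
    X_new = np.array([[1.8, 0.9]])
    print("P(precip > 1 mm) = %.2f" % model.predict_proba(X_new)[0, 1])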

(4) DTC to verify, synthesize, and recommend candidates for tech transition
Need standard verification procedures and access to the statistics (a Brier-score sketch follows below).
The algorithms recommended for tech transition might be a blend or refinement of ideas from those submitted.
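
As one example of a standard verification statistic for probabilistic guidance, a minimal Brier score and skill score computation. This is only a sketch with made-up numbers; an established package such as MET computes these and many other scores, with proper stratification and confidence intervals.

    import numpy as np

    def brier_score(prob_fcst, obs_binary):
        """Mean squared error of probability forecasts against 0/1 outcomes."""
        p = np.asarray(prob_fcst, dtype=float)
        o = np.asarray(obs_binary, dtype=float)
        return np.mean((p - o) ** 2)

    def brier_skill_score(prob_fcst, obs_binary):
        """Skill relative to a constant climatological base-rate forecast."""
        o = np.asarray(obs_binary, dtype=float)
        bs = brier_score(prob_fcst, o)
        bs_clim = brier_score(np.full_like(o, o.mean()), o)
        return 1.0 - bs / bs_clim

    # Hypothetical probability-of-precipitation forecasts and outcomes
    p = [0.9, 0.1, 0.7, 0.3, 0.2]
    o = [1, 0, 1, 0, 1]
    print("BS = %.3f, BSS = %.3f" % (brier_score(p, o), brier_skill_score(p, o)))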

(5) MDL and NCEP/EMC implement DTC recommendations and run in real time
This is a labor-intensive step.
– Post-processing software must be robust and maintainable.

(6) Monitor (DTC again?)
Should monitoring be done by an independent organization, or by the producing organization?
– Note: the NextGen program is funding independent verification system development.
– Note: NCEP and MDL already have in-house verification software.
Standard verification statistics should be produced and made readily available to the community.