Project Phoenix – An Update
EuMetCAL Workshop, Toulouse, August 2008
Jaymie Gadal


Phoenix – The Experiment
The original Project Phoenix was an experiment:
– To determine whether the human forecaster remained relevant in the forecast process
– To determine whether the forecaster retained a usable measure of skill without NWP
– To assess the forecaster's areas of strength and weakness
– To get an idea of what the role of the forecaster should be

Phoenix Experiment – Methodology
Forecasters were put in a parallel ‘office’ – access to NWP was denied
Forecasts were generated without NWP, but using the same preparation tool, Scribe 3
Outputs: verification results for the automatic, official, and Phoenix team forecasts were compared using a very intensive scoring system
Successes and failures were analyzed in real time, and as trends over time

Phoenix Experiment – Results
The forecaster still had a significant ability to forecast better than the models in many circumstances – but in which ones?
Previous attempts to determine where the forecaster should concentrate were found to be simplistic
The forecaster's greatest strengths stem from analysis, diagnosis, and prognosis (ADP) and from situational awareness (SA)

A Few Particular & Interesting Results
When forecasters were ‘right’, they tended to ‘hedge’
There was a whole lot of tweaking going on, very little of it successful
Junior forecasters often performed better than experienced ones – why?
Some parameters/situations are best left alone
Some model weaknesses were identified

Phoenix – Conclusions
The forecaster needs to know, on a parameter-by-parameter basis, when to intervene
Operational routines must stress:
– ADP prior to consultation of the models
– Constant situational awareness
Real-time verification is more critical than previously thought
The optimum human/machine mix remains unclear – there is no clear division of roles

Notes Regarding Severe Weather & Warning Situations
The role of the forecaster in times of extreme weather remains more traditional:
– NWP continues to fail to capture extreme events – forecaster intervention is often required
– Forecaster interpretation of NWP results is strongly required
– ADP is even more critical
– SA and rapid response remain essential

Role of the Forecaster
Choosing the best model?
Short range only?
High-impact weather only?
Interpretation of models?
Consultation?
Integration of ensemble prediction results?
Tuning of automatic forecasting algorithms?

Phoenix – The Training Program
The experiment was converted into a training simulator
Combatting “meteorological cancer”: forecasters were employing an operational routine which effectively bypassed their strengths
A return to ADP/SA results in improved forecaster performance

Steps Taken
2007/2008 – all MSC forecasters were given a week of Phoenix simulator training
Phoenix simulator training is given to all new forecasters
Offices continue to give staff Phoenix simulator training periodically
The simulator has been extended into real time by automation of the scoring system

Phoenix Scoring System
Error score: essentially forecast minus observed
Error scores are normalized between parameters
Scores for different time periods, parameters, and locations can be weighted according to relative importance
Relative importance is determined from user surveys
The score parallels the forecaster's intention (it is not a ‘proper’ score)
Scores are rolled up for summary and can be drilled down to discover ‘root causes’ (a sketch of this roll-up follows below)
Output is available in XML for further analysis
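To make the aggregation concrete, here is a minimal sketch in Python of how a normalized, importance-weighted error score of this kind might be computed. The parameter names, normalization constants, and weights are illustrative assumptions, not the operational MSC values.

```python
from dataclasses import dataclass

# Typical error magnitude per parameter, used to normalize errors in
# different units onto one comparable scale. These constants, like the
# parameter names, are assumptions for illustration only.
TYPICAL_ERROR = {"temperature_C": 2.0, "wind_speed_kt": 5.0, "pop_percent": 20.0}

# Relative importance per parameter, as might be derived from user surveys.
WEIGHT = {"temperature_C": 1.0, "wind_speed_kt": 0.7, "pop_percent": 0.8}

@dataclass
class ForecastCase:
    parameter: str   # e.g. "temperature_C"
    station: str     # station identifier
    period: str      # forecast period, e.g. "day1_am"
    forecast: float
    observed: float

def error_score(case: ForecastCase) -> float:
    """Raw error, essentially forecast minus observed, normalized so that
    scores for different parameters can be compared and combined."""
    return abs(case.forecast - case.observed) / TYPICAL_ERROR[case.parameter]

def rolled_up_score(cases: list[ForecastCase]) -> float:
    """Importance-weighted summary score; keeping the per-case terms
    allows drilling back down to individual 'root causes'."""
    total_weight = sum(WEIGHT[c.parameter] for c in cases)
    if total_weight == 0:
        return 0.0
    return sum(WEIGHT[c.parameter] * error_score(c) for c in cases) / total_weight
```

Because the weights come from user relative-importance surveys rather than from a strictly proper scoring rule, a score built this way mirrors the forecaster's intention, as the slide notes.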

Example Output Scores
23_SITES_AM_ISSUE_scores_score.xls
Phoenix Monthly Report.xls

Automated Phoenix Scoring
Output is generated in near-real time for two dozen stations daily
User interface for generating scores for any given forecast (see the filter sketch below)
Possibility of doing studies that are seasonal, situation-dependent, or individualized
Option of configuring scores for different users
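Such studies imply that the score archive can be sliced by station, period, or parameter; below is a hedged sketch of that kind of filter, reusing the hypothetical ForecastCase records and rolled_up_score function from the earlier sketch.

```python
from typing import Optional

def score_subset(cases: list[ForecastCase],
                 station: Optional[str] = None,
                 period: Optional[str] = None,
                 parameter: Optional[str] = None) -> float:
    """Roll up the score for one slice of the archive, supporting
    seasonal, situation-dependent, or individualized studies."""
    selected = [c for c in cases
                if (station is None or c.station == station)
                and (period is None or c.period == period)
                and (parameter is None or c.parameter == parameter)]
    return rolled_up_score(selected)

# Example: morning temperature scores for one (hypothetical) station.
# score_subset(archive, station="STATION_A", period="day1_am",
#              parameter="temperature_C")
```

Configuring scores for different users could then amount to swapping in a different WEIGHT table per user group, which matches the configuration option on the slide.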

Other Uses of Phoenix
Simulations have been run using the methodology to investigate severe weather forecasting, aviation, extended range, and marine forecasting
Results and conclusions were generally the same, with different levels of intervention required
Other types of simulators: research, validation

What’s Next
Completion of the automation of the full range of parameters; expansion to other forecasts
Development of a Weather Event Simulator capability using NinJo, with Phoenix for evaluation
Evaluation of training
Use in QA for ISO certification
Extension to more parameters and forecast types in real time
More comprehensive use in operations management, identification of needed research, and assessment of training needs