© Crown copyright Met Office Forecasting Icing for Aviation: Some thoughts for discussion. Cyril Morcrette. Presented remotely to the Technical Infrastructure Meeting in Washington DC, 26 February 2015, from the Met Office, Exeter, United Kingdom.

© Crown copyright Met Office Current Icing Algorithm used by “WAFC-London” (i.e. used by the Met Office in the UK). Icing index = relative humidity if −20 °C < T < 0 °C and cloud is present.
Pros: Simple. Captures the broad atmospheric conditions we are interested in.
Cons: Simple. Does not make use of: the value of the cloud fraction, the liquid water content, or information about local cloud variability or microphysics.
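A minimal sketch of this style of diagnostic, assuming gridded arrays of relative humidity, temperature and cloud fraction; the function name, the field units and the 0.05 cloud-fraction cut-off are illustrative assumptions, not the operational WAFC-London code.

```python
import numpy as np

def simple_icing_index(rh, temp_c, cloud_frac, cloud_threshold=0.05):
    """Illustrative icing index: relative humidity where -20 degC < T < 0 degC and cloud is present.

    rh         : relative humidity (%), array of any shape
    temp_c     : temperature (deg C), same shape as rh
    cloud_frac : cloud fraction (0-1), same shape as rh
    cloud_threshold : assumed cut-off for "cloud is present" (illustrative)
    """
    in_icing_band = (temp_c > -20.0) & (temp_c < 0.0)
    cloud_present = cloud_frac > cloud_threshold
    # The index equals RH inside the temperature band where cloud is present, zero elsewhere.
    return np.where(in_icing_band & cloud_present, rh, 0.0)

# Example on a single (made-up) profile:
rh = np.array([90.0, 85.0, 70.0, 40.0])
temp_c = np.array([2.0, -5.0, -15.0, -30.0])
cloud_frac = np.array([0.8, 0.6, 0.0, 0.3])
print(simple_icing_index(rh, temp_c, cloud_frac))   # -> [ 0. 85.  0.  0.]
```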

© Crown copyright Met Office As an atmospheric model developer, I have a good idea of the quality of information that a numerical weather prediction (NWP) model can produce, and I know about the level of complexity assumed in the representation of physical processes in the atmosphere. But I don’t know very much about what the end-user of the icing forecast really needs. What would they really want? Our forecast model can provide a lot of information, but what information would be most useful from the point of view of aviation icing and its forecasting?

© Crown copyright Met Office A very simple schematic: one axis is knowledge about the capabilities of a state-of-the-art numerical weather prediction model, the other is knowledge about how icing affects flight, with the NWP model developer and the pilot marked on the diagram.

© Crown copyright Met Office On the same schematic: is there a communication gap between the NWP developer and the pilot, and vice versa? Is the dialogue along the red line, going through a point where the icing product is a poor reflection of weather prediction capabilities and a poor reflection of what the pilots need?

© Crown copyright Met Office On the same schematic: is the dialogue along the red line, through a point where the icing product is a poor reflection of weather prediction capabilities and a poor reflection of what the pilots need? Or could the dialogue be more along the green diagonal, ensuring the icing product is the best possible reflection of what the pilots need and the best possible reflection of current NWP capabilities?

© Crown copyright Met Office Risk, Avoidance and Verification. If we forecast an icing risk, presumably the area is avoided by aviation. [For discussion] Hence aircraft are less likely to encounter the threat and report it, thereby validating the forecast. They are also less likely to fly through it and say “no, there was nothing there, you were wrong”.

© Crown copyright Met Office Risk, Avoidance and Verification. If we forecast an icing risk, presumably the area is avoided by aviation. [For discussion] Hence aircraft are less likely to encounter the threat and report it, thereby validating the forecast. They are also less likely to fly through it and say “no, there was nothing there, you were wrong”. So, if we were over-predicting icing risk, would we know? This creates a conundrum for verification. How do you calculate a skill score for something that you predict is a danger, if the danger is then avoided and hence not encountered as often as it might be? What can statisticians tell us about how to verify this kind of scenario? (There must be analogues in other fields, e.g. medicine/epidemics, aviation turbulence, tornadoes, weather-related road accidents...)
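Not from the talk, but one way to make the conundrum concrete: a toy Monte Carlo sketch with entirely made-up probabilities for avoidance and for pilot reporting, showing how the false-alarm ratio estimated from the reports that actually arrive can look far better than the true false-alarm ratio of a deliberately over-confident forecast.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000                                     # forecast grid boxes (illustrative)

icing_occurs = rng.random(n) < 0.05             # assumed true base rate of icing
# A deliberately over-confident forecast: every real event plus many false alarms.
icing_forecast = icing_occurs | (rng.random(n) < 0.25)

# Aircraft avoid most boxes where icing is forecast...
flown = rng.random(n) < np.where(icing_forecast, 0.05, 0.80)
# ...and a "no icing encountered" report is rarely filed, whereas an encounter usually is.
reported = flown & (rng.random(n) < np.where(icing_occurs, 0.90, 0.05))

def false_alarm_ratio(fcst, obs):
    false_alarms = np.sum(fcst & ~obs)
    hits = np.sum(fcst & obs)
    return false_alarms / (hits + false_alarms)

print("True FAR over all boxes:            %.2f" % false_alarm_ratio(icing_forecast, icing_occurs))
print("FAR seen in the reports we receive: %.2f" % false_alarm_ratio(icing_forecast[reported], icing_occurs[reported]))
print("Reports inside forecast areas: %d of %d forecast boxes"
      % (np.sum(reported & icing_forecast), np.sum(icing_forecast)))
```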

© Crown copyright Met Office When is one icing prediction method “better” than another? Imagine we have an existing icing-risk prediction system. Now we develop a new method of predicting icing risk and we want to make it operational. For this to happen, we need to show that the new method is “better”. So, which one is “better”? Clearly, the “better” one has smaller errors! But...

© Crown copyright Met Office There are different TYPES of icing prediction errors.

© Crown copyright Met Office There are different TYPES of icing prediction errors. One type of error concerns the “severity” of the icing: this is strongly related to aircraft type, de-icing technology and fuselage temperature (or flight history). These effects can be called “aeronautical” rather than “meteorological”.

© Crown copyright Met Office There are different TYPES of icing prediction errors. Another type concerns the location and timing of the icing: this is strongly affected by errors in the prediction of the large-scale weather pattern, and so is linked to errors in the NWP model. Changes to the icing prediction method are unlikely to lead to an improvement in the icing index if a front is in the wrong place or arrives earlier than predicted. So is comparing “hit-rate” or “ETS” that helpful when developing ideas for a new icing prediction?
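For reference, a minimal sketch (not from the talk) of the two scores mentioned, computed from the 2x2 forecast/observation contingency table; the function name and the example arrays are illustrative.

```python
import numpy as np

def contingency_scores(forecast, observed):
    """Hit rate and Equitable Threat Score from boolean forecast/observation arrays."""
    forecast = np.asarray(forecast, dtype=bool)
    observed = np.asarray(observed, dtype=bool)
    hits = np.sum(forecast & observed)
    misses = np.sum(~forecast & observed)
    false_alarms = np.sum(forecast & ~observed)
    correct_negatives = np.sum(~forecast & ~observed)
    n = hits + misses + false_alarms + correct_negatives

    hit_rate = hits / (hits + misses)
    # Hits expected by chance, given the forecast and observed frequencies.
    hits_random = (hits + false_alarms) * (hits + misses) / n
    ets = (hits - hits_random) / (hits + misses + false_alarms - hits_random)
    return hit_rate, ets

# Illustrative example: a forecast displaced by one point relative to the observations.
obs  = np.array([0, 0, 1, 1, 1, 0, 0, 0], dtype=bool)
fcst = np.array([0, 0, 0, 1, 1, 1, 0, 0], dtype=bool)
print(contingency_scores(fcst, obs))   # hit rate ~0.67, ETS ~0.30
```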

© Crown copyright Met Office There are different TYPES of icing prediction errors. A third type concerns the frequency of occurrence of icing: this will reflect biases in the way the icing index is formulated, the inputs it uses and the science it is built upon. One way of assessing biases in the frequency of occurrence is to look at climatologies. So (while testing new ideas) don’t assess an icing prediction in terms of “did you predict icing at 08Z over IAD on 26 Feb 2015?”. Instead, given a long rerun of an NWP model (quite cheap to do, really), ask how frequently different icing indices predict icing over IAD in each month, and compare that to long-term statistics.
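A minimal sketch of that kind of frequency-of-occurrence check, assuming one already has, for a single location, a long model rerun giving a time series of icing-index values, plus an observed monthly climatology to compare against; the 70% threshold, the data and the variable names are all illustrative assumptions.

```python
import numpy as np
import pandas as pd

def monthly_icing_frequency(dates, icing_index, threshold=70.0):
    """Fraction of forecast times in each calendar month with icing index above a threshold."""
    s = pd.Series(icing_index, index=pd.DatetimeIndex(dates))
    return (s > threshold).groupby(s.index.month).mean()

# Illustrative inputs: 3-hourly values over several years at one aerodrome (made up).
dates = pd.date_range("2010-01-01", "2014-12-31 21:00", freq="3h")
icing_index = np.random.default_rng(1).uniform(0.0, 100.0, size=len(dates))

model_climatology = monthly_icing_frequency(dates, icing_index)
observed_climatology = pd.Series(0.12, index=range(1, 13))   # placeholder observed frequencies

# A simple monthly bias: positive values mean the index predicts icing too often.
print((model_climatology - observed_climatology).round(3))
```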

© Crown copyright Met Office There are different TYPES of icing prediction errors. Any icing-risk prediction will have some errors associated with it, and they will be a COMBINATION of all three of these types. It is useful to assess the different types of error separately when developing potential new icing products.

© Crown copyright Met Office Making progress.
1. We would welcome increased dialogue between interested parties.
2. We feel evaluating frequency of occurrence is a useful and important first step, while new indices are “tuned”.
3. Case studies and skill scores are still useful, but AFTER (2).
4. We have ideas for how to improve the current WAFC-London icing algorithm, but...
5. We would benefit from an observational data-set to use to validate our new ideas...
6. Does anyone have such a data-set?
7. Ideally something quality-controlled, peer-reviewed and accepted by the community as a fair record of the observational occurrence of icing.

© Crown copyright Met Office Questions