Validation of PGE 10 Automatic Satellite Image Interpretation

Validation of PGE 10 Automatic Satellite Image Interpretation
Andreas Wirth, ZAMG Vienna, AUSTRIA
PAR Workshop, 17 – 19 October 2005, Madrid

Synopsis
- Validation Dataset
- Validation Rules
- Evaluation of ASII
- Evaluation of ASIINWP

Evaluation of PGE10: Validation Dataset
Evaluation phase from 16th February to 30th September 2005:
- winter period (16th Feb. – 15th April 2005)
- summer period (18th April – 30th September 2005)
Separate evaluation of ASII and ASIINWP.
Evaluation of PGE10 against the SATREP twice a day (06 UTC and 12 UTC).

Validation Rules
The manually generated SATREP served as reference for both PGE10 outputs. Conceptual Models (CMs) from the SATREP were compared to the analysed areas in PGE10.
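A minimal sketch of this comparison step is given below. The array names, the pixel-overlap criterion and the use of NumPy are assumptions for illustration only; the slides do not specify the exact matching procedure.

import numpy as np

# Illustrative only: fraction of a SATREP CM area that PGE10 assigns to the
# same (re-grouped) class, expressed as a percentage of the SATREP area.
def percentage_correct(satrep_mask: np.ndarray, pge10_mask: np.ndarray) -> float:
    satrep_pixels = satrep_mask.sum()
    if satrep_pixels == 0:
        return 0.0
    overlap = np.logical_and(satrep_mask, pge10_mask).sum()
    return float(100.0 * overlap / satrep_pixels)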

Comparison of PGE10 with SATREP

Evaluation of PGE10
The four ratings used to describe the quality of the PGE10 output:
- Rating 1: correct (> 66% correctly analysed)
- Rating 2: partly correct (50 – 66% correctly analysed)
- Rating 3: not correct (0 – 50% correctly analysed)
- Rating 4: hardly/not analysed
An additional quantification of the occurrence of incorrect CM analyses was done with the following adjectives:
- Mostly: > 66%
- Many: 33 – 66%
- Some: 5 – 33%
- Few: < 5%
- mixed: 2 CMs with 50% each
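Read as code, the two scales above are simple threshold look-ups. The sketch below only transcribes the table; the function names and the handling of the boundary values are assumptions, not part of the validation procedure itself.

def rating(pct_correct, analysed=True):
    # Ratings 1-4 from the table above; the "hardly/not analysed" case is
    # passed in explicitly because it carries no percentage range.
    if not analysed:
        return 4  # hardly/not analysed
    if pct_correct > 66:
        return 1  # correct
    if pct_correct >= 50:
        return 2  # partly correct
    return 3      # not correct

def adjective(pct_incorrect):
    # Quantifies how often a CM is incorrectly analysed.
    if pct_incorrect > 66:
        return "mostly"
    if pct_incorrect >= 33:
        return "many"
    if pct_incorrect >= 5:
        return "some"
    return "few"  # "mixed" is the special case of 2 CMs at 50% each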

Re-grouping of frontal categories (PGE10 analysis: assigned SATREP analyses)
- Cold Front: Classical CF, CF in CA, CF in WA, Arctic CF, Split Front, BB Occlusion, CAD
- Warm Front: WF band, WF shield, Detached WF
- Occlusion: Classical Occlusion, Occl. CCB, Occl. 2nd Low, Instant Occl.
Table 2: Assignment of frontal SATREP categories to PGE10 classifications
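When scoring the frontal CMs automatically, Table 2 can be kept as a small lookup structure; the sketch below merely transcribes the table (variable names are illustrative).

# Table 2 as a lookup: broad PGE10 class -> SATREP frontal categories
FRONTAL_REGROUPING = {
    "Cold Front": ["Classical CF", "CF in CA", "CF in WA", "Arctic CF",
                   "Split Front", "BB Occlusion", "CAD"],
    "Warm Front": ["WF band", "WF shield", "Detached WF"],
    "Occlusion":  ["Classical Occlusion", "Occl. CCB", "Occl. 2nd Low",
                   "Instant Occl."],
}

# Inverted view: SATREP category -> PGE10 class
SATREP_TO_PGE10 = {sat: pge for pge, cats in FRONTAL_REGROUPING.items() for sat in cats}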

ASII: Overall performance
- All WF: 50.5%
- All CF: 49%
- MCS: 46.5%
- EC: 40.5%
- CB Cluster: 40%
- All Occl.: 39.5%
- ECAC: 37.5%
- Comma: 27%
- Frontal Waves: 25.5%
- Lee Clouds: 21%
- CCC: 16%
- CAC: 15%
- Jet Fibres: 9.5%

Evaluation of ASII: Frontal Categories
Cold fronts:
- Almost 50% of all CF from SATREP were detected
- 10% of the SATREP CF were only 1/2 to 2/3 detected
- Slightly better performance in the summer period
- 15% of the SATREP CF are not detected by ASII in the IR image (warm cloud tops)
- CF in CA performs better than average (58%)

Evaluation of ASII: Frontal Categories
Warm fronts:
- 50% of the SATREP WF are analysed as WF in ASII
- No seasonal or daily trend
- WF shields are better detected than WF bands or detached WF
- 5.5% of the SATREP WF are not detected in ASII
- Misclassified WF are mostly analysed as CF due to their band-like structure

Evaluation of ASII: Frontal Categories
Occlusions:
- Detection rate of 40% for all Occlusion types
- Weak seasonal trend: better detection rate in the summer period
- As for the CF, 10% of the SATREP Occl. are detected at least half but not fully
- The most common misclassification is CF, with 15% of all cases

Evaluation of ASII: Frontal Waves
- The overall performance for frontal waves lies at 25%
- Waves are better recognised in the summer period than in winter (better frontal delineation in the summer)

Evaluation of ASII: Enhanced Cumuli
Enhanced Cumuli (EC):
- ASII detects about 40% of the SATREP EC
- Slightly better performance in the summer period
- Approx. 41% of the SATREP EC are misclassified:
  - in 10% Comma is analysed
  - in 3% CB Cluster
  - in 5% cold air cloudiness
  - in 17% the region is considered as frontal
- 14% of the SATREP EC are not detected in ASII

Evaluation of ASIINWP: General Remarks
With the inclusion of NWP data (ECMWF), the PGE10 output is slightly modified, although the satellite-based part of the algorithm is very similar.
Some CMs cannot be retrieved without NWP data:
- the Upper Level Low
- FI by Jet
- the Upper Wave

ASIINWP: Overall performance
- All CF: 58.5%
- All WF: 47.5%
- FI by Jet: 47.5%
- MCS: 45.5%
- ULL: 44.5%
- All Occl.: 40.5%
- ECAC: 38.5%
- CB Cluster: 37%
- EC: 30%
- Comma: 26.5%
- Lee Clouds: 21.5%
- Upper Waves: 17.5%
- CAC, CCC: 15.5%
- Frontal Waves: 12%
- Jet Fibre: 8.5%
- OCC: 7%

Evaluation of ASIINWP: Upper Level Low
- Detection rate of 44.5%
- Slightly better performance in the winter period
- 10% of the proposed analyses are Occlusions
- In 13.5% of SATREP ULL cases, either Comma, EC or cold air cloudiness is analysed

Evaluation of ASIINWP: the Upper Wave
Rare phenomenon in SATREP
- Detection rate of 18%; 8% of the SATREP Upper Waves are recognised at least half but not fully
- If ASIINWP does not analyse the UW, the frontal tag is displayed instead (CF, WF or Occl.)

Evaluation of ASIINWP: CB Cluster, MCS
CB Cluster and Mesoscale Convective Systems:
- Detection rate of 45.5% for MCS and 37% for CB Cluster
- Slightly better performance in the summer period for MCS, strong increase in performance for CB
- No daily trend for CB and MCS detection
- Interaction with similar CMs (CB, MCS, EC, cold air cloudiness)

Final Remarks
An evaluation "the other way round" has not been made: it would have been useful to analyse the synoptic relevance of features detected in PGE10 but not in the manually generated SATREP.
The Dry Intrusion (DI) has not been validated since there is no corresponding CM in the SATREP.