A New Functionality for Instrument Assessment in OSCAR/Space
CGMS-43 WG III, Item III/3
Jérôme Lafeuille (WMO Space Programme)
World Meteorological Organization (WMO/OMM): working together in weather, climate and water (www.wmo.int)
Talking points
1. Status of OSCAR
2. Instrument assessment in OSCAR/Space
3. The "expert system" approach
4. Expected benefits and next steps
CGMS-43, Boulder, CO, May 2015
Observing System Capability Analysis and Review tool (www.wmo.int/oscar)
OSCAR is a core element of WIGOS. It includes three modules:
- OSCAR/Requirements
- OSCAR/Space
- OSCAR/Surface (being developed by MeteoSwiss)
OSCAR/Space has a large audience:
- 800-1000 visits per day
- Users from many countries
- Space agencies, research, application and training centres
- A reference for reports, application planning, gap analysis, socio-economic benefit studies, frequency management, etc.
Recent developments (being implemented)
- Records individual instruments (not only generic instruments)
- Instrument timeline takes commissioning, status, failures, etc. into account
- More precise gap/availability analysis chart (monthly resolution)
- Links to "landing pages" with calibration information, as agreed within GSICS
2. Instrument assessment in OSCAR/Space
Two kinds of information in OSCAR/Space
Factual information:
- More than 600 satellites
- More than 800 instruments (including ~260 for space weather)
- Regularly updated based on input from space agencies, including the reports to CGMS and ET-SAT
Expert assessments:
- Mapping of instruments to variables with a degree of relevance (rated 1 to 5)
- Mapping to WMO-defined target capabilities (rated 1 to 5)
Mapping instruments to variables
- From a given instrument: which variables can be derived?
- For a given variable: which instruments can be used? What performance can be expected?
This is the basis of the gap analysis.
The instrument-variable mapping can be a complex issue and requires expert assessment:
- Instrument-variable is not a one-to-one relationship
- There are various degrees of relevance
- Different users have different criteria (depending on application requirements)
- Users and providers may have different views
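To make the many-to-many relationship concrete, the mapping can be pictured as a nested lookup table keyed by instrument and variable, queryable in both directions. A minimal Python sketch; the instrument names and relevance scores below are illustrative examples, not actual OSCAR assessments:

```python
# relevance[instrument][variable] = degree of relevance (1 = marginal, 5 = primary).
# Names and scores are illustrative, not taken from OSCAR.
relevance = {
    "AMSR2": {"Sea surface temperature": 4, "Sea ice cover": 5},
    "AVHRR/3": {"Sea surface temperature": 5, "Cloud cover": 4},
}

def instruments_for(variable):
    """For a given variable: which instruments can be used, and how relevant is each?"""
    return sorted(
        ((inst, score) for inst, vars_ in relevance.items()
         if (score := vars_.get(variable))),
        key=lambda pair: -pair[1],
    )

def variables_for(instrument):
    """From a given instrument: which variables can be derived?"""
    return relevance.get(instrument, {})

print(instruments_for("Sea surface temperature"))
# [('AVHRR/3', 5), ('AMSR2', 4)]
```

The same table drives both the instrument view and the variable view shown on the following slides, which is why a single consistent mapping is the basis of the gap analysis.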
Instrument view (screenshot)
Measurement timeline for solar EUV flux (screenshot)
Current instrument-variable mapping principle
Classes of instruments with similar design features, e.g.:
- Spectral bands
- Bandwidth
- Number of channels
- Polarization
- Etc.
(Diagram: each instrument class is linked to a set of variables, e.g. Variable 1 to Variable 5, Variable x, Variable y.)
3. The "expert system" approach
New instrument-variable mapping principle
Instrument design requirements, e.g.:
- Spectral bands
- Bandwidth
- Number of channels
- Polarization
- Scanning mode
- Etc.
(Diagram: instrument characteristics are matched to variables through explicit rules and technical assessment.)
Current class-based approach:
- Instruments are clustered in classes, assuming similar design and performance
- Each class is linked "manually" to variables, based on an implicit scientific rationale
- Heavy to manage when the number of classes is high
- Excellent results for Earth observation instruments (~600 instruments, ~200 classes)
- Impractical for space weather instruments (too diverse)

New expert system approach:
- No assumption made about similar instrument design
- Each instrument is characterized by fully objective features
- Performance assessment is based on explicit expert rules
- Transparency: the rules can be submitted to external review
- Easier scientific maintenance: the rules are independent of the software itself
- No limit on the number or diversity of instruments
Examples of rules

Variable: Sea surface temperature
Instrument type: Microwave radiometer
Conditions: >=2 dual-polarisation frequencies in 4-8 GHz; >=1 multi-polarisation frequency in 8-12 GHz
Relevance: Very good

Variable: Atmospheric temperature
Instrument type: Radio-occultation receiver
Conditions: receiver for >=3 GNSS constellations; has 2 antennas (fore/aft)
Relevance: Very good

Variable: Aerosol column burden
Instrument type: Moderate-resolution optical imager
Conditions: >=3 channels in VIS and NIR; >=2 channels in the 750 nm band; multi-viewing (>=8 angles); multi-polarisation capability
Relevance: Excellent

Variable: Solar wind velocity
Instrument type: Particle detector
Conditions: detects protons in 0-10 keV over a 2π solid angle, sun pointing; energy spectral resolution <10%; angular resolution <0.2π sr; time resolution <10 s
Relevance: Excellent
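Rules of this kind lend themselves to an explicit encoding that software can evaluate generically against an instrument's recorded properties, which is what makes the rules transparent and independent of the code. A minimal Python sketch of the sea surface temperature rule above; the property names (`type`, `channels`, `freq_ghz`, `polarisations`), the example instrument, and the reading of "multi-polarisation" as three or more polarisations are illustrative assumptions, not OSCAR's actual rule syntax:

```python
def count_channels(instrument, band_ghz, min_polarisations):
    """Count channels whose frequency falls in band_ghz (inclusive)
    and that offer at least min_polarisations polarisations."""
    lo, hi = band_ghz
    return sum(
        1 for ch in instrument["channels"]
        if lo <= ch["freq_ghz"] <= hi and ch["polarisations"] >= min_polarisations
    )

def assess_sst(instrument):
    """Rule from the table: a microwave radiometer with >=2 dual-polarisation
    frequencies in 4-8 GHz and >=1 multi-polarisation frequency (taken here,
    as an assumption, to mean >=3 polarisations) in 8-12 GHz -> 'Very good'."""
    if (instrument["type"] == "Microwave radiometer"
            and count_channels(instrument, (4, 8), 2) >= 2
            and count_channels(instrument, (8, 12), 3) >= 1):
        return "Very good"
    return "Not assessed"

# An illustrative (not real) conically scanning radiometer:
radiometer = {
    "type": "Microwave radiometer",
    "channels": [
        {"freq_ghz": 6.9, "polarisations": 2},
        {"freq_ghz": 7.3, "polarisations": 2},
        {"freq_ghz": 10.65, "polarisations": 4},
    ],
}
print(assess_sst(radiometer))  # Very good
```

Because each condition reads directly off objective instrument properties, an expert group can review or tune the thresholds without touching the evaluation code.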
Proof of concept
A validation tool has been developed in Excel in order to:
- Demonstrate the method
- Refine the specification of the operational version to be implemented in OSCAR
The knowledge base was developed and validated:
- The assessment of all EO instruments was translated into ~1800 expert rules referring to up to 30 properties per instrument type
- The ~600 EO instruments in OSCAR have been assessed against these properties
Promising results:
- No show-stopper was identified
- Results are comparable to or better than in the current OSCAR
- Rules can be tuned for further improvement
The development of the operational version has started.
4. Expected benefits and next steps
Expected benefits
- Improved relevance assessment: noticeable for EO sensors, decisive for space weather sensors
- Enables engaging expert groups to review the rules in their fields of competence, making OSCAR a collaborative resource with potential for improvement, better understanding and shared ownership
- Increases the value and reliability of OSCAR/Space as a reference tool for the RRR and as support for studies and applications
- Enables "interactive capability exploration": a search function based on instrument properties
- Possible training tool: working on the rules for a variable, virtual instruments, etc.
Next steps
1. Scientific aspects
   1. Develop rules and characterize sensors for space weather
2. Software aspects
   1. Implement the new scheme in the operational OSCAR
   2. Develop the user interface for interactive capability exploration
   3. Develop the editor's interface for rules and sensor characteristics
   4. Beta-testing version (OSCAR 2.0) planned for the last quarter of 2015
   5. Baseline for a possible migration to MeteoSwiss
3. User uptake
   1. Promote the beta-testing version to science groups
   2. Investigate a partnership with a training entity (VLab, COMET?)
   3. Define a policy for use (e.g. public/restricted access to the rules)
Expected to be a useful tool to support CGMS activities.
Thank you for your attention!
Please visit: www.wmo.int/oscar
Your feedback is welcome: jlafeuille@wmo.int
Satellite view (e.g. GOES-R) (screenshot)