OptiFrame WP6: Overview, status and contents

Presentation transcript:

OptiFrame WP6: Overview, status and contents
Pim van Leeuwen, NLR
Midterm Review Meeting, 24 March 2017, Lancaster University

WP6: Work package overview

WP6: Validation of the OptiFrame approach in normal and disturbance cases

Objectives:
- Identify TBO disturbances and develop the affected operational scenarios as a basis for model development and validation test cases.
- Assess the operational implications of the OptiFrame models in normal situations and disturbance cases.
- Assess the computed ATFCM solutions against the relevant KPIs in normal and disturbance cases.

Period: T0+2 (June 2016) – T0+23 (Feb 2018)

WP6 Tasks

Task 6.1: Identification of disturbances and operational scenarios of TBO
- D12 (report): Disturbances and operational scenarios for TBO (Jan 2017)
- Partners: NLR (lead), ECTRL

Task 6.2: Qualitative assessment of OptiFrame models in normal and disturbance cases
- D13 (report): Qualitative assessment of OptiFrame models in normal and disturbance cases (30 JUN 2017)
- Partners: NLR (lead), ULANC

Task 6.3: Assessment of the OptiFrame computational framework in normal and disturbance cases
- D14 (report): Assessment of the OptiFrame framework in normal and disturbance cases (31 JAN 2018)
- Partners: NLR (lead), CFR

WP6: Gantt chart
OptiFrame telecon, 23 November 2016

WP6: Interaction with WP3, WP4 and WP5

- WP3 – D3.1: DMP (T0+12)
- WP4 – D4.1: Mathematical model (T0+12)
- WP5 – D5.1: Heuristic algorithm for the TBO model (T0+16); D5.2: OptiFrame Computational Framework (T0+20)
- WP6 – D6.2/D13: Qualitative assessment of OptiFrame models (T0+16); D6.3/D14: Detailed assessment of the OptiFrame computational framework (T0+23)

WP6: Status

Task 6.1: Identification of disturbances and operational scenarios of TBO

Project scheduling:
- Period: 1 JUN 2016 – 31 JAN 2017
- D12, Disturbances and operational scenarios for TBO: 29 JAN 2017
- Partners: NLR (lead), ECTRL

Steps undertaken:
- Identify and structure the disturbances that can affect TBO
- Identify the list of KPAs/KPIs
- Identify operational scenarios
- Receive stakeholder feedback: 5 OCT 2016 workshop
- First draft produced (20 DEC 2016) and reviewed by consortium
- Second draft produced (18 JAN 2017) and reviewed by consortium
- Submission (29 JAN 2017)

Task 6.2: Qualitative assessment of OptiFrame models

Project scheduling:
- Period: 1 DEC 2016 – 30 JUN 2017
- D6.2/D13 (report): Qualitative assessment of OptiFrame models in normal and disturbance cases (T0+16): 30 JUN 2017
- Partners: NLR (lead), ULANC

Steps:
- Develop initial ideas on the qualitative assessment approach
- First draft produced: 24 FEB 2017
- First draft reviewed by consortium: 10 MAR 2017
- Review comments processed and description of the final model incorporated
- Second draft produced (5 JUN 2017)
- Second draft reviewed by consortium (23 JUN 2017)
- Submission (30 JUN 2017)

Task 6.3: Assessment of the OptiFrame computational framework

Project scheduling:
- Period: 1 APR 2017 – 31 JAN 2018
- D14 (report): Quantitative assessment of the KPI performance of the computational framework (T0+23): 31 JAN 2018
- Partners: NLR (lead), CFR

Steps:
- Develop initial ideas on the quantitative assessment approach (started)
- Draft performance assessment plan produced: 1 APR 2017 – 30 JUN 2017
- Plan reviewed by consortium: 14 JUL 2017
- Execution and production of the draft performance assessment report: 15 JUL 2017 – 30 NOV 2017
- Performance assessment report reviewed by consortium (31 DEC 2017)
- Review comments incorporated (25 JAN 2018)
- Submission of plan and results report as D14 (31 JAN 2018)

Note: this task has not started yet.

Contents: Task 6.1

T6.1: Validation Approach — D12: General Performance Assessment Approach

Approach:
1. Specify validation objectives
2. Define the scope: KPAs and KPIs
3. Identify disturbance categories to the nominal situation
4. Define reference and solution scenarios
5. Refine scenarios for qualitative assessment
6. Perform qualitative assessment (reported in D13)
7. Refine scenarios for quantitative assessment
8. Perform quantitative assessment (reported in D14)

For step 2, see the SESAR Performance Assessment Methodology for Step 1 (Description of the Methodology for Step 1 Performance Assessment, 08 April 2011, Edition 00.01.00, Project ID B.05.0, p. 13): "An Influence Diagram can be defined as a graphical representation of a probabilistic network for reasoning about decision making under conditions of uncertainty (see Figure 6, taken from SESAR D4 [5]). Influence Diagrams only show relationships between factors, while Influence Models contain formulas to compute impacts."

Three types of scenarios:
1. Reference scenario: without the solution. This could be CASA, or TBO without OptiFrame.
2. Solution scenario without disturbances: the nominal scenario.
3. Solution scenario with disturbances: the disturbed scenario.

General Assessment Approach (steps)

Inputs from WP3/4/5:
- D3.1: DMP
- D4.1: Mathematical model
- D5.1: Heuristic algorithm for TBO (T0+16)
- D5.2: OptiFrame Computational Framework (T0+20)

WP6 steps:
1. Specify validation objectives
2. Define the scope: KPAs and KPIs
3. Identify disturbance categories
4. Define reference and solution scenarios
5. Refine scenarios for qualitative assessment
6. Perform qualitative assessment (T0+16)
7. Refine scenarios for quantitative assessment
8. Perform quantitative assessment (T0+23)

Notes: This relates to V0, not V1–V3 (E-OCVM). The assessment provides feedback for further (industrial) research; it should be seen as an initial study, not a proof of concept. Many assumptions apply.

1. Specify Validation Objectives

Compliance:
- Do OptiFrame solutions comply with all applicable constraints (capacity constraints of sectors/airports, separation constraints, etc.)?
- Do they support flight prioritisation (Fleet Delay Apportionment)?
- Do they support user route preferences (e.g. to what extent do OptiFrame solutions deviate from the users' preferred routes)?

Performance:
- Assess whether solutions can be generated within reasonable computation time.

Scalability:
- Assess solutions for the ECAC-wide area and for an entire day of operations.
- Assess solution generation for a future day of operations with a 25% increase in traffic.
- Assess, for scientific purposes only, whether such solutions can also be generated for a future day of operations with a 100% increase in traffic.

KPA/KPI Impact:
- Assess the impact of the OptiFrame solutions on all relevant Key Performance Areas by evaluating the impact on the corresponding KPIs.

Resilience:
- Assess the resilience of OptiFrame solutions to a representative set of typical disturbances (laid down in operational scenarios) and determine the impact of these disturbances on the relevant KPAs/KPIs.

Notes: The 25% increase follows from feedback received at the 5 October stakeholder workshop, where participants indicated that they considered this increase realistic (and larger increases unrealistic). Workshop participants also suggested that a 100% increase in traffic might be used for scientific purposes: not a realistic scenario, but useful to determine the upper performance bound of the OptiFrame solutions. On resilience, see section 4.2.3 in D7.

2. Define the Scope: KPAs and KPIs

5 Oct workshop: "Expressed views on the relevance of KPAs in relation to TBO indicate that costs are most important for airspace users. Other relevant KPAs include equity, fuel efficiency, capacity, predictability, punctuality, and flexibility."

KPAs and example KPIs:
- Fuel efficiency (average fuel burn per phase of flight)
- Capacity (airspace movements per hour, or airport departures/arrivals per hour)
- Predictability and punctuality (departures/arrivals close to the scheduled times of flight)
- Flexibility (ability of airspace users to modify flight trajectories)
- Cost-effectiveness (e.g. fuel costs, time costs, ATC costs, aircraft costs, passenger delay costs, as suggested at the workshop)
- Equity and fairness (equal access to airspace or services, without benefiting any actors over others)

The KPA Safety is out of scope, on the grounds that (1) safety in terms of numbers of incidents/accidents is difficult to assess, and (2) safety can be considered guaranteed when all applicable constraints (objective 1) are complied with.
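To make the example KPIs concrete, the following is a minimal sketch (not from the project; the flight records, field names and thresholds are purely illustrative) of how a punctuality KPI and a fuel-efficiency KPI could be computed from per-flight data:

```python
from statistics import mean

# Hypothetical flight records: scheduled vs. actual departure times (minutes
# since midnight) and fuel burn per phase of flight (kg). Illustrative only.
flights = [
    {"sched_dep": 600, "actual_dep": 602, "fuel": {"climb": 900, "cruise": 2400, "descent": 300}},
    {"sched_dep": 615, "actual_dep": 630, "fuel": {"climb": 950, "cruise": 2600, "descent": 320}},
    {"sched_dep": 700, "actual_dep": 700, "fuel": {"climb": 880, "cruise": 2300, "descent": 290}},
]

def punctuality(flights, tolerance_min=5):
    """KPA punctuality: share of flights departing within `tolerance_min` of schedule."""
    on_time = [f for f in flights if abs(f["actual_dep"] - f["sched_dep"]) <= tolerance_min]
    return len(on_time) / len(flights)

def avg_fuel_per_phase(flights, phase):
    """KPA fuel efficiency: average fuel burn for one phase of flight."""
    return mean(f["fuel"][phase] for f in flights)

print(punctuality(flights))               # 2 of 3 flights depart within 5 minutes
print(avg_fuel_per_phase(flights, "cruise"))
```

The same pattern extends to the other KPAs: each KPI is a small aggregate over the flight set, so reference, nominal and disturbed scenarios can be compared on identical metrics.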

3. Identify Disturbance Categories

Main disturbance categories affecting 4D trajectories under TBO:
- Uncertainty in weather data (wind, temperature, weather events), leading to uncertainty in ground and airborne predictions
- Uncertainty in the turn-around process, due to irregularities with security, passengers, fuelling, aircraft maintenance, baggage handling, and others
- Uncertainty in aircraft performance for ground-based operational actors, due to variations in pilots' aircraft handling, airline policies, and unknown aircraft characteristics
- Uncertainty due to interactions between flights, e.g. their sharing of limited resources such as airspace, runways, and ATC capacity
- Uncertainty due to incomplete synchronisation of information between stakeholders (SWIM)

To deal with changing circumstances and residual uncertainty, planned trajectories need to be continuously refined and revised during the execution phase, based on the latest data, observations and predictions, in order to continuously find the optimum balance between the potentially conflicting demands of flights (pilots), airlines, airports, flow management, and ATC [26].

Such refinements and revisions may be of three kinds:
- Major deviations from a coordinated trajectory, including re-planning of trajectories and trajectory constraints, due to disturbances such as significant unpredicted weather changes, unplanned runway closure, engine failure, and major pre-departure deviations
- Minor deviations from a coordinated trajectory, including some minor adjustments of aircraft performance
- Minor within-tolerance deviations from a coordinated trajectory, which are within the tolerance level of the trajectory and its constraints, implying that only an updated trajectory prediction needs to be synchronised among the stakeholders
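As one illustration of the weather-uncertainty category, here is a hypothetical sketch (not the project's actual disturbance model) of injecting wind-prediction error into a scenario as a perturbation that accumulates along a trajectory's planned sector-entry times:

```python
import random

def apply_wind_disturbance(planned_times, sigma_min=2.0, seed=42):
    """Perturb planned sector-entry times (minutes) with a Gaussian error
    that accumulates downstream, mimicking wind-prediction uncertainty:
    later waypoints inherit the drift of earlier ones."""
    rng = random.Random(seed)  # fixed seed for reproducible scenarios
    drift = 0.0
    perturbed = []
    for t in planned_times:
        drift += rng.gauss(0.0, sigma_min)  # error grows along the route
        perturbed.append(t + drift)
    return perturbed

planned = [10.0, 25.0, 40.0, 60.0]
print(apply_wind_disturbance(planned))
```

Fixing the seed keeps a disturbed scenario reproducible across assessment runs, which matters when comparing model variants on the same disturbance realisation.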

4. Define Reference and Solution Scenarios

1. "Reference" scenario — nominal OptiFrame scenario, without disturbances. Two traffic samples:
- Busy summer day in 2016 (current-day traffic)
- Forecast busy day in 2022 (+25% traffic compared to 2016)

2. Solution scenarios — disturbed OptiFrame scenarios, initial pool:
- Unforeseen wind changes
- Airspace restrictions
- Airport restrictions
- Aircraft turnaround delay
- Airport closure
- Aircraft performance variations
- Insufficient synchronisation of information
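For intuition, a +25% traffic sample could be approximated by cloning a random quarter of a baseline day's flights. This is a hypothetical sketch only; the project's 2022 sample would rather come from an actual traffic forecast:

```python
import random

def scale_traffic(flights, factor=1.25, seed=1):
    """Approximate a future-day sample by cloning a random (factor - 1)
    share of the baseline flights; clones get a marked callsign so they
    remain distinguishable from the originals."""
    rng = random.Random(seed)
    extra = rng.sample(flights, k=round(len(flights) * (factor - 1.0)))
    return flights + [dict(f, callsign=f["callsign"] + "_C") for f in extra]

baseline = [{"callsign": f"FL{i:03d}"} for i in range(100)]
future = scale_traffic(baseline)
print(len(future))  # 125 flights: the baseline plus 25% clones
```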

5. Refine Operational Scenarios for Qualitative Assessment (T6.2)

For T6.2 we refine from the pool of scenarios based on:
- The capabilities of the WP4 model
- The demands of the qualitative assessment method (agent-based modelling and simulation)

Note: to be done in T6.2.

6. Qualitative Assessment (T6.2)

Inputs: knowledge base (incl. workshop results), the TBO planning model, and the reference and solution scenarios.

Process: qualitative reasoning about the operational impact of the OptiFrame TBO planning models on the relevant KPAs, for:
- The reference/nominal scenario
- The disturbance scenarios

Results:
- Impact on KPAs and insight into model resilience
- Feedback to planning model development
- Input for scoping the quantitative validation

7. Refine Operational Scenarios for Quantitative Assessment (T6.3)

For T6.3 we refine and select from the pool of scenarios based on:
- Feedback from the 5 Oct workshop
- Outcomes of the qualitative validation
- Capabilities of the WP4 model and the WP5 implementation

Scenario feedback from the workshop (amongst other comments):
- Unforeseen wind changes: by some "not considered relevant enough when related to upper airspace"
- Airspace restrictions: considered more relevant if related to military airspace restricting or opening up for civil use, radar failure or sudden closure of an airspace sector, or the time periods at which routes are open for civil use
- Airport closure: "(...) is considered by some experts as rather exceptional and might therefore not be so relevant for the purposes of the OptiFrame project"
- Aircraft performance variations and insufficient synchronisation of information: scenarios 5 and 6 should be removed according to some stakeholders

Note: to be done in T6.3.

8. Quantitative Assessment (T6.3)

Inputs: validation tools, the calculated TBO plans, and the reference, nominal and selected disturbance scenarios.

Process: quantitative evaluation of the operational impact of the optimised TBO plans, using suitable validation tools, on selected KPAs, for:
- The reference scenario
- The nominal scenario
- The selected disturbance scenarios

Results:
- Compliance, performance, scalability, KPA impact, and resilience
- Input for the final evaluation of the OptiFrame methods

Note: to be done in T6.3.
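The compliance part of this step can be pictured as a check of hourly sector-entry counts against declared capacities. This is an illustrative sketch, not the project's validation tooling; the sector names and capacity figures are invented:

```python
from collections import Counter

def capacity_violations(entries, capacity):
    """Count sector entries per (sector, hour) bucket and return the buckets
    that exceed the sector's declared hourly entry capacity.
    entries: list of (sector, entry_time_in_minutes)
    capacity: dict mapping sector -> max entries per hour"""
    counts = Counter((sector, minute // 60) for sector, minute in entries)
    return {key: n for key, n in counts.items() if n > capacity[key[0]]}

# Invented example: three entries into SECTOR_A in hour 0 against a capacity of 2.
entries = [("SECTOR_A", 5), ("SECTOR_A", 20), ("SECTOR_A", 50), ("SECTOR_B", 70)]
capacity = {"SECTOR_A": 2, "SECTOR_B": 10}
print(capacity_violations(entries, capacity))  # {('SECTOR_A', 0): 3}
```

An empty result would indicate that a computed TBO plan satisfies this particular constraint family; analogous checks would cover airport and separation constraints.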

QUESTIONS?