OptiFrame WP6: Overview, status and contents
Pim van Leeuwen (NLR), Midterm Review Meeting, 24 March 2017, Lancaster University
WP6: Work package overview
WP6. Validation of the OptiFrame approach in normal and disturbance cases
Objectives:
- Identify TBO disturbances and develop affected operational scenarios as a basis for model development and validation test cases.
- Assess the operational implications of the OptiFrame models for normal situations and disturbance cases.
- Assess computed ATFCM solutions for relevant KPIs in normal and disturbance cases.
Period: T0+2 (June 2016) – T0+23 (Feb 2018)
WP6 Tasks
Task 6.1: Identification of disturbances and operational scenarios of TBO
- D12 (report): Disturbances and operational scenarios for TBO (Jan 2017)
- Partners: NLR (lead), ECTRL
Task 6.2: Qualitative assessment of OptiFrame models in normal and disturbance cases
- D13 (report): Qualitative assessment of OptiFrame models in normal and disturbance cases (30 JUN 2017)
- Partners: NLR (lead), ULANC
Task 6.3: Assessment of the OptiFrame computational framework in normal and disturbance cases
- D14 (report): Assessment of the OptiFrame framework in normal and disturbance cases (31 JAN 2018)
- Partners: NLR (lead), CFR
WP6: GANTT Chart
WP6: Interaction with WP3, WP4 and WP5
- WP3: D3.1 DMP (T0+12)
- WP4: D4.1 Mathematical model (T0+12)
- WP5: D5.1 Heuristic algorithm for TBO model (T0+16); D5.2 OptiFrame Computational Framework (T0+20)
- WP6: D6.2/D13 Qualitative assessment of OptiFrame models (T0+16); D6.3/D14 Detailed assessment of the OptiFrame computational framework (T0+23)
WP6: Status
Task 6.1: Identification of disturbances and operational scenarios of TBO
Project scheduling:
- Period: 1 JUN 2016 – 31 JAN 2017
- D12 Disturbances and operational scenarios for TBO: 29 JAN 2017
- Partners: NLR (lead), ECTRL
Steps undertaken:
- Identify and structure disturbances that can affect TBO
- Identify list of KPAs/KPIs
- Identify operational scenarios
- Receive stakeholder feedback: workshop on 5 OCT 2016
- First draft produced (20 DEC 2016) and reviewed by consortium
- Second draft produced (18 JAN 2017) and reviewed by consortium
- Submission (29 JAN 2017)
Task 6.2: Qualitative assessment of OptiFrame models
Project scheduling:
- Period: 1 DEC 2016 – 30 JUN 2017
- D6.2/D13 (report): Qualitative assessment of OptiFrame models in normal and disturbance cases (T0+16): 30 JUN 2017
- Partners: NLR (lead), ULANC
Steps:
- Develop initial ideas on qualitative assessment approach
- First draft produced: 24 FEB 2017
- First draft reviewed by consortium: 10 MAR 2017
- Review comments processed and description of the final model incorporated (partly done)
- Second draft produced (5 JUN 2017)
- Second draft reviewed by consortium (23 JUN 2017)
- Submission (30 JUN 2017)
Task 6.3: Assessment of the OptiFrame computational framework
Project scheduling:
- Period: 1 APR 2017 – 31 JAN 2018
- D14 (report): Quantitative assessment of the KPI performance of the computational framework (T0+23): 31 JAN 2018
- Partners: NLR (lead), CFR
Steps:
- Develop initial ideas on quantitative assessment approach (started)
- Draft performance assessment plan produced: 1 APR 2017 – 30 JUN 2017
- Plan reviewed by consortium: 14 JUL 2017
- Execution and production of draft performance assessment report: 15 JUL 2017 – 30 NOV 2017
- Performance assessment report reviewed by consortium (31 DEC 2017)
- Review comments incorporated (25 JAN 2018)
- Submission of plan and results report as D14 (31 JAN 2018)
Note: this task has not started yet.
Contents: Task 6.1
T6.1: Validation Approach
D12: General Performance Assessment Approach
Approach:
1. Specify validation objectives
2. Define the scope: KPAs and KPIs
3. Identify disturbance categories to the nominal situation
4. Define reference and solution scenarios
5. Refine scenarios for qualitative assessment
6. Perform qualitative assessment (reported on in D13)
7. Refine scenarios for quantitative assessment
8. Perform quantitative assessment (reported on in D14)
Notes:
Step 2: see the SESAR Performance Assessment Methodology for Step 1 (Description of the Methodology for Step 1 Performance Assessment, 08 April 2011, Edition, Project ID B.05.0), p. 13: "An Influence Diagram can be defined as a graphical representation of a probabilistic network for reasoning about decision making under conditions of uncertainty (see Figure 6, taken from SESAR D4 [5]). Influence Diagrams only show relationships between factors, while Influence Models contain formulas to compute impacts."
Three types of scenarios:
1. Reference scenario: without the solution. Could be CASA, or TBO without OptiFrame.
2. Solution scenario without disturbances: nominal scenario.
3. Solution scenario with disturbances: disturbed scenario.
General Assessment Approach (steps)
Inputs from WP3/4/5: D3.1 DMP; D4.1 Mathematical model; D5.1 Heuristic algorithm for TBO model (T0+16); D5.2 OptiFrame Computational Framework (T0+20).
WP6 steps:
1. Specify validation objectives
2. Define the scope: KPAs and KPIs
3. Identify disturbance categories
4. Define reference and solution scenarios
5. Refine scenarios for qualitative assessment
6. Perform qualitative assessment (T0+16)
7. Refine scenarios for quantitative assessment
8. Perform quantitative assessment (T0+23)
Notes: This relates to V0, not V1-V3 (E-OCVM). The assessment provides feedback for further (industrial) research and should be seen as an initial study, not a proof of concept; many assumptions apply.
1. Specify Validation Objectives
Compliance:
- Do OptiFrame solutions comply with all applicable constraints (capacity constraints of sectors/airports, separation constraints, etc.)? (A minimal compliance-check sketch follows below.)
- Do they support flight prioritisation (Fleet Delay Apportionment)?
- Do they support user route preferences (i.e., to what extent do OptiFrame solutions deviate from the users' preferred routes)?
Performance:
- Assess whether solutions can be generated within reasonable computation time.
Scalability:
- Assess solutions for the ECAC-wide area and for an entire day of operations.
- Assess solution generation for a future day of operations with a 25% increase in traffic.
- Assess, for scientific purposes only, whether such solutions can also be generated for a future day of operations with a 100% increase in traffic.
KPA/KPI impact:
- Assess the impact of the OptiFrame solutions on all relevant Key Performance Areas by evaluating the impact on the corresponding KPIs.
Resilience:
- Assess the resilience of OptiFrame solutions to a representative set of typical disturbances (laid down in operational scenarios) and determine the impact of these disturbances on the relevant KPAs/KPIs.
Notes: The 25% increase follows from the feedback received at the 5 October stakeholder workshop, where participants indicated that they considered this increase realistic (and larger increases unrealistic). At the same workshop, participants suggested that a 100% traffic increase might be used for scientific purposes (not reflecting a realistic scenario, but useful to determine the upper performance bound of the OptiFrame solutions). Resilience: see the corresponding section in D7.
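As an illustration of what a compliance check could look like in the later quantitative assessment, the sketch below counts hourly sector entries in a computed plan against declared capacities. It is a minimal, hypothetical example: the data structures (SectorEntry, the flights mapping) and the capacity figures are assumptions for illustration, not the OptiFrame data model.

```python
# Hypothetical compliance-check sketch: sector-capacity violations in a TBO plan.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class SectorEntry:
    sector: str   # sector identifier (placeholder naming)
    hour: int     # hour of day (0-23) at which the flight enters the sector

def check_sector_capacity(flights, capacities):
    """flights: flight_id -> list[SectorEntry]; capacities: sector -> max entries/hour.
    Returns (sector, hour, entries, capacity) for every violated constraint."""
    counts = defaultdict(int)
    for entries in flights.values():
        for e in entries:
            counts[(e.sector, e.hour)] += 1
    return [(sector, hour, n, capacities[sector])
            for (sector, hour), n in counts.items()
            if sector in capacities and n > capacities[sector]]

# Toy example: two flights enter the same sector in the same hour,
# with a (deliberately low) declared capacity of 1 entry per hour.
flights = {
    "FL001": [SectorEntry("SECTOR_A", 9)],
    "FL002": [SectorEntry("SECTOR_A", 9)],
}
print(check_sector_capacity(flights, {"SECTOR_A": 1}))
# -> [('SECTOR_A', 9, 2, 1)]
```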
2. Define the scope: KPAs and KPIs
5 Oct workshop: "Expressed views on the relevance of KPAs in relation to TBO indicate that costs are most important for airspace users. Other relevant KPAs include equity, fuel efficiency, capacity, predictability, punctuality, and flexibility."
KPAs and example KPIs (a small KPI computation sketch follows below):
- Fuel efficiency (average fuel burn per phase of flight)
- Capacity (airspace movements per hour, or airport DEP/ARR per hour)
- Predictability and punctuality (DEP/ARR close to the scheduled times of the flight)
- Flexibility (ability of AUs to modify flight trajectories)
- Cost-effectiveness (e.g. fuel costs, time costs, ATC costs, aircraft costs, pax delay costs, as suggested at the workshop)
- Equity and fairness (equal access to airspace or services, without benefiting any actors over others)
The KPA Safety is defined as out of scope. The justification is that (1) safety in terms of numbers of incidents/accidents is difficult to assess, and (2) safety can be considered guaranteed when all applicable constraints (objective 1) are complied with.
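To make the KPI examples above concrete, the following sketch computes two of them, punctuality and hourly arrival throughput, from a toy flight list. The field names (sched_arr, actual_arr, arr_airport) and the 15-minute tolerance are assumptions for illustration only, not the project's data model.

```python
# Hypothetical KPI sketch: punctuality and arrivals per hour from flight records.
from collections import Counter

def punctuality(flights, tolerance_min=15):
    """Fraction of flights whose arrival deviates from schedule by at most tolerance_min minutes."""
    on_time = sum(1 for f in flights
                  if abs(f["actual_arr"] - f["sched_arr"]) <= tolerance_min)
    return on_time / len(flights) if flights else 0.0

def arrivals_per_hour(flights):
    """Arrival counts per (airport, hour) as a simple throughput/capacity indicator."""
    return Counter((f["arr_airport"], f["actual_arr"] // 60) for f in flights)

# Times in minutes after midnight.
sample = [
    {"arr_airport": "EHAM", "sched_arr": 600, "actual_arr": 612},
    {"arr_airport": "EHAM", "sched_arr": 615, "actual_arr": 650},
]
print(punctuality(sample))        # 0.5
print(arrivals_per_hour(sample))  # Counter({('EHAM', 10): 2})
```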
3. Identify Disturbance Categories
Main disturbance categories affecting 4D trajectories under TBO:
- Uncertainty in weather data (wind, temperature, weather events), leading to uncertainty in ground and airborne predictions
- Uncertainty in the turn-around process due to irregularities with security, passengers, fuelling, aircraft maintenance, baggage handling, and others
- Uncertainty in aircraft performance for ground-based operational actors due to variations in pilots' aircraft handling, airline policies, and unknown aircraft characteristics
- Uncertainty due to interactions between flights, e.g. their sharing of limited resources such as airspace, runways, and ATC capacity
- Uncertainty due to incomplete synchronisation of information between stakeholders (SWIM)
To deal with changing circumstances and residual uncertainty, planned trajectories need to be continuously refined and revised during the execution phase, based on the latest data, observations and predictions, in order to continuously find the optimum balance between the potentially conflicting demands from the different perspectives of the flights (pilots), airlines, airports, flow management, and ATC [26]. Such refinements and revisions may be of three kinds (illustrated in the sketch below):
- Major deviations from a coordinated trajectory, including re-planning of trajectories and trajectory constraints, due to disturbances such as significant unpredicted weather changes, unplanned runway closure, engine failure, and major pre-departure deviations
- Minor deviations from a coordinated trajectory, including some minor adjustments of aircraft performance
- Minor within-tolerance deviations from a coordinated trajectory, which are within the tolerance level of the trajectory and its constraints, implying that only an updated trajectory prediction needs to be synchronised among the stakeholders
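The three kinds of trajectory revisions can be illustrated with a small classification sketch. The tolerance thresholds used below (3 and 15 minutes) are placeholder assumptions; in practice the tolerances would follow from the trajectory and its constraints.

```python
# Hypothetical sketch: classify a trajectory deviation into the three kinds above.
def classify_deviation(deviation_min, tolerance_min=3.0, major_min=15.0):
    """deviation_min: absolute time deviation at a waypoint, in minutes (assumed metric)."""
    if deviation_min <= tolerance_min:
        return "within-tolerance"   # only an updated prediction needs to be synchronised
    if deviation_min <= major_min:
        return "minor"              # minor adjustment of aircraft performance
    return "major"                  # re-planning of trajectory and constraints

print([classify_deviation(d) for d in (1.0, 8.0, 40.0)])
# -> ['within-tolerance', 'minor', 'major']
```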
4. Define Reference and Solution Scenarios
1. "Reference" scenario: nominal OptiFrame scenario, i.e. scenario without disturbances. Two traffic samples:
- Busy summer day 2016 (current-day traffic)
- Forecasted busy day in 2022 (+25% traffic compared to 2016)
2. Solution scenarios: disturbed OptiFrame scenarios, initial pool (encoded illustratively in the sketch below):
- Unforeseen wind changes
- Airspace restrictions
- Airport restrictions
- Aircraft turnaround delay
- Airport closure
- Aircraft performance variations
- Insufficient synchronisation of information
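One possible way to encode this scenario pool for the assessment runs is sketched below: a scenario is a traffic sample plus a list of disturbance events. All class and field names are hypothetical placeholders, not part of the OptiFrame framework.

```python
# Hypothetical scenario encoding: traffic sample + disturbance events.
from dataclasses import dataclass, field

@dataclass
class Disturbance:
    kind: str           # e.g. "airspace_restriction", "turnaround_delay"
    start_min: int      # start time, minutes after midnight
    duration_min: int
    params: dict = field(default_factory=dict)

@dataclass
class Scenario:
    name: str
    traffic_sample: str                 # e.g. "busy_summer_day_2016"
    disturbances: list = field(default_factory=list)

nominal_2016 = Scenario("nominal-2016", "busy_summer_day_2016")
disturbed = Scenario(
    "airspace-restriction-2016", "busy_summer_day_2016",
    [Disturbance("airspace_restriction", start_min=600, duration_min=120,
                 params={"sector": "SECTOR_A", "capacity_factor": 0.5})],
)
print(nominal_2016, disturbed, sep="\n")
```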
5. Refine Operational Scenarios for Qualitative Assessment (T6.2)
For T6.2 we refine from the pool of scenarios based on:
- Capabilities of the WP4 model
- The demands of qualitative assessment (the agent-based modelling and simulation method)
Note: to be done in T6.2.
6. Qualitative Assessment (T6.2)
Inputs: knowledge base (incl. workshop results), TBO planning model, reference and solution scenarios.
Qualitative validation outputs: KPA impact, resilience.
Process: qualitative reasoning about the operational impact of the OptiFrame TBO planning models on relevant KPAs, for:
- the reference/nominal scenario
- the disturbance scenarios
Results:
- Impact on KPAs, insight into model resilience
- Feedback to planning model development
- Input for scoping of the quantitative validation
7. Refine Operational Scenarios for Quantitative Assessment (T6.3)
For T6.3 we refine and select from the pool of scenarios based on:
- Feedback from the 5 Oct workshop
- Outcomes of the qualitative validation
- Capabilities of the WP4 model and the WP5 implementation
Scenario feedback from the workshop (amongst other comments):
- Scenario Unforeseen wind changes: by some "not considered relevant enough when related to upper airspace"
- Scenario Airspace restrictions: considered more relevant if related to military airspace restricting or opening up for civil airspace, radar failure, sudden closure of an airspace sector, or time periods at which routes are open for civil use
- Scenario Airport closure: "(...) is considered by some experts as rather exceptional and might therefore not be so relevant for the purposes of the OptiFrame project"
- Scenario Aircraft performance variations and scenario Insufficient synchronisation of information: scenarios 5 (aircraft performance variations) and 6 (insufficient synchronisation) should be removed according to some stakeholders
Note: to be done in T6.3.
8. Quantitative Assessment (T6.3)
Inputs: calculated TBO plans; reference, nominal and selected disturbance scenarios; suitable validation tools.
Quantitative validation outputs: compliance, performance, scalability, KPA impact, resilience.
Process: quantitative evaluation of the operational impact of the optimized TBO plans, using suitable validation tools, on the selected KPAs, for:
- the reference scenario
- the nominal scenario
- the selected disturbance scenarios
Results (see the sketch below):
- Compliance / performance / scalability / KPA impact / resilience
- Input for the final evaluation of the OptiFrame methods
Note: to be done in T6.3.
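A minimal sketch of the kind of output this task could produce is shown below: the relative change of each KPI between the nominal run and a disturbed run, as a simple resilience indicator. The KPI names and values are made-up placeholders, not project results.

```python
# Hypothetical resilience indicator: relative KPI change, nominal vs. disturbed run.
def kpi_impact(nominal: dict, disturbed: dict) -> dict:
    """Relative change per KPI: (disturbed - nominal) / nominal, for non-zero baselines."""
    return {k: (disturbed[k] - v) / v for k, v in nominal.items() if v}

nominal_kpis   = {"avg_delay_min": 4.0, "fuel_per_flight_kg": 2500.0}
disturbed_kpis = {"avg_delay_min": 6.0, "fuel_per_flight_kg": 2550.0}
print(kpi_impact(nominal_kpis, disturbed_kpis))
# -> {'avg_delay_min': 0.5, 'fuel_per_flight_kg': 0.02}
```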
QUESTIONS?