1 CARE/ASAS Validation Framework Guidelines & Case Studies - Mark Watson, NATS

2 Contents
- WP4 & MAEVA VGH
- The Validation Framework
- The Case Studies

3 Work Package 4
- Align previous work packages to the MAEVA VGH
- Write guidelines and include Activity 3 case studies
- Guideline Report
- Update EMERALD RTD Plan (presented later)

4 Master ATM European Validation Plan (MAEVA)
- European Commission funded 5th Framework project
- Promotes a common framework for validation of 5th FP ATM projects
- Proposes a top-down approach rather than an enabler-targeted bottom-up approach
- Describes the lifecycle of ATM steps from concept to operational implementation
- Intended for wider adoption throughout Europe

5 WP1: Initial Validation Framework, WP2: System Performance Metrics and WP3: Human Performance Metrics are compared to the MAEVA VGH and, with the Activity 3 case studies included, yield the CARE/ASAS Validation Framework Guidelines.

6 CARE/ASAS Validation Framework ... five steps to enlightenment!
Step 1: Identification Of Validation Aims, Objectives And Hypotheses
Step 2: Validation Design - Plan & Prepare The Validation Exercise
Step 3: Conduct Of Validation Exercise Runs
Step 4: Analysis of the Results
Step 5: Develop and Report Conclusions & Recommendations
... but with 16 actions.

7 Step 1: IDENTIFICATION OF VALIDATION AIMS, OBJECTIVES AND HYPOTHESES
Action 1: Understanding the ATM problem
Action 2: Selection of the ASAS application
Action 3: Identification of stakeholders
Action 4: Identification of validation aims
Action 5: Definition of the high level objectives (HLO)
Action 6: Definition of the low level objectives (LLO)
Action 7: Establishing validation platform requirements and selection of the validation technique
Action 8: Selection of system performance and human performance metrics and hypotheses
Action 9: Definition of the high level experimental design
Action 10: Operational and statistical significance

8 Step 2: Validation Design - Plan & Prepare The Validation Exercise
Action 11: Selection Of The Validation Platform/Tool
Action 12: Scenario Definition
Action 13: Production Of Detailed Experiment Design
Step 3: Conduct Of Validation Exercise Runs
Action 14: Execution
Step 4: Analysis of the Results
Action 15: Results Analysis
Step 5: Develop and Report Conclusions & Recommendations
Action 16: Conclusions & Recommendations
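The five steps and sixteen actions above form a route map that an exercise team works through in order. As a minimal sketch only (the step wording is taken from the slides; the data structure and progress loop are hypothetical, not part of the guidelines), the framework can be held as a simple checklist:

```python
# Hypothetical checklist view of the five-step / sixteen-action framework.
FRAMEWORK = {
    "Step 1: Identification of validation aims, objectives and hypotheses": range(1, 11),
    "Step 2: Validation design - plan & prepare the validation exercise":   range(11, 14),
    "Step 3: Conduct of validation exercise runs":                          range(14, 15),
    "Step 4: Analysis of the results":                                      range(15, 16),
    "Step 5: Develop and report conclusions & recommendations":             range(16, 17),
}

completed = {1, 2, 3}  # invented progress marker for illustration
for step, actions in FRAMEWORK.items():
    done = sum(1 for a in actions if a in completed)
    print(f"{step}: {done}/{len(actions)} actions complete")
```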

9 MAEVA VGH vs. CARE/ASAS VF

10 Case Study Examples: Time Based Sequencing In Approach
- Airborne Separation category
- Sequencing & merging operations from Top of Descent until the Final Approach Fix
- Time is the separation criterion (illustrated in the sketch after this slide)
- Limited separation responsibility delegated to the pilot
- Airborne separation minima may be lower than ATC separation minima
- Mixed levels of ADS-B equipage
- Example airspace: Madrid
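As a toy illustration of time as the separation criterion (not part of the case study material), the check below compares estimated times over a merge fix for a sequence of aircraft against a hypothetical required spacing value.

```python
# Toy illustration of time based spacing at a merge fix. The 90 s requirement
# and the ETA list are invented for this sketch, not taken from the case study.
REQUIRED_SPACING_S = 90

def spacing_violations(etas_s):
    """Return pairs of consecutive aircraft whose ETA gap is below the required spacing."""
    ordered = sorted(etas_s)
    return [
        (lead, trail)
        for lead, trail in zip(ordered, ordered[1:])
        if trail - lead < REQUIRED_SPACING_S
    ]

print(spacing_violations([0, 95, 160, 300]))  # -> [(95, 160)]
```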

11 Case Study Examples: Airborne Self Separation In Segregated En-route Airspace
- Airborne self separation category
- Free flight segregated airspace
- Aircraft fly their preferred route between entry and exit
- Flight crews responsible for self-separation from all aircraft
- Example airspace: Mediterranean

12 Order Of The VF Presentations
- Scenario Template & Database - Juan Alberto Herreria, ISDEFE
- System Performance Metrics - Mike Sharples, QinetiQ
- Case Study (Time Based Sequencing) Actions part 1 - Mark Watson, NATS
- Discussion Forum
- Lunch
- Human Performance Metrics & Experimental Design - Brian Hilburn, NLR
- Case Study Actions part 2 - Mark Watson, NATS

13 Coffee Break

14 Case Study of the Validation Framework: Time Based Sequencing In Approach

15 The Validation Framework
- Step 1: Identification Of Validation Aims, Objectives And Hypotheses (10 actions)
- Step 2: Validation Design - Plan & Prepare The Validation Exercise (2 actions)
- Step 3: Conduct Of Validation Exercise Runs
- Step 4: Analysis of the Results
- Step 5: Develop and Report Conclusions & Recommendations

16 Step 1: IDENTIFICATION OF VALIDATION AIMS, OBJECTIVES AND HYPOTHESES
Action 1: Understanding the ATM problem
Action 2: Selection of the ASAS application
Action 3: Identification of stakeholders
Action 4: Identification of validation aims
Action 5: Definition of the high level objectives (HLO)
Action 6: Definition of the low level objectives (LLO)
Action 7: Establishing validation platform requirements and selection of the validation technique
Action 8: Selection of system performance and human performance metrics and hypotheses

17 Action 9: Definition of the high level experimental design
Action 10: Operational and statistical significance
Step 2: VALIDATION DESIGN - PLAN & PREPARE THE VALIDATION EXERCISE
Action 11: Selection Of The Validation Platform/Tool
Action 12: Scenario Definition
Action 13: Production Of Detailed Experiment Design

18 Step 1: IDENTIFICATION OF VALIDATION AIMS, OBJECTIVES AND HYPOTHESES
Action 1: Understanding the ATM problem
- Constrained capacity on approach within TMA airspace.
Action 2: Selection of the ASAS application
- Increase capacity on approach by aircraft flying the minimum aircraft separation.
- Selected ASAS application is Time Based Sequencing In Approach.
- Separation responsibility should be delegated to the pilot to decrease controller workload.
- Maintain the present level of safety.

19 Step 1: IDENTIFICATION OF VALIDATION AIMS, OBJECTIVES AND HYPOTHESES
Action 3: Identification of stakeholders
- Airline operator
- Pilot
- ATSP
- Airport operator
- ATCO
Action 4: Identification of validation aims
- Assess the application for its effect on capacity in the TMA on approach.
- Assess the impact on controller and pilot workload and TMA capacity.

20 Step 1: IDENTIFICATION OF VALIDATION AIMS, OBJECTIVES AND HYPOTHESES
Action 5: Definition of the High-Level Objectives (HLO)
- Safety
- Capacity
- Economics
Action 6: Identification of the Low-Level Objectives (LLO)
- Airspace throughput
- Controller & pilot workload
- Voice communications
- Conflicts
- Traffic densities

21 Step 1: IDENTIFICATION OF VALIDATION AIMS, OBJECTIVES AND HYPOTHESES
Action 7: Establishing Validation Platform Requirements and Selection of the Validation Technique
- Scope of the ATM system
- Fidelity/resolution
- Geography
- Time-based requirements

22 Step 1: IDENTIFICATION OF VALIDATION AIMS, OBJECTIVES AND HYPOTHESES
Action 8: Identification of System Performance and Human Performance Metrics & Hypotheses (1 of 4)
SYSTEM:
- Planned versus actual flight profiles
- Sector entry/exit
- Conflicts
- Workload per controller
- Number of time based clearances
(A post-processing sketch for two of these metrics follows below.)
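System metrics of this kind would typically be post-processed from the output logs of the fast-time simulation selected later under Action 11. The sketch below is illustrative only: the file names, column names and log format are invented assumptions, not the output of TAAM or any other platform.

```python
# Illustrative post-processing of hypothetical fast-time simulation logs into
# two of the Action 8 system metrics: profile deviation and conflict counts.
import csv
from collections import Counter

def mean_profile_deviation_min(flight_rows):
    """Mean absolute difference (minutes) between planned and actual fix crossing times."""
    diffs = [abs(float(r["actual_time_min"]) - float(r["planned_time_min"]))
             for r in flight_rows]
    return sum(diffs) / len(diffs) if diffs else 0.0

def conflicts_per_run(event_rows):
    """Number of recorded separation infringements, grouped by run identifier."""
    return Counter(r["run_id"] for r in event_rows if r["event_type"] == "conflict")

# Hypothetical usage with two simulator exports
with open("flight_profiles.csv", newline="") as f:
    flights = list(csv.DictReader(f))
with open("events.csv", newline="") as f:
    events = list(csv.DictReader(f))

print("Mean profile deviation (min):", mean_profile_deviation_min(flights))
print("Conflicts per run:", dict(conflicts_per_run(events)))
```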

23 Step 1: IDENTIFICATION OF VALIDATION AIMS, OBJECTIVES AND HYPOTHESES
Action 8: Identification of System Performance and Human Performance Metrics & Hypotheses (2 of 4)
Safety Perspective (capacity & efficiency)

24 Step 1: IDENTIFICATION OF VALIDATION AIMS, OBJECTIVES AND HYPOTHESES
Action 8: Identification of System Performance and Human Performance Metrics & Hypotheses (3 of 4)
ATSP Perspective (capacity & efficiency)

26 Step 1: IDENTIFICATION OF VALIDATION AIMS, OBJECTIVES AND HYPOTHESES
Action 8: Identification of System Performance and Human Performance Metrics & Hypotheses (4 of 4) - IF IT WERE A REAL-TIME SIMULATION!
- Pilot metric to assess peak workload
- Various performance-based and physiology-based objective measures are available
- Physiological measurement of EEG potentials (brainwaves) was dismissed as too intrusive; pupil diameter is therefore chosen (a toy peak-workload calculation is sketched below)
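If the exercise were run as a real-time simulation, the chosen pupil-diameter measure would still need to be reduced to a single peak-workload figure per pilot per run. The sketch below is one hypothetical way of doing that; the sampling rate, smoothing window and baseline period are invented for illustration and are not taken from the guidelines.

```python
# Hypothetical reduction of a pupil diameter trace (mm) to a peak-workload index:
# peak of the smoothed trace relative to a resting baseline at the start of the run.
def peak_workload_index(samples_mm, hz=60, window_s=5, baseline_s=60):
    window = max(1, int(hz * window_s))
    base_n = min(len(samples_mm), hz * baseline_s)
    baseline = sum(samples_mm[:base_n]) / base_n
    smoothed = [sum(samples_mm[i:i + window]) / window
                for i in range(len(samples_mm) - window + 1)]
    return max(smoothed) / baseline  # e.g. 1.15 means 15% above resting diameter

# Hypothetical usage: one index per pilot per measured run
# index = peak_workload_index(pupil_trace_mm)
```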

27 Step 1: IDENTIFICATION OF VALIDATION AIMS, OBJECTIVES AND HYPOTHESES
Action 9: Definition of the High Level Experimental Design
- Initial 2005 baseline sample with no ASAS application to prove representativeness.
- Three measured runs: 2005, 2010, 2015.
- Three levels of separation delegation for each run.
Action 10: Operational and Statistical Significance
- The 2005 measured run with minimum separation delegation should decrease controller and communications workload by 5%.
- All other measured runs must improve on this.
- 95% statistical significance required (one possible check is sketched below).
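The framework does not prescribe a particular test, so the following is only one possible way of checking the 95% criterion: compare per-exercise controller workload scores between the baseline run and a measured run with a two-sample (Welch) t-test at alpha = 0.05. The sample values and variable names below are invented for illustration.

```python
# Hypothetical significance check for the workload hypothesis at the 95% level.
from scipy import stats

baseline_workload = [62.1, 58.4, 65.0, 60.2, 63.7, 59.9]  # baseline run scores (invented)
asas_workload     = [55.3, 54.1, 58.8, 53.0, 56.4, 52.7]  # measured run scores (invented)

t_stat, p_value = stats.ttest_ind(asas_workload, baseline_workload, equal_var=False)

relative_change = (sum(asas_workload) / len(asas_workload)) / (
    sum(baseline_workload) / len(baseline_workload)
) - 1.0

print(f"Workload change: {relative_change:+.1%}")  # operational significance: >= 5% reduction?
print(f"p-value: {p_value:.3f} (significant at the 95% level: {p_value < 0.05})")
```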

28 Step 2: Plan and Prepare the Validation Exercise
Action 11: Selection Of The Validation Platform/Tool
- The MAEVA VGH describes available European platforms and their capabilities.
- TAAM is suitable for addressing the HLOs of Economics and Capacity through fast time simulations.
- Adaptable to the airspace of this validation exercise.
- Safety can be addressed through analysis of the results.
Action 12: Scenario Definition
- Use the scenario template as an aide-mémoire.
- Helps develop a detailed scenario definition document (a structured sketch follows below).
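One way to start a scenario definition document is to capture each planned run as a structured record. The field names and values below are invented for this sketch and are not the CARE/ASAS scenario template itself.

```python
# Hypothetical structured capture of scenario definitions ahead of the
# detailed scenario definition document.
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    airspace: str             # e.g. "Madrid TMA"
    traffic_sample_year: int  # 2005, 2010 or 2015
    delegation_level: str     # assumed levels: "none", "minimum", "medium", "full"
    adsb_equipage_pct: int    # mixed equipage assumption (invented values)
    separation_basis: str     # "time" for this application

runs = [
    Scenario("baseline_2005", "Madrid TMA", 2005, "none", 0, "distance"),
    Scenario("asas_2005_min", "Madrid TMA", 2005, "minimum", 60, "time"),
    Scenario("asas_2010_min", "Madrid TMA", 2010, "minimum", 80, "time"),
]
```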

29 Step 2: Plan and Prepare the Validation Exercise
Action 13: Production Of Detailed Experimental Design
- Detailed planning of the exercise runs.
- Preparation of the Measurement and Analysis Specification.

30 And finally... Conclusions
- The CARE/ASAS Validation Framework closely aligns with the MAEVA VGH (some interim steps differ in order or are tailored).
- A step-by-step route map for the creation of validation exercises for any ASAS application.
- An iterative design process with sufficient detail for organisations with limited ASAS or validation experience.
- Will encourage uniformity of ASAS validations.

31 Forum Discussion

