SBAS and GBAS Integrity for Non-Aviation Users: Moving Away from "Specific Risk"
ION ITM 2011, San Diego, CA, 25 January 2011
Sam Pullen, Todd Walter, and Per Enge, Stanford University

Slide 2: Motivation (1): SBAS and GBAS for Non-Aviation Users
Where augmentation signals can be received, SBAS and GBAS benefits are available to all users. However, the integrity algorithms in the airborne MOPS are designed to support specific aviation applications.
– The resulting integrity protection levels are not well suited for other classes of users.
Correcting this would increase the attractiveness of SBAS and GBAS to non-aviation transport users (auto, rail, marine) and others.

Slide 3: Motivation (2): Accuracy and Integrity
Accuracy bounds (e.g., 95% vertical position error, or VPE) can be measured and modeled with high precision. Integrity bounds (e.g., the vertical protection level, or VPL) cannot be:
– Lack of sufficient measurements
– Flaws in Gaussian extrapolations to low probabilities
– Dependence on details of failure models and assumptions
– Too little is known; too much is uncertain…
[Figure: illustrative example, not to scale or direction, comparing 95% HPE, HPL (per MOPS), and HPL (non-aviation application)]
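A minimal numerical sketch of this contrast (not from the slides; the error sigma and probability levels below are assumed for illustration): a 95% accuracy bound sits roughly two standard deviations out and can be checked directly against a few thousand samples, while a 1e-7-level integrity bound lies beyond five standard deviations and rests almost entirely on the assumed Gaussian tail.

```python
# Sketch only: contrast a measurable 95% accuracy bound with an
# extrapolated 1e-7 integrity bound under a Gaussian error model.
# sigma_v, p_acc, and p_hmi are assumed illustrative values.
from scipy.stats import norm

sigma_v = 0.6          # assumed vertical error sigma (meters)
p_acc = 0.95           # accuracy requirement (two-sided)
p_hmi = 1e-7           # integrity risk level (one-sided, per sample)

k_acc = norm.ppf(0.5 + p_acc / 2)   # ~1.96: verifiable with ~10^3 samples
k_int = norm.ppf(1 - p_hmi)         # ~5.2: needs >>10^7 independent samples

print(f"95% accuracy bound  : {k_acc * sigma_v:.2f} m")
print(f"1e-7 integrity bound: {k_int * sigma_v:.2f} m "
      f"(valid only if the Gaussian tail assumption holds)")
```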

Slide 4: WAAS VPE vs. VPL from FAA PAN Data (3rd Qtr 2010: July – Sept.)
Source: WAAS PAN Report #34, Oct. 2010 (…DisplayArchive.htm)
[Scatter plot of VPE (m) vs. VPL (m)]
– Max. VPE ≈ 7 m (at Barrow, AK)
– 95% VPE ≈ 1.2 m
– 99% VPE ≈ 1.6 m

Slide 5: WAAS Reference Station Classifications (for this study only)
Figure source: FAA GNSS Press Kit
[Map of WAAS reference stations: 7 Inner Stations, 13 Outer Stations, 18 Remote Stations]

Slide 6: Max. VPE and VPL from FAA PAN Data (1 Jan. – 30 Sept. 2010)
[Chart of maximum VPE and VPL]
Worst case between "Inner" and "Outer" WAAS stations → the "InOut" set

Slide 7: Max. HPE and HPL from FAA PAN Data (1 Jan. – 30 Sept. 2010)
[Chart of maximum HPE and HPL]
Worst case between "Inner" and "Outer" WAAS stations → the "InOut" set
– One unusual result: a 12 m error at Cleveland in Spring 2005 (correct number?)
– As expected, both HPE and HPL are significantly lower than VPE and VPL.

Slide 8: Ratio of Max. VPL to Max. VPE from FAA PAN Data ("InOut" Station Set)

Slide 9: Ratio of Max. HPL to Max. HPE from FAA PAN Data ("InOut" Station Set)
– The unusual error at Cleveland (if correct) was only barely exceeded by the HPL.

Slide 10: How Many Samples Were Collected?
All validated PAN data from 1 Jan. to 30 Sept.: 4.25 × 10^9 sec (49,324.6 days, or ≈ 135 years)
– Assume data correlated over 30 sec → 1.4 × 10^8 independent samples
– Assume data correlated over 150 sec (~ one CAT I approach) → 2.8 × 10^7 independent samples
– Assume data correlated over 600 sec (10 min) → 7.1 × 10^6 independent samples
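The arithmetic behind these counts, plus a rough rule of thumb for what they can resolve, can be sketched as follows (the correlation times come from the slide; the 1/N interpretation is an added illustration, not a claim from the presentation):

```python
# Sketch: number of independent samples in the PAN archive under the
# slide's correlation-time assumptions, and (rule of thumb) the smallest
# per-sample probability such a data set could even observe once.
total_seconds = 49_324.6 * 86_400      # 49,324.6 days of validated PAN data

for tau in (30, 150, 600):             # assumed correlation times (seconds)
    n_indep = total_seconds / tau
    print(f"tau = {tau:4d} s -> {n_indep:.1e} independent samples "
          f"(events rarer than ~{1.0 / n_indep:.0e} per sample are unlikely to appear at all)")
```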

Slide 11: Average vs. Specific Risk Assessment
Average risk (my definition): the probability of unsafe conditions based upon the convolved ("averaged") estimated probabilities of all unknown events.
– Probabilistic Risk Analysis (PRA) is based on this procedure.
– Risk aversion and value of information (VOI) are applied to the outputs of PRA → integrity risk requirements, alert limits.
Specific risk (my definition): the probability of unsafe conditions subject to the assumption that all (negative but credible) unknown events that could be known occur with a probability of one.
– Evolved from pre-existing FAA and ICAO safety standards.
– Risk aversion and VOI are buried inside the specific risk analysis.
– The results (risk and protection levels) are inconsistent with PRA.
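A toy numerical contrast of the two definitions (all probabilities below are assumed placeholders, not values from the presentation): average risk weights each event's hazard contribution by its estimated prior, while specific risk effectively sets the prior of the worst credible event to one.

```python
# Toy sketch of "average" vs. "specific" risk. All numbers are assumed
# placeholders. Each entry: (prior probability of the event per exposure,
# P(hazardous conditions | event)).
events = {
    "nominal":            (1.0 - 1e-4 - 1e-5, 1e-9),
    "iono_gradient":      (1e-4,              1e-3),
    "signal_deformation": (1e-5,              1e-2),
}

# Average risk (PRA): convolve priors with conditional hazard probabilities.
average_risk = sum(prior * p_haz for prior, p_haz in events.values())

# Specific risk: every credible adverse event is treated as present
# (probability one), so the worst conditional hazard probability drives the result.
specific_risk = max(p_haz for _, p_haz in events.values())

print(f"average risk  ~ {average_risk:.1e} per exposure")
print(f"specific risk ~ {specific_risk:.1e} per exposure")
```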

Slide 12: Simplified Example: Ionospheric Spatial Decorrelation (1)
[Figure: ionospheric delay maps at 20:15 UT and 21:00 UT]
Severe ionospheric storm observed over CONUS on 20 November 2003

Slide 13: Simplified Example: Ionospheric Spatial Decorrelation (2)
Using PRA, estimated "prior" probabilities of severe decorrelation are combined with the likelihood of SBAS or GBAS mitigation to derive the resulting user risk.
– Prior probabilities need not be known precisely.
– Benefits of improved mitigation ("better information") appear naturally as lower integrity risk.
Under the FAA interpretation of specific risk, the worst-case ionospheric delay gradient is "credible" and thus is assigned a probability of one.
– Worst case for GBAS (CAT I): an extremely large gradient that escapes detection by "matching speed" with the ground station.
  » This differs in real time for each site and GNSS geometry.
– Worst case for SBAS (LPV): a very large gradient that is just small enough to avoid detection by the master station.

Slide 14: Simplified Example: Ionospheric Spatial Decorrelation (3)
[Histogram: PDF of user vertical position error (meters)]
Simulated results for a Memphis GBAS impacted by a severe ionospheric gradient (RTCA 24-SV GPS constellation, 6-km user-to-ground separation, 1- and 2-SV impacts).
– The worst-case error, or "MIEV", is ≈ 41 m.
– Most errors are exactly zero due to ground detection and exclusion, but all zero errors have been removed from the histogram.
– Most plotted (non-zero) errors are below 10 m even under severe conditions.

Slide 15: Benefits of an "Average Risk" Approach (Potential SBAS PL Reduction)
The "average risk" approach supports the large reductions in HPL and VPL implied by WAAS PAN data, pending more complete database analysis.
– Use "full-scale" PRA to re-assess "rare-normal" and faulted errors.
[Chart: 95% VPL, 95% HPL, adjusted VPL, and adjusted HPL (meters) vs. PAN report number; from reports since Jan.; max. 95% PLs among stations in CONUS ("InOut" set)]
Conservative reduction factors from PAN data: VPL / 4.0, HPL / 2.5.
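As a trivial sketch of how those reduction factors would be applied (the input protection levels below are placeholders, not PAN values):

```python
# Sketch only: applying the slide's conservative reduction factors
# (VPL / 4.0, HPL / 2.5). The input 95% protection levels are assumed
# placeholders, not numbers from the PAN reports.
VPL_REDUCTION = 4.0
HPL_REDUCTION = 2.5

def adjusted_protection_levels(vpl_95_m: float, hpl_95_m: float) -> tuple[float, float]:
    """Return (adjusted VPL, adjusted HPL) under the average-risk reduction factors."""
    return vpl_95_m / VPL_REDUCTION, hpl_95_m / HPL_REDUCTION

adj_vpl, adj_hpl = adjusted_protection_levels(vpl_95_m=35.0, hpl_95_m=15.0)
print(f"adjusted VPL ~ {adj_vpl:.1f} m, adjusted HPL ~ {adj_hpl:.1f} m")
```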

Slide 16: A Combined "Average/Specific" Risk Approach
Depending on user and decision-maker risk aversion, separate "average risk" and "specific risk" integrity requirements could be issued.
– Both apply at all times → one or the other will tend to dominate for a particular application.
For example: an "average" integrity risk requirement of … per operation, plus a requirement that a worst-case undetected condition cannot increase the total vehicle loss risk by more than a factor of 10.
– For the aircraft case, a factor-of-10 increase in total risk equates to a specific risk requirement of … per operation for the navigation system (more strictly, 9 × …).
– The specific factors for each vehicle and application would vary.
– There is no "correct" degree of risk aversion.
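The factor-of-10 arithmetic can be illustrated with an assumed baseline (the per-operation numbers on the original slide did not survive extraction, so the value below is purely illustrative): if total risk may grow to at most ten times the baseline, the worst-case navigation contribution may add at most nine times the baseline.

```python
# Sketch of the factor-of-10 rule with an assumed baseline vehicle-loss
# risk; the value below is illustrative, not the number from the slide.
baseline_loss_risk = 1e-7    # assumed total vehicle-loss risk per operation
max_increase_factor = 10.0   # worst-case condition may not raise total risk beyond this factor

# Allowing the total to reach 10x the baseline leaves room for the
# navigation system to add at most (10 - 1) x baseline under the
# worst-case undetected condition.
specific_risk_allocation = (max_increase_factor - 1.0) * baseline_loss_risk

print(f"specific-risk allocation ~ {specific_risk_allocation:.0e} per operation "
      f"(9 x the assumed baseline)")
```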

Slide 17: Summary
Existing integrity assurance procedures for SBAS and GBAS are unique to aviation and its history and may not be suitable for other users.
SBAS (and GBAS) data analysis suggests that HPL and VPL can be greatly reduced if an "average risk" approach is taken.
– Examination of past data is useful, but a more thorough PRA analysis should be conducted.
If worst-case elements of risk assessment are still desired, an average/specific risk mixture can be used.
– This flexible "mixture" capability should satisfy almost any level of user and decision-maker risk aversion.

Slide 18: Backup Slides follow…

Slide 19: WAAS VPE from FAA PAN Data (3rd Qtr 2010: July – Sept.)
Source: WAAS PAN Report #34, Oct. 2010 (…DisplayArchive.htm)
[Histogram: number of samples vs. VPE (m); measurements from 37 WAAS stations]
– Max. VPE ≈ 7 m, at Barrow, AK

Slide 20: Example Error Table from PAN #34
[Table reproduced from PAN Report #34]

Slide 21: Max. VPE and VPL from WAAS PAN Data (1 Jan. – 30 Sept. 2010)
[Table of maximum VPE and VPL; all numbers are in meters]

Slide 22: 95% and Max. VPE from FAA PAN Data (1 Jan. – 30 Sept. 2010)
[Chart: vertical position error (meters) vs. quarterly PAN report number (8 – 34); 95% VPE and max. VPE for Inner, Outer, and Remote stations; VAL for LPV shown for reference]
– Note: VPL always bounds VPE.
– Severe ionospheric scintillation in Alaska in March and May 2007 (the user receiver should prevent this).

Slide 23: WAAS VPE vs. VPL in CONUS (2003 – 2006) (from Wanner et al., 2006)
[Chart: vertical position error (meters) with mean, 1σ, 95%, 99%, 99.9%, and 99.99% VPE traces and ratio annotations]

Slide 24: WAAS Max. VPE in CONUS (2003 – 2006) (from Wanner et al., 2008)

Slide 25: An "Average Risk" Approach to SBAS (and GBAS) – word version
Data imply an "average risk" equivalent VPL for WAAS ~4 – 5 times lower than the current value.
Re-assess "rare-normal" and faulted error models and data to build a "certifiable" safety case:
– Multiple rare-normal ("fault-free") models built from existing data to incorporate remaining uncertainty.
– All fault-mode analyses follow the same approach: estimate prior fault probabilities and probability uncertainties, then simulate all significant variations of each fault type rather than focusing on the "worst case" → convolve with the prior distribution to estimate risk (see the sketch after this slide).
– Faults whose impact is driven by worst-case scenarios (ionosphere, signal deformation) will become less important.
– Multiple-fault scenarios neglected as too improbable may become more important, as probabilistic weighting of risk may show that fault-combination cases are non-negligible.
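A minimal sketch of that "simulate variations, then convolve with the prior" step, assuming an invented fault-magnitude prior, monitor missed-detection model, and hazard model (none of these come from the presentation):

```python
# Sketch only: discretized convolution of an assumed fault-magnitude prior
# with assumed missed-detection and hazard models, contrasted with the
# worst-case ("specific risk") treatment of the same fault.
import numpy as np

# Assumed prior over fault magnitude (e.g., ionospheric gradient severity).
magnitudes = np.array([50.0, 150.0, 300.0, 400.0])   # mm/km, illustrative
priors     = np.array([1e-4, 1e-5, 1e-6, 1e-7])      # per exposure, illustrative

def p_missed_detection(mag: float) -> float:
    """Assumed monitor model: larger faults are easier to detect."""
    return float(np.exp(-mag / 100.0))

def p_hazard_given_undetected(mag: float) -> float:
    """Assumed fraction of undetected cases whose error exceeds the alert limit."""
    return min(1.0, mag / 500.0)

# Average risk: weight each magnitude by prior, missed detection, and hazard probability.
average_risk = sum(p * p_missed_detection(m) * p_hazard_given_undetected(m)
                   for m, p in zip(magnitudes, priors))

# Specific risk: the worst credible magnitude is assumed present (probability one).
worst = float(magnitudes.max())
specific_risk = p_missed_detection(worst) * p_hazard_given_undetected(worst)

print(f"average risk  ~ {average_risk:.1e} per exposure")
print(f"specific risk ~ {specific_risk:.1e} per exposure")
```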

Slide 26: A Combined "Average/Specific" Risk Approach (1)
Derived from the FAA "Hazard Risk Model" (1) and a simplified aircraft accident risk breakdown (2).
[Diagram: hazard severity classes – Catastrophic (likely a/c hull loss), Hazardous (risk of a/c loss; severe loss of safety margin), Major (slight risk of aircraft loss / pilot challenged) – with the overall a/c loss probability broken down into loss prob. due to equipment failure (~10%; ~1%, ~100 systems) and loss prob. due to GNSS nav. failure (~1%)]
(1) FAA System Safety Handbook. (2) R. Kelly and J. Davis, "Required Navigation Performance (RNP)," Navigation, Spring 1994.