1 Integrity Overview
Todd Walter, Stanford University
Bruce DeCleene, FAA

2 Purpose
Provide a description of the process the FAA used to approve the design of the safety monitors in WAAS

3 Overall Philosophy
Traditional Differential GPS Systems Rely on Lack of Disproof
Anecdotal evidence of no problems
10^-7 Integrity Requires Active Proof
Analysis, Simulation, and Data Must Each Support Each Other
None sufficient by themselves
Clear Documentation of Safety Rationale is Essential

4 Integrity Triad
Data cannot prove 10^-7
Theory may miss “real-world” effects
Simulation only tests specific scenarios
All 3 together!
Data supports theory
Theory extends data
Simulation validates implementation

5 Lessons Learned through WIPP Process
Integrity Requirement of 10^-7 Applies to Each and Every Approach
Threat Models Required to Judge System Performance and Safety
System Must Be Proven Safe
Rationale/evidence for safety claim
Small Probabilities Are Not Intuitive
Probabilities should be calculated before being dismissed as sufficiently remote

6 Interpretation of “Probability of HMI < 10^-7 Per Approach”
Possible Interpretations:
Ensemble Average of All Approaches Over Space and Time
Ensemble Average of All Approaches Over Time for the Worst Location
Previous Plus No Discernible Pattern (Rare & No Correlation With User Behavior)
Worst Time and Location

7 HMI Vs. Time & Space

8 WIPP Interpretation
Events Handled Case by Case
Events That Are Rare and Random May Take Advantage of an A Priori Probability
Deterministic Events Must Be Monitored or Treated as Worst-Case
Events That Are Observable Must Be Detected (If PHMI > 10^-7)
Must Account for Worst-Case Undetected Events

9 Ionospheric Example
Ionospheric Storms Will Occur
Onset of Storm Is Rare and Random
A priori used in storm detector
Ionospheric Storm Is Observable
Must have a monitor
Before detector trips, assume ionosphere is in a near-storm state
After detector trips, assume worst-case ionospheric state
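The fielded detector is described in the “Robust Detection of Ionospheric Irregularities” reference in the Resources. As a simplified sketch of the underlying idea only, and not the WAAS algorithm itself, the code below fits a local plane to vertical delay samples and trips on a chi-square test of the residuals; all noise values and thresholds are hypothetical.

```python
import numpy as np
from scipy.stats import chi2

def planar_fit_irregularity_test(east_km, north_km, delay_m, sigma_m, p_fa=1e-3):
    """Chi-square test on the residuals of a weighted planar fit to
    vertical ionospheric delays. Returns (tripped, statistic, threshold)."""
    A = np.column_stack([np.ones_like(east_km), east_km, north_km])
    W = np.diag(1.0 / sigma_m**2)
    # Weighted least-squares planar fit: delay = a0 + a1*east + a2*north
    coef, *_ = np.linalg.lstsq(np.sqrt(W) @ A, np.sqrt(W) @ delay_m, rcond=None)
    resid = delay_m - A @ coef
    stat = float(np.sum((resid / sigma_m) ** 2))
    dof = len(delay_m) - 3
    thresh = chi2.ppf(1.0 - p_fa, dof)
    return stat > thresh, stat, thresh

# Hypothetical quiet-day delay samples around a grid point
rng = np.random.default_rng(1)
e, n = rng.uniform(-500, 500, 12), rng.uniform(-500, 500, 12)
delays = 3.0 + 0.002 * e + 0.001 * n + rng.normal(0, 0.2, 12)
print(planar_fit_irregularity_test(e, n, delays, np.full(12, 0.2)))
```

When the residuals are consistent with the planar model the statistic stays near its degrees of freedom and the detector does not trip; a storm inflates the residuals and drives the statistic past the threshold.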

10 Threat Models
Not Fully Defined Prior to WIPP
Limit Extent of Threats
Provide description and likelihood
Basis for Judging Completeness
Monitors must be shown to mitigate the full threat space to sufficient probability
Models Must Be Comprehensive
Collectively, the full set must span all feared events

11 SBAS Threats (list is not comprehensive)
Satellite: Clock/Ephemeris Error; Signal Deformation (Nominal, Faulted); Code-Carrier Incoherency
Ionosphere: Local Non-Planar Behavior (Well-sampled, Undersampled)
Troposphere
Receiver: Multipath; Thermal Noise; Antenna Bias; Survey Errors; Receiver Errors
Master Station: SV Clock/Ephemeris Estimate Errors; Ionospheric Estimation Errors; SV Tgd Estimate Errors; Receiver IFB Estimate Errors; WRS Clock Estimate Errors; Communication Errors; Broadcast Errors
User Errors

12 CONUS Ionosphere Threat
Not Well-Modeled by Local Planar Fit
Ionosphere well-sampled [1]
Ionosphere poorly sampled [2]
Ionosphere Changes Over the Lifetime of the Correction
User Interpolation Introduces Error
[1] “Robust Detection of Ionospheric Irregularities,” Walter et al., ION GPS 2000
[2] “The WAAS Ionospheric Threat Model,” Sparks et al., Beacon Symposium 2001
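The user interpolation step is a four-point bilinear scheme: the delay at the ionospheric pierce point is a weighted sum of the four surrounding grid-point delays. A minimal sketch; the corner ordering and grid values here are illustrative, not the MOPS message layout:

```python
def interpolate_iono_delay(x_pp, y_pp, tau):
    """Bilinear interpolation of vertical ionospheric delay at a pierce point.

    x_pp, y_pp in [0, 1] locate the pierce point within its grid cell;
    tau holds the delays at the four corner IGPs in (SW, SE, NE, NW) order
    (an assumed ordering for this sketch).
    """
    w = [(1.0 - x_pp) * (1.0 - y_pp),  # SW
         x_pp * (1.0 - y_pp),          # SE
         x_pp * y_pp,                  # NE
         (1.0 - x_pp) * y_pp]          # NW
    return sum(wi * ti for wi, ti in zip(w, tau))

# Pierce point 30% east and 70% north of the SW corner; delays in meters
print(interpolate_iono_delay(0.3, 0.7, [2.0, 2.4, 3.1, 2.6]))
```

Whenever the true ionosphere is not planar across the cell, this interpolation itself contributes error that the broadcast confidences must cover.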

13 Rationale/Evidence for Safety
System must be proven safe
Each threat requires a mitigation
Must be agreed to by the entire group
Must be fully and clearly documented for future reference and modification
Small probabilities must be calculated
Product of a priori probability and missed-detection rate must meet its allocation
Sum of all threats must meet the total 10^-7 per approach requirement
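A minimal sketch of the budget arithmetic (threat names and numbers are hypothetical): each threat contributes the product of its a priori probability and its missed-detection probability, and the sum across threats must stay within the 10^-7 per-approach requirement.

```python
# (a priori per approach, P(missed detection)) per threat -- hypothetical values
threats = {
    "satellite clock/ephemeris fault": (1e-4, 1e-4),
    "signal deformation":              (1e-5, 1e-3),
    "ionospheric storm":               (1e-3, 1e-5),
    "code-carrier incoherency":        (1e-5, 1e-4),
}

p_hmi = sum(a_priori * p_md for a_priori, p_md in threats.values())
print(f"P(HMI) = {p_hmi:.2e}  (requirement: 1e-7 per approach)")
assert p_hmi < 1e-7, "integrity budget exceeded"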

14 Fault Tree
PHMI is calculated using a fault tree
Two main algorithm branches:
Single fault dominates
Multiple faults convolve together
Cases are mutually exclusive
Single-fault analysis worries about the tails of the distribution
Multiple-fault analysis worries about the means
Separate monitors and analyses
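The "convolve together" step can be illustrated directly: the density of a sum of independent errors is the convolution of their densities, so means and variances add. A minimal numerical sketch with hypothetical sigmas:

```python
import numpy as np

x = np.linspace(-10, 10, 2001)
dx = x[1] - x[0]
gauss = lambda s: np.exp(-x**2 / (2 * s**2)) / (s * np.sqrt(2 * np.pi))

# Discrete convolution of two independent error densities (sigma = 1 and 2)
combined = np.convolve(gauss(1.0), gauss(2.0)) * dx
xc = np.linspace(-20, 20, combined.size)  # support of the convolution

mean = np.sum(xc * combined) * dx
var = np.sum((xc - mean) ** 2 * combined) * dx
print(f"mean {mean:.3f}, variance {var:.3f} (expected 0 and 1^2 + 2^2 = 5)")
```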

15 Data Analysis
Large amount of data is essential for verifying performance
Under many different conditions
Data must be sliced to look for systematic errors
Slices partition data into logical subsets
Multipath data may be sliced by elevation angle
Iono data may be sliced by time of day
Errors are not stationary
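A minimal sketch of slicing on synthetic data: bin multipath residuals by elevation angle and compare per-bin statistics, since low-elevation multipath is typically larger. The error model below is purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
elev = rng.uniform(5.0, 90.0, n)  # elevation angles, degrees
# Synthetic multipath residuals whose sigma grows at low elevation
resid = rng.normal(0.0, 0.1 + 0.5 * np.exp(-elev / 15.0))

for lo in range(5, 85, 10):
    sel = (elev >= lo) & (elev < lo + 10)
    print(f"elev {lo:2d}-{lo + 10:2d} deg: "
          f"n = {int(sel.sum()):5d}, sigma = {resid[sel].std():.3f} m")
```

A single pooled sigma would hide the elevation dependence; the slices expose it, which is exactly the systematic structure the analysis is looking for.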

16 Overbound
True distributions are messy
Require an analytically tractable distribution (Gaussians are good)
Analytic distribution must predict a probability of large errors at least as great as that of the true distribution
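A minimal numerical check of that condition on a synthetic sample: a candidate Gaussian overbound sigma passes only if its two-sided tail probability meets or exceeds the empirical tail frequency at every threshold. The mixture sample and candidate sigmas are hypothetical.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(2)
n = 200_000
# Synthetic "messy" errors: mostly N(0,1) with 2% drawn from N(0, 2.5^2)
err = np.where(rng.random(n) < 0.02,
               rng.normal(0, 2.5, n), rng.normal(0, 1.0, n))

thresholds = np.linspace(0.5, 6.0, 56)
for sigma_ob in (1.1, 1.8):
    ok = all(erfc(t / (sigma_ob * sqrt(2))) >= np.mean(np.abs(err) > t)
             for t in thresholds)
    print(f"sigma = {sigma_ob}: overbounds the sampled tails: {ok}")
```

The smaller sigma matches the bulk of the data but fails in the tails; the inflated sigma sacrifices tightness in the core to stay conservative everywhere, which is the basic trade in overbounding.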

17 Gaussian Tail Behavior
Zero-Mean, Unit-Variance Gaussian
Probability that |X| > K is shown in plot
Probability of Error Falls Off Rapidly As Magnitude Increases
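The plotted curve is the two-sided tail probability of a standard Gaussian, P(|X| > K) = erfc(K/sqrt(2)); a few reference values:

```python
from math import erfc, sqrt

for K in range(1, 7):
    print(f"P(|X| > {K}) = {erfc(K / sqrt(2)):.3e}")
```

Note that K of about 5.33 corresponds to a two-sided tail probability near 1e-7, which is the basis of the vertical multiplier used in SBAS protection levels.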

18 Overbounding Tools
CDF Bounding (DeCleene 2000): real distributions must be zero-mean, symmetric, and unimodal
Gaussian Bounding (Raytheon 2002): allows non-zero means, but still requires symmetry and unimodality
Paired Bounding (Rife 2004): allows asymmetry, multiple modes, and non-zero means
Excess-Mass Bounding (Rife 2004): similar to paired bounding
Core Bounding (Rife 2004): advantages as above, but allows for uncertain tails
Moment Bounding (Raytheon 2002): unlike the others, operates in the moment domain; less intuitive, but works well with real data
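As an illustration of how the paired approach handles asymmetry, the sketch below checks the paired-overbound condition numerically: a left bound G_L must sit at or above the empirical CDF everywhere, and a right bound G_R at or below it. The biased mixture sample and the bound parameters are hypothetical.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(3)
# Asymmetric, biased error sample: a two-component Gaussian mixture
sample = np.sort(np.concatenate([rng.normal(0.5, 1.0, 60_000),
                                 rng.normal(-0.3, 0.6, 40_000)]))

def ecdf(x):
    """Empirical CDF of the sorted sample, evaluated at points x."""
    return np.searchsorted(sample, x, side="right") / sample.size

b, s = 1.0, 1.0                    # hypothetical bound shift and sigma
x = np.linspace(-3.0, 3.0, 601)    # check over the well-sampled core
left_ok = np.all(norm.cdf(x, loc=-b, scale=s) >= ecdf(x))   # G_L above F
right_ok = np.all(norm.cdf(x, loc=+b, scale=s) <= ecdf(x))  # G_R below F
print(f"paired overbound holds on the sampled core: {left_ok and right_ok}")
```

A finite sample can only validate the bound over its well-sampled region; the far tails still need an analytic or threat-model argument, which is part of the motivation for the excess-mass and core variants.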

19 Overbounding Tools
[Figure: example plots of Single CDF, Paired CDF, Excess-Mass CDF, Excess-Mass PDF, Moment Bounding, and Core Bounding]

20 Overbound Validation
Key characteristic: all of these techniques enable overbound validation
If the overbound conservatively represents the correction-domain error, it will conservatively represent derived errors (i.e., UDREs, GIVEs, and protection levels)
Overbound application: different techniques have different strengths and weaknesses
Need to find the method that yields the best performance
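Derived-error propagation is what makes the correction-domain overbound useful: the user forms its vertical protection level from the broadcast sigmas and its own geometry. A minimal sketch using the standard VPL form; the per-satellite sigmas and projection values are hypothetical, and K_V = 5.33 is the WAAS MOPS vertical multiplier.

```python
import numpy as np

K_V = 5.33  # WAAS MOPS vertical multiplier (two-sided ~1e-7 Gaussian tail)

# Hypothetical per-satellite overbound sigmas (meters), each combining the
# UDRE- and GIVE-derived terms, and the vertical row of the user's
# least-squares projection matrix S for the geometry in view
sigma = np.array([1.2, 0.9, 1.5, 1.1, 2.0, 0.8])
s_vert = np.array([0.6, -0.4, 0.9, -0.2, 0.5, -0.3])

vpl = K_V * np.sqrt(np.sum(s_vert**2 * sigma**2))
print(f"VPL = {vpl:.1f} m")
```

Because the VPL is a fixed function of the broadcast sigmas, any conservatism established in the correction domain carries through to the position-domain protection level.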

21 Mixing and Clipping
System Is Not Stationary
Certain events lead to larger errors
If not recognized, the sampled distribution mixes errors with different properties
Observed tails worse than Gaussian
Reasonability Checks and Redundancy Remove Large Errors
Both faulted and large fault-free errors
Observed tails better than Gaussian
Mixing Dominates Over Clipping
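A quick simulation of the mixing effect (mixing fraction and sigmas hypothetical): a sample that is N(0,1) 99% of the time and N(0, 3^2) 1% of the time has tails far heavier than a single Gaussian with the same overall sigma.

```python
import numpy as np
from math import erfc, sqrt

rng = np.random.default_rng(4)
n = 1_000_000
big = rng.random(n) < 0.01  # 1% of epochs draw from the wide component
err = np.where(big, rng.normal(0, 3.0, n), rng.normal(0, 1.0, n))

sigma_total = err.std()  # close to sqrt(0.99*1 + 0.01*9)
for k in (3, 4, 5):
    t = k * sigma_total
    emp = np.mean(np.abs(err) > t)
    gauss = erfc(k / sqrt(2))
    print(f"P(|err| > {k} sigma): empirical {emp:.1e} vs Gaussian {gauss:.1e}")
```

The discrepancy grows with k: at five sigma the mixture's tail is orders of magnitude heavier, which is why unrecognized mixing makes observed tails look worse than Gaussian.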

22 Mixed Gaussian Distribution

23 Monitor Observability
Noise on the measurements limits the ability to detect errors
Only errors above a certain limit can be detected with a certain probability
Smaller errors are assumed present
GIVEs and UDREs must increase as system noise increases
Larger errors may escape detection
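A standard way to capture this is a minimum detectable error: with the detection threshold set by the fault-free false-alert allocation, only errors larger than roughly (K_fa + K_md) times the test-statistic sigma are detected with the required probability. A sketch with hypothetical allocations:

```python
from scipy.stats import norm

p_fa = 1e-6       # false-alert allocation per test (hypothetical)
p_md = 1e-3       # missed-detection allocation per test (hypothetical)
sigma_test = 0.5  # standard deviation of the monitor test statistic, meters

k_fa = norm.isf(p_fa / 2.0)       # two-sided false-alert multiplier
k_md = norm.isf(p_md)             # one-sided missed-detection multiplier
threshold = k_fa * sigma_test
mde = (k_fa + k_md) * sigma_test  # smallest error detected with prob. 1 - p_md
print(f"threshold = {threshold:.2f} m, minimum detectable error = {mde:.2f} m")
```

Errors below the MDE can escape detection, so the broadcast GIVE/UDRE must be sized to bound them; as the monitor noise grows, the MDE and hence the broadcast confidences grow with it.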

24 Errors Below the Detection Threshold

25 Failure Rates and Monitors

26 Range Domain Vs. Position Domain
Position Domain Is Intuitive
Common errors fall in the clock
Only tests specific geometry
Range/Correction Domain Generally More Conservative
Protects each individual correction
Applicable to all geometries
Small correlated errors may cause HMI
Approaches Complement Each Other

27 Example User Geometry
WAAS user near Florida
8 satellites in view
Weak vertical dependence on PRN 6 for all-in-view
Strong dependence if PRN 8 is missing
A 25 m bias on PRN 6 creates a 4 m error for all-in-view, and a 50 m error without PRN 8
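This sensitivity can be reproduced with the least-squares projection S = (G^T G)^-1 G^T, which maps a range-domain bias vector b into a position/clock-domain error S b. The sketch below uses a synthetic eight-satellite geometry, not the Florida case on the slide, to show how the vertical weighting of a single biased range changes when one satellite is removed:

```python
import numpy as np

def s_matrix(az_deg, el_deg):
    """Least-squares projection from range residuals to (E, N, U, clock)."""
    az, el = np.radians(az_deg), np.radians(el_deg)
    G = np.column_stack([np.cos(el) * np.sin(az),  # east
                         np.cos(el) * np.cos(az),  # north
                         np.sin(el),               # up
                         np.ones_like(az)])        # receiver clock
    return np.linalg.inv(G.T @ G) @ G.T

# Synthetic 8-satellite geometry: azimuths and elevations in degrees
az = np.array([10.0, 60.0, 110.0, 160.0, 200.0, 250.0, 300.0, 340.0])
el = np.array([65.0, 30.0, 15.0, 45.0, 70.0, 25.0, 35.0, 10.0])

bias = np.zeros(8)
bias[2] = 25.0  # 25 m bias on one satellite's range

all_in_view = s_matrix(az, el) @ bias
keep = np.arange(8) != 5  # drop a different satellite from the solution
subset = s_matrix(az[keep], el[keep]) @ bias[keep]
print(f"vertical error, all-in-view: {all_in_view[2]:+.2f} m")
print(f"vertical error, 7-SV subset: {subset[2]:+.2f} m")
```

Because S depends on the satellite set, a bias that is nearly absorbed by the clock and other states in one geometry can project strongly into the vertical in another, which is why both range-domain and position-domain checks are needed.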

28 Vertical Performance
Each User Has a Single Biased SV
Change in Geometry Exposes Error

29 Conclusions
Probability of HMI Applies to Worst Predictable Condition
Data, Theory, and Simulation Must Support Each Other
Threat Models Essential for Validating Implementation
Errors Below Detection Threshold Must Be Treated Conservatively
Position and Range Domain Needed to Protect All Geometries

30 Resources (1 of 3)
Cabler, H. and DeCleene, B., “LPV: New, Improved WAAS Instrument Approach,” in Proceedings of the ION GPS meeting, Portland, OR, 2002.
Walter, T., “WAAS MOPS: Practical Examples,” in Proceedings of the National Technical Meeting of the Institute of Navigation, San Diego, CA, January 1999.
Walter, T., Enge, P., and DeCleene, B., “Integrity Lessons from the WIPP,” in Proceedings of the National Technical Meeting of the Institute of Navigation, Anaheim, CA, January 2003.
Shallberg, K., Shloss, P., Altshuler, E., and Tahmazyan, L., “WAAS Measurement Processing, Reducing the Effects of Multipath,” in Proceedings of the ION GPS meeting, Salt Lake City, UT, 2001.
Hansen, A., Blanch, J., Walter, T., and Enge, P., “Ionospheric Correlation Analysis for WAAS: Quiet and Stormy,” in Proceedings of the ION GPS meeting, Salt Lake City, UT, 2000.
Rajagopal, S., Walter, T., Datta-Barua, S., Blanch, J., and Sakai, T., “Correlation Structure of the Equatorial Ionosphere,” in Proceedings of the National Technical Meeting of the Institute of Navigation, San Diego, CA, January 2004.
Walter, T., Hansen, A., Blanch, J., Enge, P., Mannucci, A. J., Pi, X., Sparks, L., Iijima, B., El-Arini, B., Lejeune, R., Hagen, M., Altshuler, E., Fries, R., and Chu, A., “Robust Detection of Ionospheric Irregularities,” Navigation: Journal of The Institute of Navigation, Vol. 48, No. 2, Summer 2001.
Blanch, J., Using Kriging to Bound Satellite Ranging Errors due to the Ionosphere, Stanford University thesis, December 2003, available through
Sparks, L., Pi, X., Mannucci, A. J., Walter, T., Blanch, J., Hansen, A., Enge, P., Altshuler, E., and Fries, R., “The WAAS Ionospheric Threat Model,” in Proceedings of the Beacon Satellite Symposium, Boston, MA, June 2001.

31 Resources (2 of 3)
Datta-Barua, S., “Ionospheric Threats to Space-Based Augmentation System Development,” in Proceedings of the ION GPS meeting, Long Beach, CA, 2004.
Walter, T., Rajagopal, S., Datta-Barua, S., and Blanch, J., “Protecting Against Unsampled Ionospheric Threats,” in Proceedings of the Beacon Satellite Symposium, Trieste, Italy, October 2004.
Wu, T. and Peck, S., “An Analysis of Satellite Integrity Monitoring Improvement for WAAS,” in Proceedings of the ION GPS meeting, Portland, OR, 2002.
Walter, T., Hansen, A., and Enge, P., “Message Type 28,” in Proceedings of the National Technical Meeting of the Institute of Navigation, Long Beach, CA, January 2001.
FAA/William J. Hughes Technical Center, “Wide-Area Augmentation System Performance Analysis Reports,” available at
Jan, S. S., Chan, W., Walter, T., and Enge, P., “Matlab Simulation Toolset for SBAS Availability Analysis,” in Proceedings of the ION GPS meeting, Salt Lake City, UT, 2001.
Walter, T. and Enge, P., “Modernizing WAAS,” in Proceedings of the ION GPS meeting, Long Beach, CA, 2004.
DeCleene, B., “Defining Pseudorange Integrity - Overbounding,” in Proceedings of the ION GPS meeting, Salt Lake City, UT, 2000.
Schempp, T. R. and Rubin, A. L., “An Application of Gaussian Bounding for the WAAS Fault-Free Error Analysis,” in Proceedings of the ION GPS meeting, Portland, OR, 2002.
Rife, J., Pullen, S., Pervan, B., and Enge, P., “Paired Overbounding and Application to GPS Augmentation,” in Proceedings of the IEEE Position, Location and Navigation Symposium, Monterey, CA, April 2004.

32 Resources (3 of 3)
Rife, J., Pullen, S., Pervan, B., and Enge, P., “Core Overbounding and Its Implications for LAAS Integrity,” in Proceedings of ION GNSS 2004, Long Beach, CA, September 2004.
Rife, J., Walter, T., and Blanch, J., “Overbounding SBAS and GBAS Error Distributions with Excess-Mass Functions,” in Proceedings of the 2004 International Symposium on GPS/GNSS, Sydney, Australia, 6-8 December 2004.
Walter, T., Blanch, J., and Rife, J., “Treatment of Biased Error Distributions in SBAS,” in Proceedings of the 2004 International Symposium on GPS/GNSS, Sydney, Australia, 6-8 December 2004.
Most Stanford publications available at:
Todd Walter:
Jason Rife:
Juan Blanch:

