Slide 1: EUROCONTROL Navigation Domain
The Data Collection Network: EGNOS revealed. GNSS 2004, Rotterdam, The Netherlands. Santiago Soley, PILDO Labs S.L.
[Speaker note: link to the general SAM presentation — we have seen the general method; now let us see why and how it can be applied to GBAS.]
Slide 2: GNSS-1 Operational Validation — The Data Collection Network
Summary:
- A methodology to assess EGNOS performance
- First glance
- Anomaly investigation
- Towards the final results
- Conclusions and future work
Slide 3: EUROCONTROL — 31 Member States
EUROCONTROL is the European Organisation for the Safety of Air Navigation, with 31 Member States.
Slide 4: GNSS-1 Operational Validation
EUROCONTROL will coordinate the GNSS-1 Operational Validation to support the Member States in achieving approval for the provision of EGNOS services throughout ECAC (European Civil Aviation Conference).
Two distinct areas of work:
- Technical validation of the performance: SARPS requirements (RNP values), using a MOPS DO-229C compliant receiver
- Definition of operational rules and procedures for aircraft to use the system for a particular application (APV implementation)
Slide 5: The Data Collection Network — Objectives
- Collect ESTB/EGNOS data from different sites in Europe and evaluate them
- Group of experts to support:
  - the development of the tools needed in the future EGNOS Operational Validation (PEGASUS)
  - the definition of data collection and processing procedures
- Baseline for the future static campaign in the GOV Plan, extended with the participation of the Air Navigation Service Providers
Slide 6: The Network
[Map of the Data Collection Network: ESTB reference stations (RIMS), EURIDIS reference stations, ESTB processing facilities (CPF, MCC), NLES, and the GEO IOR PRN131. Sites: Tromsö, Höfn, Hönefoss, Rotterdam, Scilly, Toulouse, Lisboa, Barcelona, Palma, Malaga, Canaries, Fucino, Matera, Ankara, Kourou, Hartebeesthoek; GOV ANSP sites marked.]
Slide 7: The Methodology
Three axes, fed by data-sample filtering (plausibility rules/flags):
- Actual performance: processed data samples checked in the position domain against SARPS thresholds (accuracy, availability, continuity, integrity), via event counting and visualisation
- System behaviour: anomaly investigation and system analysis (file watch, message types, ionosphere, ranges, clocks, multipath, etc.)
- Confidence: extrapolation and simulation of range-domain integrity, RNP availability and continuity
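The event counting on the "actual performance" axis reduces, per epoch, to comparing position error against protection level and alert limit. A minimal sketch of that classification, following the usual Stanford-diagram logic; names and sample values are illustrative, not taken from the PEGASUS tools:

```python
def classify_epoch(pe: float, pl: float, al: float) -> str:
    """Classify one epoch from position error (pe), protection level (pl)
    and alert limit (al), all in metres."""
    if pe > pl:
        # The broadcast protection level fails to bound the actual error
        return "HMI" if pe > al else "MI"
    if pl > al:
        return "unavailable"   # protection level exceeds the alert limit
    return "nominal"

# Synthetic epochs: (pe, pl, al)
epochs = [(1.2, 8.0, 20.0), (9.5, 8.0, 20.0), (3.0, 25.0, 20.0)]
print([classify_epoch(*e) for e in epochs])
# ['nominal', 'MI', 'unavailable']
```

Counting the "MI"/"HMI" epochs over a day gives the integrity events reported per site, and the fraction of "nominal" epochs feeds the availability figure.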
Slide 8: A First Glance
Accuracy, availability, continuity and integrity are checked against the RNP parameters with the proposed algorithms: each single site/day test passes or fails. Example: EGNOS post-SIS1, EEC, April 23rd, 2004.
Slide 9: Summary of Results — Vertical Position Error 95%
Slide 10: Summary of Results — APV-II Availability (AL = 20 m)
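The two summary figures above can be rebuilt from per-epoch logs with simple statistics. A hedged sketch with synthetic data; the 95th-percentile convention shown is one of several in use, and the real campaign numbers come from full site/day datasets:

```python
def percentile95(samples):
    """95th-percentile error, as used for the 'VPE 95%' summary."""
    s = sorted(samples)
    # index of the smallest value bounding 95 % of the sample
    k = max(0, int(0.95 * len(s) + 0.5) - 1)
    return s[k]

def apv2_availability(vpl_samples, alert_limit=20.0):
    """Fraction of epochs where VPL stays below the APV-II alert limit."""
    ok = sum(1 for v in vpl_samples if v < alert_limit)
    return ok / len(vpl_samples)

# Synthetic vertical position errors and protection levels (metres)
vpe = [0.5, 1.1, 0.8, 2.4, 0.9, 1.6, 3.2, 0.7, 1.0, 1.3]
vpl = [12.0, 15.0, 22.0, 14.0, 19.9, 25.0, 13.0, 18.0, 16.0, 17.0]
print(percentile95(vpe), apv2_availability(vpl))  # 3.2 0.8
```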
Slide 11: Anomaly Investigation
What to do if anomalies appear that make some of the tests fail? A first check on the obtained performance is not enough to declare the system non-compliant with the requirements: a detailed analysis of what caused the test failure is needed. Different methods and techniques are used by the Network.
Slide 12: Example 1 — Jump on the VPE
A jump on the position error is experienced periodically. Delft, December 26th, 2002.
Slide 13: Example 1 — Jump on the VPE
The multipath (Mc) combination identifies multipath as the cause. Delft, December 26th, 2002.
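The code-minus-carrier multipath combination used here isolates code multipath and noise from geometry, clocks and (to first order) the ionosphere. A sketch with the standard dual-frequency form; the observables and any bias handling are illustrative:

```python
# GPS L1/L2 carrier frequencies (Hz)
F1, F2 = 1575.42e6, 1227.60e6
ALPHA = (F1 / F2) ** 2   # ionosphere scaling factor, ~1.6469

def mp1(c1, l1, l2):
    """L1 code multipath + noise (metres), up to a constant bias from the
    carrier-phase ambiguities. c1 is the C1 pseudorange; l1, l2 are the
    carrier phases expressed in metres."""
    return c1 - l1 - (2.0 / (ALPHA - 1.0)) * (l1 - l2)

# A step or oscillation in mp1 while the carrier-only combinations stay
# smooth points at code multipath rather than an ionospheric event.
```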
Slide 14: Example 2 — Integrity Failures
Integrity failures: HPE > HPL. Barcelona, February 6th, 2003.
Slide 15: Example 2 — Integrity Failures
MT02/MT03 anomalies: pseudorange correction oscillations broadcast in MT02 and MT03, with UDREI < 12 and IODF < 3. Barcelona, February 6th, 2003.
Slide 16: Example 3 — Integrity Failures
MT26 anomalies: 60 MI in about 500 epochs, with ionospheric corrections affected for all visible satellites (PRN17 in red). Barcelona, September 12th, 2002.
Slide 17: Example 4 — Integrity Failures
UPC1, vertical component: 8 consecutive MI; prefit residuals. Barcelona, May 22nd, 2003.
Slide 18: Example 4 — Integrity Failures: Satellite Clock Jump
A sudden jump in the C1 code pseudorange for PRN 29, with no alarm conditions broadcast (IODF < 3 and UDREI < 15) in the fast corrections. The ESTB declares the satellite NOT MONITORED (UDREI = 14) 9 seconds later. Prefit residuals. Barcelona, May 22nd, 2003.
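The anomaly above combines two observations: a step in the C1 pseudorange and the delay before UDREI = 14 appears in the fast corrections. A sketch of how such an event could be flagged in post-processing; function names, thresholds and the synthetic data are illustrative:

```python
NOT_MONITORED = 14   # UDREI value meaning "satellite not monitored"

def detect_jump(ranges, threshold=1000.0):
    """First epoch index where C1 steps by more than `threshold` metres."""
    for i in range(1, len(ranges)):
        if abs(ranges[i] - ranges[i - 1]) > threshold:
            return i
    return None

def reaction_time(udrei, jump_epoch, dt=1.0):
    """Seconds from the jump until UDREI >= 14 is broadcast, or None."""
    for i in range(jump_epoch, len(udrei)):
        if udrei[i] >= NOT_MONITORED:
            return (i - jump_epoch) * dt
    return None

# Synthetic C1 series with a ~50 km step, and the matching UDREI stream
c1 = [2.1e7, 2.1e7 + 5, 2.1e7 + 9, 2.1e7 + 5e4, 2.1e7 + 5e4]
udrei = [5, 5, 5, 5, 14]
j = detect_jump(c1)
print(j, reaction_time(udrei, j))  # 3 1.0
```

In the real event the gap between the jump and the NOT MONITORED flag was 9 s, during which the faulty range stayed usable in the position solution.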
Slide 19: Example 5 — Integrity Failures
VPE/VPL: a large number of MI between 19:17 and 20:40. Prefit residuals for PRN11, PRN16 and PRN02. Barcelona, November 20th, 2003.
Slide 20: Example 5 — Integrity Failures
[Plots for PRN16, 19:17–20:40, all as STEC in metres of L1: C1−ρ (shifted), L1−L2 (shifted), P2−P1 (shifted), and STEC ± UIRE (ESTB).] Barcelona, November 20th, 2003.
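The geometry-free combinations on this slide express the slant ionospheric delay in metres of L1, so the measured STEC can be compared directly against the broadcast STEC ± UIRE bound. A sketch of the conversion from the P2−P1 code pair; values below are synthetic:

```python
# GPS L1/L2 carrier frequencies (Hz)
F1, F2 = 1575.42e6, 1227.60e6
GAMMA = (F1 / F2) ** 2   # ~1.6469

def stec_l1_metres(p2, p1):
    """Slant ionospheric delay on L1 (metres) from the P2, P1 codes:
    P2 - P1 = (GAMMA - 1) * I_L1, ignoring inter-frequency biases."""
    return (p2 - p1) / (GAMMA - 1.0)

# During the 20 November 2003 storm this quantity varies much faster
# than the broadcast MT26 grid can follow, which produces the MIs.
```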
Slide 21: Example 5 — Integrity Failures: Ionospheric Storm
[Kp index from October 17th to December 1st, 2003, classified from low through medium, high and very high to extreme activity: the October 29th–31st and November 20th–21st (18:00–22:00) storms stand out.]
Slide 22: Towards the Final Results
To validate the SARPS requirements in a reasonable period of time, the data collected from the network alone would not be enough: simulations or other methods are needed to reach the required level of confidence from the measurements. It is also impossible to measure at all locations under all conditions, so the local results from the network sites are extrapolated into a global assessment.
Slide 23: The Global Monitoring System
[Architecture: a receiver and GPS networks deliver 24 h GPS and GEO binary data and 24 h RINEX files over the Internet; automatic data collecting (Slog) and automatic data gathering (BRUS) feed automatic daily PEGASUS processing on Linux; results are presented automatically on a web server.]
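The automation chain in the diagram amounts to a daily batch: for each site, pick up the 24 h RINEX file, queue a processing run, and publish the result to the web area. An illustrative sketch only; the paths, file naming and the `pegasus` command name are placeholders, not the real deployment:

```python
from pathlib import Path

def build_daily_jobs(day: str, sites: list, root: Path) -> list:
    """One batch command per site; a scheduler (e.g. cron) would run these
    and a web server would serve the report files."""
    jobs = []
    for site in sites:
        rinex = root / site / f"{site}{day}.rnx"      # 24 h observation file
        report = root / "web" / f"{site}_{day}.html"  # published results page
        jobs.append(["pegasus", str(rinex), "-o", str(report)])
    return jobs

jobs = build_daily_jobs("324", ["BRUS", "DELF"], Path("/data"))
print(len(jobs))  # 2
```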
Slide 24: The Global Monitoring System
The network results are extrapolated and cross-checked with simulations for the global assessment: HPE 95% and integrity events; HPL 99% and satellites used.
Slide 25: Conclusions
Summary of the activities of the Data Collection Network: EUROCONTROL gained knowledge on how the data need to be collected, processed and analysed for the EGNOS Operational Validation. The methodology has three axes:
- Actual performance: first-glance report
- System behaviour: anomaly investigation
- Confidence, extrapolation: Global Monitoring System
Dynamic trials are the next step.
Slide 26: Future Work
- Improving the network layout and the automation of procedures: continuous data logging, automated results, data sharing via FTP
- Potential data exploitation and validation of the Global Monitoring System: harmonization of its results with those of the network sites
- Data campaigns with the first EGNOS SIS: reuse of all the ESTB lessons learned; the encountered anomalies are expected to happen only rarely, and EGNOS SIS-1 confirms that
Slide 27: Questions?