Measurement and Prediction of Dynamic Density
Parimal Kopardekar, Ph.D., Innovations Division, ACB-100, FAA Office, NASA Ames Research Center
Sherri Magyarits, Simulation and Analysis Branch, ACB-330, William J. Hughes Technical Center, FAA
June 25, ATM2003
Project Overview
Definition: Dynamic Density (DD) is the effect of all factors, or variables, that contribute to sector-level air traffic control complexity or difficulty at any given time (illustrative sketch below)
Goal: Develop and validate a mathematical model that measures and predicts air traffic control complexity at the sector level more accurately than the measures available today
Applications:
- Provide more accurate information to traffic flow managers
- Support more efficient staff planning
- Support advanced concepts:
  - Integration with decision-support tools
  - Resectorization
  - Shared separation between controllers and pilots
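Since the slides do not spell out how a DD score is computed from its variables, the following is a minimal, purely illustrative sketch of the weighted-sum idea; the variable names and weights are hypothetical placeholders, not the study's actual metric or weights.

```python
# Hypothetical weighted-sum Dynamic Density score for one sector snapshot.
# Variable names and weights are illustrative placeholders only.

DD_WEIGHTS = {
    "aircraft_count": 0.40,
    "climbing_or_descending": 0.25,
    "heading_changes": 0.15,
    "close_proximity_pairs": 0.20,
}

def dynamic_density(sector_state: dict) -> float:
    """Return a weighted sum of sector-level complexity variables."""
    return sum(w * sector_state.get(var, 0.0) for var, w in DD_WEIGHTS.items())

# Example: a single 2-minute snapshot of one sector
snapshot = {"aircraft_count": 12, "climbing_or_descending": 4,
            "heading_changes": 3, "close_proximity_pairs": 2}
print(dynamic_density(snapshot))  # 6.65
```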
Research Questions
- Can a DD metric (or set of metrics) accurately capture complexity?
- Is a DD metric reliable/persistent for predicting complexity starting 2 hours out? When is the DD metric most reliable?
- Does a DD metric represent complexity better than aircraft count, the basis of Monitor Alert?
Research Partners & Metrics
- FAA (ACB-330) & Titan Systems: 1 metric, 7 variables
- Metron: 10 variables
- NASA Ames: 2 metrics; 16 variables (NASA-1: Chatterji) and 8 variables (NASA-2: Laudeman)
- MITRE: provided CRCT (trajectory predictions)
Phased Approach
Phase I: Pilot study at ZDV
Phase II: Multi-center on-site data collection effort (operational data, subjective data)
Phase III: Coding of proposed metrics into a strategic decision-support tool (CRCT); DD model development & testing
Data Overview
- Facilities: ZTL, ZFW, ZOB, ZDV
- 72 thirty-minute traffic samples
- 72 controller & supervisor participants
- 36 sectors (high & low)
- More than 6,400 data points: complexity ratings & corresponding DD variable output
Data Overview (cont'd)
72 samples:
- 60 used for model development
- 12 used for model testing
Two types of DD output:
- Instantaneous DD (accuracy assessment): over the 30 minutes corresponding to the traffic samples, output every 2 minutes (the same intervals at which complexity ratings were provided)
- Predicted DD (reliability assessment): up to 120 minutes prior to the traffic sample times, output every 2 minutes
Instantaneous and Predicted DD
[Timeline figure] Instantaneous DD is output every 2 minutes within the 30-minute traffic sample (2, 4, 6, ... 30 min); predicted DD is output every 2 minutes from 120 minutes before the sample down to the sample itself (120, 118, 116, ... 4, 2, 0 min).
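A small sketch of the two output time grids just described, assuming the 2-minute spacing stated on the slide:

```python
# Instantaneous DD: output every 2 minutes within the 30-minute traffic sample.
instantaneous_minutes = list(range(2, 31, 2))      # [2, 4, ..., 30] -> 15 outputs

# Predicted DD: output every 2 minutes, from 120 minutes before the sample down to 0.
predicted_lookaheads = list(range(120, -1, -2))    # [120, 118, ..., 2, 0] -> 61 outputs

print(len(instantaneous_minutes), len(predicted_lookaheads))  # 15 61
```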
Phase III Analysis: Accuracy Assessment
Objective 1: Determine how accurately the DD metrics represent the subjective complexity ratings (regression sketch below)
- Develop the DD model (weights for the different variables)
- Compare the different DD metrics
- Select a 'best-fit' DD metric
- Test the DD model for accuracy
Objective 2: Compare the accuracy of the DD metric with Monitor Alert accuracy
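The model-development step above amounts to a multiple linear regression of the subjective complexity ratings on candidate DD variables, with the multiple correlation R used to compare metrics. Below is a minimal sketch on synthetic data; the variable names and numbers are hypothetical, not the study's.

```python
# Sketch of DD model development: regress complexity ratings on DD variables
# and report the multiple correlation R. Data below are synthetic.
import numpy as np
from sklearn.linear_model import LinearRegression

def multiple_r(X: np.ndarray, y: np.ndarray) -> float:
    """Fit an ordinary least-squares model and return R = sqrt(R^2)."""
    model = LinearRegression().fit(X, y)
    return float(np.sqrt(model.score(X, y)))

rng = np.random.default_rng(0)
n = 400
aircraft_count = rng.poisson(10, n)
altitude_transitions = rng.poisson(3, n)
ratings = 0.3 * aircraft_count + 0.5 * altitude_transitions + rng.normal(0, 1.5, n)

X_count = aircraft_count.reshape(-1, 1)                          # aircraft-count-only model
X_dd = np.column_stack([aircraft_count, altitude_transitions])   # multi-variable DD model
print(multiple_r(X_count, ratings), multiple_r(X_dd, ratings))   # DD model fits better
```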
Phase III Analysis (cont'd): Reliability Assessment
Objective 1: Determine how stable the predictions are over time for the selected metric (sketch below)
- Examine DD metric prediction performance starting from 2 hours prior to the traffic sample intervals
Objective 2: Compare the DD metric's stability with Monitor Alert stability over time
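One simple way to operationalize the stability check above is to correlate predicted DD with the later complexity ratings separately at each lookahead interval. The sketch below uses synthetic data and hypothetical column names; it is not the study's analysis.

```python
# Sketch: correlation between predicted DD and complexity ratings per lookahead bin.
# All data and column names are synthetic/hypothetical.
import numpy as np
import pandas as pd

rng = np.random.default_rng(1)
rows = []
for lookahead in range(0, 121, 2):                 # 0, 2, ..., 120 minutes out
    noise = 0.5 + lookahead / 60.0                 # assume noisier predictions farther out
    dd = rng.normal(5.0, 2.0, 200)
    rating = 0.8 * dd + rng.normal(0.0, noise, 200)
    rows.append(pd.DataFrame({"lookahead_min": lookahead,
                              "predicted_dd": dd,
                              "complexity_rating": rating}))
samples = pd.concat(rows, ignore_index=True)

r_by_lookahead = samples.groupby("lookahead_min").apply(
    lambda g: g["predicted_dd"].corr(g["complexity_rating"])
)
print(r_by_lookahead.loc[[0, 60, 120]])            # correlation shrinks as lookahead grows
```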
Phase III Results: Complexity Ratings
Complexity Rating Distribution
* One controller at ZOB rated complexity using 0.5 increments on the 1-7 scale
Complexity Rating Distribution (continued)
Phase III Results: Instantaneous DD
Key Findings
- Most individual DD metrics perform better than aircraft count
- Different metrics perform better for different facilities
- A unified DD metric (i.e., selecting variables from each proposed metric) provides the best results in all conditions
All Facilities: R Values from Regression Analysis
Conditions combine rater type (S = supervisor, C = controller) with sector type (H = high, L = low).

Metric          S&C,H&L  S&C,L  S&C,H  C,H&L  S,H&L  C,H    S,H    C,L    S,L
Aircraft Count  0.479    0.445  0.444  0.522  0.423  0.507  0.374
Tech Center     0.572    0.573  0.511  0.553  0.615  0.512  0.538  0.509  0.659
NASA-1          0.541    0.493  0.558  0.516  0.580  0.591  0.468  0.544
Metron          0.483    0.439  0.467  0.498  0.536  0.431
NASA-2          0.315    0.335  0.383  0.283  0.353  0.361  0.414  0.306  0.385
Unified         0.624    0.641  0.594  0.629  0.654  0.622  0.607  0.625  0.714
Significant Variables – Unified DD
DD-based Model vs. Aircraft Count-based Model
Aircraft count model: Complexity/CS rating = b × Ac_count, R = 0.479
DD model: Complexity/CS rating = DD equation, R = 0.624
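Written out generically, and assuming the linear weighted-sum form implied by "weights for different variables" on the accuracy-assessment slide (the actual coefficients were not recovered), the two models being compared are:

$$\text{Complexity/CS rating} \approx b_0 + b_1 \cdot \text{Ac\_count}, \qquad R = 0.479$$
$$\text{Complexity/CS rating} \approx w_0 + \sum_i w_i \cdot \text{DD variable}_i, \qquad R = 0.624$$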
DD Distribution - All Ratings
DD difference from all ratings:

Difference        Frequency  Percent  Valid Percent  Cumulative Percent
-4.00             7          0.6      0.8            0.8
-3.00             17         1.6      1.9            2.7
-2.00             109        10.1     12.5           15.2
-1.00             243        22.5     27.8           43.0
0.00              250        23.1     28.6           71.5
1.00              185        17.1     21.1           92.7
2.00              62         5.7      7.1            99.8
3.00              2          0.2      0.2            100.0
Valid total       875        81.0     100.0
Missing (system)  205        19.0
Total             1080       100.0
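A frequency table of this shape can be produced by rounding the difference between the model's complexity estimate and the subjective rating and tabulating the result. The minimal sketch below uses hypothetical column names and assumes the difference is taken as estimate minus rating (the slide does not state the direction):

```python
# Sketch: build a frequency table of rounded (DD estimate - subjective rating)
# differences, in the style of the table above. Column names are hypothetical.
import pandas as pd

def dd_difference_table(dd_estimate: pd.Series, rating: pd.Series) -> pd.DataFrame:
    diff = (dd_estimate - rating).round()
    counts = diff.value_counts().sort_index()
    table = counts.to_frame("Frequency")
    table["Valid Percent"] = (100 * counts / counts.sum()).round(1)
    table["Cumulative Percent"] = table["Valid Percent"].cumsum().round(1)
    return table

# Example with a few made-up rating pairs
est = pd.Series([4.2, 5.1, 3.0, 6.4, 2.2])
obs = pd.Series([5.0, 5.0, 4.0, 5.0, 4.0])
print(dd_difference_table(est, obs))
```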
Instantaneous DD - Performance
Where DD performed best
(1 = Controller, 2 = Supervisor)
Factor Analysis: Principal Component Analysis
- Performed because of potential interdependencies among the different variables
- Could narrow the 23 variables down to 12 components (illustrative PCA sketch below)
- Caution is recommended, since interpretation of these components is subjective
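A minimal sketch of the principal component analysis referenced above, using synthetic data in place of the 23 DD variables; the number of components retained here is illustrative, not the study's result:

```python
# Sketch: PCA on standardized DD variables, keeping components that explain
# most of the variance. The 23-variable data set below is synthetic.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(2)
latent = rng.normal(size=(1000, 12))                    # 12 underlying factors
loadings = rng.normal(size=(12, 23))
X = latent @ loadings + 0.3 * rng.normal(size=(1000, 23))

Z = StandardScaler().fit_transform(X)
pca = PCA(n_components=0.95)                            # keep 95% of the variance
components = pca.fit_transform(Z)
print(components.shape[1], "components retained out of", X.shape[1], "variables")
```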
Phase III Results: Predicted DD
Key Findings
- A DD model with look-ahead time (i.e., prediction interval) built into the equation performs better than a model based on instantaneous DD only
- DD appears to be more stable over time than the predicted number of aircraft
- DD appears to be more accurate over time than the predicted number of aircraft
- The value of the simpler derived aircraft-count model relative to DD needs further consideration
DD-based Model with Lookahead Time vs. Aircraft Count-based Model (Instantaneous)
Aircraft count model: Complexity/CS rating = b × Pred_count, R = …
DD model with lookahead: Complexity/CS rating = DD equation + lookahead, R = 0.633
(sketch below)
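Extending the earlier regression sketch, one plausible reading of "building lookahead time into the equation" is to include the prediction interval as an extra regression term alongside the DD variables. The names and data below are hypothetical; this is not the study's actual equation.

```python
# Sketch: regression of complexity ratings on predicted DD variables plus the
# lookahead time itself. Data and names are synthetic/hypothetical.
import numpy as np
from sklearn.linear_model import LinearRegression

def fit_dd_with_lookahead(X_dd: np.ndarray, lookahead_min: np.ndarray, y: np.ndarray):
    """Fit complexity ~ DD variables + lookahead; return the model and multiple R."""
    X = np.column_stack([X_dd, lookahead_min])
    model = LinearRegression().fit(X, y)
    return model, float(np.sqrt(model.score(X, y)))

rng = np.random.default_rng(3)
n = 500
X_dd = rng.normal(size=(n, 3))                   # stand-in predicted DD variables
lookahead = rng.choice(np.arange(0, 121, 2), n)  # minutes ahead of the sample
y = X_dd @ np.array([0.5, 0.3, 0.2]) - 0.002 * lookahead + rng.normal(0, 0.5, n)

model, r = fit_dd_with_lookahead(X_dd, lookahead, y)
print(round(r, 3))
```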
Stability of Predictions Over Time
Predicted DD - Descriptives
Errors as a Function of Time
Conclusions
- DD has promise, most notably as a unified metric with contributing variables from the FAA Tech Center, NASA, & Metron
- A model based on aircraft count also has promise (for the predictive part)
- Model development & testing has not been exhausted; the DD metrics could be further tested & improved using:
  - Different non-linear combinations with the existing Phase III data
  - Data from real-time simulations conducted for other projects
  - SAR, CTAS, &/or other possible raw data sources
Questions?
Parimal Kopardekar, Innovations Division, ACB-100, Phone:
Sherri Magyarits, Simulation and Analysis Branch, ACB-330, Phone: