Chapter 19: Case Study on requirements, design, and evaluation: NATS
Introduction
- FAST: Final Approach Spacing Tool
- An air traffic control system developed for UK National Air Traffic Services (NATS)
- Controller's tasks:
  - analyze data from the radar screen
  - communicate with the pilot
FAST
- Designed to automate the calculation of approach timings
- User-centered iterative design: iterating between design and evaluation
- Key concern: spacing between aircraft
- FAST was developed to help calculate this spacing and advise controllers when aircraft should turn to final
Example radar display
- Not the real radar display, just an illustrative image found on the web
Final approach control
- Radar display shows aircraft that should be flying in their flight corridors, according to their flight plans
- Flights get passed between controllers
- Physical strips are used to pass along flight info; these are color-coded to indicate aircraft size
- Size determines spacing: a small aircraft can't be allowed to fly into the wake vortex of a larger one
FAST UI
- Improved radar display:
  - runway lines added
  - landing sequence boxes
  - stars and diamonds representing aircraft
  - small white number indicating seconds until the imminent turn to final (the FAST output)
- Touch panel: input to FAST (e.g., wind, visibility, minimum spacing required)
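The case study does not spell out how FAST derives that white "seconds to turn" number, but a minimal sketch, assuming it comes from distance remaining to the turn point and current groundspeed (a hypothetical simplification of the real algorithm), could look like this:

```python
# Hypothetical sketch only: FAST's actual advisory algorithm is not
# described in the case study. Assumes constant groundspeed and a known
# distance to the turn-to-final point.

def seconds_to_turn(distance_nm: float, groundspeed_kts: float) -> int:
    """Return whole seconds until the turn point at constant groundspeed."""
    if groundspeed_kts <= 0:
        raise ValueError("groundspeed must be positive")
    hours = distance_nm / groundspeed_kts  # time = distance / speed
    return round(hours * 3600)             # convert hours to seconds

# 6 NM to go at 180 kts groundspeed -> 120 seconds
print(seconds_to_turn(6.0, 180.0))
```

The real tool would have to account for wind, turn radius, and sequencing constraints, which is part of why a dedicated algorithm developer was on the team.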
Points to ponder
- If users enter the minimum spacing required into FAST, what does FAST calculate?
- How else could this be accomplished?
  - How about a database of aircraft? There are only so many makes and models
  - One could hardcode the minimum separation distances and simply look them up
  - Alternatively, calculate based on aircraft dimensions and weight…
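The "hardcode and look up" idea can be sketched as a small table keyed by leader/follower wake-turbulence category. The categories follow the ICAO scheme, but treat the distances here as illustrative values for the sketch, not operational guidance:

```python
# Illustrative lookup-table sketch of hardcoded minimum separation.
# Values are illustrative, not operational guidance.

SEPARATION_NM = {
    # (leader_category, follower_category): minimum separation, nautical miles
    ("heavy", "heavy"):  4.0,
    ("heavy", "medium"): 5.0,
    ("heavy", "light"):  6.0,
    ("medium", "light"): 5.0,
}

DEFAULT_SEPARATION_NM = 3.0  # assumed baseline radar minimum when no wake rule applies

def min_separation(leader: str, follower: str) -> float:
    """Look up the required spacing for a follower behind a leader."""
    return SEPARATION_NM.get((leader, follower), DEFAULT_SEPARATION_NM)

print(min_separation("heavy", "light"))   # largest gap: light behind heavy
print(min_separation("light", "heavy"))   # no wake rule -> baseline minimum
```

Note the asymmetry the table captures: a light aircraft behind a heavy needs the most spacing, while a heavy behind a light does not, which is exactly why a simple per-aircraft value would not suffice.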
FAST Development
- Project team: 5 core members, interdisciplinary (CS + human factors + user):
  - usability practitioner
  - software developer
  - algorithm developer (how does this differ from a software developer?)
  - requirements engineer (what's that?)
  - manager
Requirements Gathering
- Users: 23-50 years old; well paid and motivated
- Tasks (briefly):
  - analyze data from the radar screen
  - communicate with the pilot
- Environment: well lit, noisy, can be stressful
- Requirements-gathering methods:
  - observations
  - questionnaires
UI Design
- Prototypes used:
  - low-fidelity (paper) initially
  - mid-fidelity (PowerPoint with animation)
  - high-fidelity (essentially a C++ simulator)
- Note the investment here:
  - writing a simulator to test the GUI may seem prohibitively expensive
  - however, some algorithms were reused (e.g., presumably the separation calculations)
Planning the Evaluation
- Start with a plan (e.g., an experimental design "recipe": subjects, independent variables, dependent variables, procedures, etc.)
- Select users (a representative sample)
- Simulate the environment (or test in vivo?)
- Plan the scenarios; be ready for users
- Collect data
Evaluate
- Collect data on SEE (Satisfaction, Efficiency, Effectiveness), if possible
- FAST: mainly concerned with satisfaction, i.e., subjective data and self-assessment
- Analyze the data (here, mainly questionnaires)
- Apply the results to the redesign and iterate
- Report results (to customers, or write a SIGCHI paper :)
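Analyzing satisfaction questionnaires of this kind typically means summarizing Likert-scale responses per item. A minimal sketch, with hypothetical item names and data (not from the actual FAST study):

```python
# Minimal questionnaire-analysis sketch: average Likert responses
# (1 = strongly disagree ... 5 = strongly agree) per questionnaire item.
# Items and scores are hypothetical, not from the FAST study.

from statistics import mean

responses = {
    "display is easy to read":    [4, 5, 4, 3, 5],
    "advisories are trustworthy": [3, 4, 3, 4, 4],
    "tool reduces my workload":   [5, 4, 4, 5, 3],
}

for item, scores in responses.items():
    print(f"{item}: mean = {mean(scores):.2f} (n = {len(scores)})")
```

Per-item means like these feed directly into the redesign step: low-scoring items point at which parts of the UI to iterate on next.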