1
Developing Performance Predictive Tools for Supervisors of Teams and Complex Systems
February 22nd, 2007
Yves Boussemart (yves@mit.edu)
Humans & Automation Lab
MIT Aeronautics and Astronautics
http://halab.mit.edu
2
Outline
Lab Overview
Human Supervisory Control
-Single vs. Multiple UVs
-Supervising teams of UV operators
Tools for the Supervisor
-TRACS
-Performance Prediction
-Team Environments
3
MIT Humans & Automation Lab
Created in 2004
Director: Dr. Mary (Missy) Cummings
Visiting professor: Dr. Gilles Coppin
Visiting scientist: Dr. Jill Drury
Postdoctoral Associates: Dr. Stacey Scott, Dr. Jake Crandall, Dr. Mark Ashdown
Grad Students: Yves Boussemart, Sylvain Bruni, Amy Brzezinski, Hudson Graham, Jessica Marquez, Jim McGrew, Carl Nehme, Jordan Wan
4
Research and Current Projects
Research in the Humans and Automation Lab (HAL) focuses on the multifaceted interactions of human and computer decision-making in complex socio-technical systems.
-Time-Sensitive Operations for Distributed Teams
-Human Supervisory Control Issues of Multiple Unmanned Vehicles (Reduced Manning)
-Measurement Technologies for Human-UV Teams
-Collaborative Human-Computer Decision Making
-Integrated Sensor Decision Support
Sponsors: Office of Naval Research, Boeing, Lincoln Labs, AFOSR, Thales
5
HAL Testing Equipment
Single-operator testing: ONR's Multi-modal Watch Station (MMWS)
Team testing: HAL Complex Operation Center
ONR Mobile Testing Lab
6
Outline
Lab Overview
Human Supervisory Control
-Single vs. Multiple UVs
-Supervising teams of UV operators
Tools for the Supervisor
-TRACS
-Performance Prediction
-Team Environments
7
Human Supervisory Control (HSC)
[Diagram: the Human Operator (Supervisor) works through Displays and Controls to a Computer, which drives Actuators and reads Sensors acting on the Task]
Humans on the loop vs. in the loop
Supporting knowledge-based versus skill-based tasks
Network-centric operations & cognitive saturation
8
Human Supervisory Control of Automated Systems
Process Control
Unmanned Vehicle Operations
Satellite Operations
Manned Aviation
[Images: Mars rover; Shadow UAV]
9
Major Research Area: HSC of Unmanned Vehicles
Unmanned Aerial Vehicles (UAVs): Shadow UAV, Predator UAV
Unmanned Ground Vehicles (UGVs, i.e., robots): Packbot UGV, Spotter UGV
Unmanned Undersea Vehicles (UUVs): VideoRay UUV, Odyssey UUV
11
Motivation: Increasing Reliance on UAVs in Military Operations
UAVs are becoming an essential part of modern military operations
Typical UAV missions include:
-Force protection
-Intelligence, surveillance, and reconnaissance (ISR)
-Combat search and rescue
-Strike coordination and reconnaissance (SCAR)
[Images: Predator UAV; Predator Ground Control Station]
12
Inverting the Operator/Vehicle Ratio
Current UAV operations: 1 UAV : 2-5 operators
Semi-autonomous UAV operations: 2-5 UAVs : 1 operator
Future: UAV teams
13
Current Supervisory-Level Decision Support for Teams
Developed large-screen supervisor displays that provide current and expected mission and task progress information for team assets and operator activity
Displays integrate related information and provide emergent features for time-critical data
14
Supervisory Information?
Individual and team performance
-Stress & time pressure
-Excessive workload
-Rapidly evolving situation
Actions:
-Adaptive automation
-Operator replacement / shifts
15
Towards Performance Prediction Tools
A 4-step process:
1. Tracking of individual actions
2. Pattern recognition on strategies and performance prediction
3. Aggregation of individual data and collaboration factors
4. Team-level performance predictions
Operator: Is the operator using "good" strategies?
Supervisor: Is the team doing well?
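The four-step process above can be sketched as a minimal pipeline. Everything here is a hypothetical placeholder: the function names, the log format, and the scoring rule (fraction of evaluation-type actions) are invented for illustration, not HAL's actual metrics.

```python
# Hypothetical skeleton of the four-step performance prediction pipeline.

def track_actions(operator_log):
    """Step 1: reduce a raw interaction log to a sequence of action labels."""
    return [event["action"] for event in operator_log]

def predict_operator_performance(actions):
    """Step 2: score an individual's strategy. Placeholder heuristic:
    fraction of evaluation-type actions, on the idea that never
    evaluating candidate solutions is a 'bad' strategy."""
    if not actions:
        return 0.0
    evaluations = sum(a in ("evaluate", "backtrack") for a in actions)
    return evaluations / len(actions)

def predict_team_performance(operator_logs, collaboration_factor=1.0):
    """Steps 3-4: aggregate individual scores, weighted by an (invented)
    collaboration factor, into a team-level prediction."""
    scores = [predict_operator_performance(track_actions(log))
              for log in operator_logs]
    return collaboration_factor * sum(scores) / len(scores)

logs = [
    [{"action": "browse"}, {"action": "select"}, {"action": "evaluate"}],
    [{"action": "automatch"}, {"action": "evaluate"}, {"action": "backtrack"}],
]
print(round(predict_team_performance(logs), 3))
```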
16
Outline
Lab Overview
Human Supervisory Control
-Single vs. Multiple UVs
-Supervising teams of UV operators
Tools for the Supervisor
-TRACS
-Performance Prediction
-Team Environments
17
Tracking Resource Allocation Cognitive Strategies (TRACS)
2-dimensional space:
-Level of Information Detail (LOID)
-Mode (action steps)
4 quadrants:
-LOID: higher vs. lower automation/information
-Mode: evaluation vs. generation of solutions
Technology disclosure for patent and licensing
18
Example of TRACS Application
Decision support for Tomahawk Land Attack Missile (TLAM) strike planning
Resource allocation task:
-Match resources (missiles) with objectives (missions)
-Respect Rules of Engagement
-Satisfy multivariate constraints
Current system: PC-MDS, no decision support
3 interfaces at various levels of collaboration
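A toy sketch of the underlying resource-allocation problem. The missile and mission attributes (range, warhead type) are invented stand-ins for the real multivariate constraints and Rules of Engagement, and the greedy first-fit matcher is only an illustration of the kind of heuristic search an "automatch" feature might run.

```python
# Illustrative resource-allocation sketch: assign missiles to missions
# subject to feasibility constraints. All data fields are invented.

missiles = [
    {"id": "M1", "range_nm": 900,  "warhead": "unitary"},
    {"id": "M2", "range_nm": 1350, "warhead": "unitary"},
    {"id": "M3", "range_nm": 1350, "warhead": "submunition"},
]
missions = [
    {"id": "T1", "distance_nm": 1200, "required_warhead": "unitary"},
    {"id": "T2", "distance_nm": 800,  "required_warhead": "submunition"},
]

def feasible(missile, mission):
    """A match must satisfy every constraint (here: range and warhead)."""
    return (missile["range_nm"] >= mission["distance_nm"]
            and missile["warhead"] == mission["required_warhead"])

def automatch(missiles, missions):
    """Greedy first-fit matching: a stand-in for a customizable
    heuristic search over the constraint space."""
    assignment, used = {}, set()
    for mission in missions:
        for missile in missiles:
            if missile["id"] not in used and feasible(missile, mission):
                assignment[mission["id"]] = missile["id"]
                used.add(missile["id"])
                break
    return assignment

print(automatch(missiles, missions))  # {'T1': 'M2', 'T2': 'M3'}
```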
19
Example of TRACS Representation: TRACS applied to TLAM
LOID:
-Higher automation: group of criteria, individual criterion
-Lower automation: group of matches, individual match, data cluster, data item
Mode:
-Evaluation: Evaluate, Backtrack
-Generation: Browse, Search, Select, Filter, Automatch
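Given the action taxonomy above, one plausible encoding maps each logged action and its LOID level onto a TRACS quadrant. The quadrant labels and the `tracs_quadrant` helper are illustrative, not the patented representation.

```python
# Hypothetical sketch: mapping an (action, LOID) pair onto the
# two TRACS axes. Action names follow the TLAM example above.

GENERATION_ACTIONS = {"browse", "search", "select", "filter", "automatch"}
EVALUATION_ACTIONS = {"evaluate", "backtrack"}

def tracs_quadrant(action: str, loid: str) -> str:
    """Return one of the four TRACS quadrants for an (action, LOID) pair.

    loid is "higher" (more automation/information) or "lower".
    """
    if action in EVALUATION_ACTIONS:
        mode = "evaluation"
    elif action in GENERATION_ACTIONS:
        mode = "generation"
    else:
        raise ValueError(f"unknown action: {action}")
    return f"{loid}-{mode}"

print(tracs_quadrant("automatch", "higher"))  # higher-generation
print(tracs_quadrant("backtrack", "lower"))   # lower-evaluation
```

A supervisor-side tool could then plot the stream of quadrant labels over time, which is essentially what the TRACS pattern views on the next slide show.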
20
Example of TRACS Results
Cognitive strategies emerge as patterns:
-Mostly manual (Interface 1)
-Combination (Interface 2)
-Mostly automation (Interface 3)
21
Outline
Lab Overview
Human Supervisory Control
-Single vs. Multiple UVs
-Supervising teams of UV operators
Tools for the Supervisor
-TRACS
-Performance Prediction
-Team Environments
22
Performance Prediction with TRACS
TRACS actions as observable data of a Hidden Markov Model for individual users
Compute the decision transition matrices from empirical data
Bayesian prediction based on Markov chains
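A minimal sketch of the transition-matrix step, assuming TRACS actions are logged as simple label sequences. A full HMM would add hidden strategy states on top of this; here only the empirical transition estimate and a one-step prediction are shown, with invented example sequences.

```python
# Sketch: estimate a decision transition matrix from observed TRACS
# action sequences, then predict the most likely next action.
from collections import Counter, defaultdict

def transition_matrix(sequences):
    """Estimate P(next action | current action) from empirical sequences."""
    counts = defaultdict(Counter)
    for seq in sequences:
        for current, nxt in zip(seq, seq[1:]):
            counts[current][nxt] += 1
    return {a: {b: n / sum(c.values()) for b, n in c.items()}
            for a, c in counts.items()}

def predict_next(matrix, action):
    """Most probable next action under the estimated Markov chain."""
    return max(matrix[action], key=matrix[action].get)

# Invented example logs; real inputs would come from TRACS tracking.
sequences = [
    ["browse", "select", "evaluate", "automatch"],
    ["browse", "select", "backtrack", "select", "evaluate"],
]
P = transition_matrix(sequences)
print(predict_next(P, "select"))  # evaluate
```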
23
Performance Prediction with TRACS
TRACS + neural networks:
-Detect patterns (cognitive strategies) with a neural network
-Alert the supervisor when behavior degrades
Is bad performance robustly predictable in advance?
[Example patterns: manual-to-automatch, manual browsing, automatch loop]
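As a stand-in for the neural-network detector, here is a deliberately simple sliding-window rule that flags the "automatch loop" pattern named above (repeatedly re-running automatch without evaluating). The window size and threshold are invented parameters, not trained values.

```python
# Hypothetical degradation alert: flag an "automatch loop" when the
# fraction of automatch actions in any recent window exceeds a threshold.

def automatch_loop_alert(actions, window=4, threshold=0.75):
    """Return True if any sliding window of `window` actions is at least
    `threshold` automatch actions."""
    for i in range(len(actions) - window + 1):
        chunk = actions[i:i + window]
        if chunk.count("automatch") / window >= threshold:
            return True
    return False

healthy = ["browse", "automatch", "evaluate", "select", "automatch", "evaluate"]
looping = ["automatch", "automatch", "evaluate", "automatch", "automatch", "automatch"]
print(automatch_loop_alert(healthy))  # False
print(automatch_loop_alert(looping))  # True
```

A trained classifier would replace the hand-set rule, but the supervisor-facing behavior, raising an alert before performance visibly drops, is the same.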
24
Outline
Lab Overview
Human Supervisory Control
-Single vs. Multiple UVs
-Supervising teams of UV operators
Tools for the Supervisor
-TRACS
-Performance Prediction
-Team Environments
25
UAV Team Operations
26
Open Research Questions: Collaboration Factors
Individual:
-Tracking cognitive strategies
-Performance predictions
-Relatively simple metrics
Team:
-Team dynamics
-Intra-team communication (verbal & non-verbal)
-Performance metrics
Group awareness:
-Situation and activity awareness
-Distributed cognition
-Group interaction theories
27
Critical Questions to Consider What metrics can we use to gauge team performance? -Which factors drive the metric? How does time pressure affect the decision process? How much information does a supervisor need? -Direct observation of operators’ behavior -Synthetic data only (TRACS)? -Both?
28
Summary
Focus shifted from the individual UAV operator to the supervisor of teams of UAV operators
Proposed a performance prediction tool
Extending the predictions to team environments
29
Questions?
30
Research supervised by Prof. M. L. Cummings
Research effort sponsored by Boeing/Boeing Phantom Works
Contacts: yves@mit.edu, missyc@mit.edu
Web: http://halab.mit.edu
TRACS demo: http://web.mit.edu/aeroastro/www/labs/halab/media.html (http://tinyurl.com/ybafp2)
31
Backup Slides
32
Interface 1 - Manual
LOA 2 - manual matching
Basic support: filtering, sorting, warnings, and summaries
33
Interface 2 - Collaborative
LOA 3 - collaborative matching
Advanced features for interactive search of a solution:
-Manual matching
-"Automatch" = customizable heuristic search algorithm
-Graphical summaries of constraint satisfaction
-Option to save solutions for comparison purposes
34
Interface 3 - Configural
LOA 4 - automated matching with configural display
High-level, constrained solution search:
-No access to raw data; aggregated info only
-Possibility to tweak the solution or to force an assignment
35
Tomahawk Mission Planning
Performance on incomplete scenario:
-Performance decreased when LOA increased in the single-interface setup
-Best: interface 1 and interfaces 2&3; worst: interfaces 1&3
-No deviation on interface 3
Interface 1: P = 69.7; Interface 3: P = 68.5
37
TRACS 3D
Problems with a 3D visualization:
-Loss of granularity and clutter
-Occlusion effect (loss of 2D information)
-Parallax effect (detrimental perspective)
-Difficult to manipulate (high cognitive load)
-Difficult to orient oneself (loss of SA)
-Lack of emergent temporal analysis features
38
From TRACS 3D to TRACS 2.5D
Temporal data:
-TRACS 3D: orthogonal axis
-TRACS 2.5D: interactive timeline
Advantages:
-Not 3D (occlusion, parallax, and orientation problems addressed)
-Familiar manipulation
-Clear grouping of temporal features (granularity, clutter, emergent properties)
39
TRACS 2.5D
40
Humans and Automation Laboratory Mobile Advanced Command and Control Station