Dynamic Domain Architectures for Model-based Autonomy
Bob Laddaga, Howard Shrobe, Brian C. Williams (PI)
MIT Artificial Intelligence Lab / Space Systems Lab



Structure of the MIT Project
Two Major Design Foci
–Autonomous Vehicles
  Building on Brian Williams' work at NASA Ames
  1st-generation software flown on Deep Space 1
  Extending the modeling framework to support hybrid systems
  New mode-identification and mode-control algorithms
–Perceptually Enabled Spaces
  Building on the MIT AI Lab's Intelligent Room
  Emphasis on self-adaptivity, recovery from faults and attacks
  Use of machine vision, speech, and NLP
  New emphasis on modeling and frameworks
Common Themes
–Self-diagnosis, recovery, "domain architecture frameworks"
–Integration through model-driven online inference
–Multiplicity of methods for common abstract tasks
Extension to the MOBIES OEPs

Model-based Integration of System Interactions Through Online Deduction
(Diagram: a model-based deductive executive couples a controller to the plant through mode identification and mode control, performing online propositional deduction over a TMS and conflict database. Model-based programs combine software components and control templates with models of the physical constituents and goals.)
Capabilities integrated through online deduction:
–monitoring, tracking goals, confirming commands
–isolating faults, diagnosing faults, reconfiguring hardware
–coordinating control policies, recovering from faults, avoiding failures
–allocating resources, selecting execution times, selecting procedures, ordering actions, generating successors

Model-Based Troubleshooting
(Diagram: a GDE-style example over a network of multipliers ("Times") and adders ("Plus") with observed outputs 15 and 25. The conflicts derived from the observations yield candidate diagnoses such as: Blue or Violet broken; Green broken with Red exhibiting a compensating fault; Green broken with Yellow exhibiting a masking fault.)

Consistent Diagnoses: Applying Failure Models
(Table: candidate diagnoses for components A, B, and C feeding a middle node (MID), each assigned Normal/Fast/Slow modes with associated probabilities and explanations, e.g. "C is delayed", "A slow, B masks", "A fast, C slower", "B not too fast, C slow", "A fast, B masks, C slow". Diagram: observed versus predicted low/high value ranges at the inputs and outputs, with per-component likelihoods for the Normal, Fast, and Slow failure models.)

Modeling Reactive Fallible Systems
–Create a new reactive model-based programming language to describe the behavior of the combined hardware-software system
–The underlying semantics is that of a (partially observable) Markov model
–Mode identification (diagnosis) is then the problem of state estimation in an HMM
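To make the state-estimation view concrete, here is a minimal sketch of mode identification as belief update in a discrete HMM. The mode names, transition matrix, and observation model are invented for illustration; they are not taken from the project's models.

```python
# Minimal sketch: mode identification as state estimation in a discrete HMM.
# The modes, transition matrix, and observation model below are illustrative only.

modes = ["nominal", "degraded", "failed"]

# P(next mode | current mode)
trans = {
    "nominal":  {"nominal": 0.97, "degraded": 0.02, "failed": 0.01},
    "degraded": {"nominal": 0.05, "degraded": 0.90, "failed": 0.05},
    "failed":   {"nominal": 0.00, "degraded": 0.00, "failed": 1.00},
}

# P(observation | mode), for observations "ok" and "anomaly"
obs_model = {
    "nominal":  {"ok": 0.95, "anomaly": 0.05},
    "degraded": {"ok": 0.60, "anomaly": 0.40},
    "failed":   {"ok": 0.10, "anomaly": 0.90},
}

def update_belief(belief, observation):
    """One step of the forward algorithm: predict, then condition on the observation."""
    predicted = {m: sum(belief[p] * trans[p][m] for p in modes) for m in modes}
    unnorm = {m: predicted[m] * obs_model[m][observation] for m in modes}
    z = sum(unnorm.values())
    return {m: unnorm[m] / z for m in modes}

belief = {"nominal": 1.0, "degraded": 0.0, "failed": 0.0}
for obs in ["ok", "anomaly", "anomaly"]:
    belief = update_belief(belief, obs)
    print(obs, {m: round(belief[m], 3) for m in modes})
```

After a couple of anomalous observations the belief mass shifts toward the degraded and failed modes, which is exactly the diagnostic behavior the slide describes.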

Moving to a Multi-Tiered Bayesian Framework
–The model has two levels of detail, specifying the computations, the underlying resources, and the mapping of computations to resources
–Each resource has models of its state of compromise
–The modes of the resource models are linked to the modes of the computational models by conditional probabilities
–The model can be viewed as a Bayesian network
(Diagram: Component 1 has models Normal (delay 2 to 4), Delayed (delay 4 to +inf), and Accelerated (delay -inf to 2); it is located on Node17, whose models are Normal (probability 90%), Parasite (probability 9%), and Other (probability 1%); conditional probabilities (0.2, 0.4, 0.3) link the resource modes to the computation modes.)
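As a rough illustration of the two-tier linkage, the sketch below computes the marginal distribution over a computation's behavioral modes from the resource's state-of-compromise distribution via the conditional link probabilities. The resource distribution loosely echoes the slide's 90%/9%/1% example; the link probabilities are invented.

```python
# Sketch of the two-tier linkage: the distribution over a computation's behavioral
# modes is obtained from the resource's state-of-compromise distribution via the
# conditional probabilities on the link.  Numbers are illustrative only.

resource_modes = {"normal": 0.90, "parasite": 0.09, "other": 0.01}

# P(computation mode | resource mode) -- hypothetical values
link = {
    "normal":   {"normal": 0.95, "delayed": 0.04, "accelerated": 0.01},
    "parasite": {"normal": 0.40, "delayed": 0.40, "accelerated": 0.20},
    "other":    {"normal": 0.30, "delayed": 0.50, "accelerated": 0.20},
}

computation_modes = {
    c: sum(p_r * link[r][c] for r, p_r in resource_modes.items())
    for c in ["normal", "delayed", "accelerated"]
}
print(computation_modes)   # marginal P(computation mode)
```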

Summary of Autonomous Vehicle Work
–To survive decades, autonomous systems must orchestrate complex regulatory and immune systems
–Future systems will be programmed with models describing themselves and their environments
–Future runtime kernels will be agile, deducing and planning from these models within the reactive loop
–This requires a new foundation for embedded computation, replacing discrete automata with partially observable Markov decision processes
–We propose to extend model-based programming to dynamic domain-specific languages with distributed model-based executives that reason over complex, functionally redundant behaviors

Software Frameworks for Embedded Systems
A framework reifies a model in code, in an API, and in the constraints and guarantees the model provides. It includes:
–A set of properties and a formal ontology of the domain of concern
–An axiomatization of the core domain theory
–Analytic (e.g. proof) techniques tailored to these properties and their domain theory
–A run-time infrastructure providing a rich set of layered services
–Models describing the goal-directed structure of these software services
–A protocol specifying the rules by which other software interacts with the infrastructure provided by the framework
–A domain-specific extension language for coupling an application to the services rendered by the framework

Frameworks Support Analysis
–The model is specified in a domain-specific extension language that captures the terms and concepts used by application domain experts
–The ontology provides a language in which to state annotations of the program (e.g. goals, alternative strategies and methods for achieving goals, sub-goal structure, state-variables, declarations, assertions, and requirements)
–Annotations inform program analysis, both logical and probabilistic
–Annotations facilitate the writing of high-level generators, synthesizing the wrapper code that integrates multiple frameworks

Dynamic Domain Architecture Frameworks
A DDA framework:
–Structures the procedural knowledge of the domain into layers of services
–Services at one layer invoke services from lower layers to achieve their sub-goals
–Each service has many implementations, corresponding to the variability and parameterization of the domain
–The choice of implementation to invoke is made at runtime, in light of runtime conditions, with the goal of maximizing expected utility (see the sketch below)
–Exposes its models, goal structure, state-variables, its API, its protocol of use, and constraints on those subsystems that interact with it
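To make the runtime-selection idea concrete, here is a hypothetical sketch of a service whose registered implementations are scored by expected utility under the current runtime conditions. The method names, conditions, and utility functions are invented for illustration; they are not the project's actual machinery.

```python
# Hypothetical sketch: a service with several registered implementations; the one
# actually invoked is chosen at call time by estimated expected utility under the
# current runtime conditions.

from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Implementation:
    name: str
    run: Callable[[], str]
    expected_utility: Callable[[Dict], float]   # scores this method under given conditions

class DynamicService:
    def __init__(self):
        self.implementations: List[Implementation] = []

    def register(self, impl: Implementation):
        self.implementations.append(impl)

    def invoke(self, conditions: Dict) -> str:
        # Pick the implementation with the highest expected utility right now.
        best = max(self.implementations, key=lambda i: i.expected_utility(conditions))
        return best.run()

notify = DynamicService()
notify.register(Implementation(
    "project-on-wall", lambda: "projected",
    lambda c: 0.9 if c.get("projector_free") else 0.1))
notify.register(Implementation(
    "speech-synthesis", lambda: "spoken",
    lambda c: 0.6 if not c.get("room_noisy") else 0.2))

print(notify.invoke({"projector_free": False, "room_noisy": False}))  # -> "spoken"
```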

DDA Framework
(Diagram: the development environment holds plan structures and a component asset base organized into layered super-routines (Layer 1, Layer 2, Layer 3), e.g. Foo calling A, B, and C. Sentinels are synthesized from the goal structure, e.g. "Postcondition 1 of Foo holds because Postcondition 2 of B and Postcondition 1 of C hold; Prerequisite 1 of B holds because Postcondition 1 of A holds." At runtime these sentinels provide self-monitoring; their alerts feed a diagnostic service, repair plan selector, resource allocator, rollback designer, and enactment machinery, supporting rational selection ("Method 3 is most attractive to execute Foo") and diagnosis & recovery.)

Integration of DDA Frameworks
–Frameworks interact at runtime by observing and reasoning about one another's state and by posting goals and constraints to guide each other's behaviors
–The posting and observation of state is facilitated by wrapper code inserted into each framework by model-based generators of the interacting frameworks
–Generated observation and control points, together with novel, fast propositional reasoning techniques, allow this to happen within reactive time frames
–The composite system behaves as if it is goal directed while avoiding the overhead normally associated with generalized reasoning

Perceptually Enabled Spaces
The Intelligent Room is an integrated environment for multi-modal HCI. It has eyes and ears:
–The room provides speech input
–The room has deep understanding of natural language utterances
–The room has a variety of machine vision systems that enable it to:
  Track motion and maintain the position of people
  Recognize gestures
  Recognize body postures
  Identify faces (eventually)
  Track pointing devices (e.g. laser pointer)
  Select the optimal camera for remote viewers
  Steer cameras to track the focus of attention
–Perceptually enabled environments are good surrogates for sensor-driven DoD applications (e.g. missile defense)

Command Post Demo (2 years old)

MetaGlue: A Platform for Perceptually Enabled Environments
Naming and Discovery
–Society, role within society, required properties
–Societies are collections of agents that act on behalf of a common entity (person or space)
Central Registry
–Discovery
–Environment-specific info
–Storage of agent state
Communication
–Direct method call (using RMI)
Robustness
–Freezing of state and thawing on automatic restart of dead agents
–Dynamic reloading of agents during system execution
Dynamic Collaboration Between Agents
–Publish-and-subscribe-driven event interfaces
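As a rough illustration of society/role-based naming and discovery, here is a hypothetical sketch of a registry keyed by society, role, and required properties. This is not MetaGlue's actual API; all names and structure are invented.

```python
# Hypothetical sketch of society/role-based agent discovery.  This is not
# MetaGlue's actual API; names and structure are illustrative only.

from dataclasses import dataclass, field
from typing import Dict, Optional

@dataclass
class AgentEntry:
    society: str                  # entity the agent acts for, e.g. a person or a space
    role: str                     # e.g. "display-manager", "speech-recognizer"
    properties: Dict[str, str] = field(default_factory=dict)
    state: Optional[dict] = None  # frozen state, for thawing on restart

class Registry:
    def __init__(self):
        self.entries: Dict[str, AgentEntry] = {}

    def register(self, name: str, entry: AgentEntry):
        self.entries[name] = entry

    def discover(self, society: str, role: str, **required) -> Optional[str]:
        """Find an agent by society, role, and required properties."""
        for name, e in self.entries.items():
            if (e.society == society and e.role == role
                    and all(e.properties.get(k) == v for k, v in required.items())):
                return name
        return None

reg = Registry()
reg.register("room-818-display", AgentEntry("room-818", "display-manager",
                                             {"surface": "projector"}))
print(reg.discover("room-818", "display-manager", surface="projector"))
```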

Agents Currently Provided in MetaGlue
Control of devices
–X10 and similar simple sensors and effectors
–Audio-visual equipment and multiplexers
Display and screen management
Speech recognition components
–Contextual grammars and command processing
Visual processing
–Laser pointer tracking
–Face tracking for video conferencing
Natural language processing
–Interfaces to the START system

(Diagram, repeated across several slides: the MetaGlue agent societies for the Command Post and Living Room spaces. Device agents include START, blinds, drapes, WWW, lamps and a lamp manager, door, audio/video multiplexers, VCR, CD player, TV, and tuner; infrastructure agents include a cluster learner, logger, notifier, agent tester, demo manager, document retrieval, map display, event cluster space, laser pointers, X10/IR/RS-232 interfaces, person tracker, grammar agents, music selector, audio manager, display manager, and preference learner; vision agents are mounted above the door, above the couch, on the table, and on the TV; application agents include information retrieval, room state, Eliza, and the room tutor.)

The Need for Frameworks
–MetaGlue is a lightweight, distributed object infrastructure for perceptually enabled systems
–MetaGlue provides tools for integrating and dynamically connecting components, for extensibility, and for saving and restoring the state of components
–The current incarnation of the system deals with the key modeling challenges with ad hoc techniques
–In MOBIES we are developing principled frameworks for these modeling challenges
–Each framework provides languages, interfaces, and guarantees for a specific set of concerns
–These frameworks deal with semantics, context, resource management, and robustness

5 Key Modeling Challenges
–How to model people, processes, and perceptions so as to enable group interactions in multiple spaces
–How to model services so as to enable reasonably optimal use of resources
–How to model processes so as to recover from failures (e.g. equipment breakdown, failed assumptions)
–How to model perceptual events so as to coordinate and fuse information from many sensors
–How to model and exploit context

Grounding in Semantics
We want to build applications that simultaneously serve multiple people, organizations, physical spaces, sensors, and effectors:
–The individuals move within and among many physical spaces
–The roles and responsibilities of individuals change over time
–The devices and resources they use change as time progresses
–The context shifts during interactions
–The relevant information base evolves over time

Models and Knowledge Representations
–People: interests, skills, responsibilities, organizational role
–Organizations: members, structure, roles, processes, and procedures
–Spaces: location, subspaces, and the devices and resources contained in the space
–Services: methods, parameter bindings, resource requirements
–Agents: capabilities, interfaces, society
–Resources: interfaces, capabilities, cost, reliability
–Information nodes: topic area, place in ontology, format
–Events: changes in any of the above representations or in the system's knowledge about any of the above (e.g. person identification, motion in a space)
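A minimal sketch of how a few of these representations might be rendered as data structures; the fields follow the list above, while the class names, field names, and sample values are illustrative.

```python
# Hypothetical data-structure sketch of a few of the representations listed above.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Person:
    name: str
    interests: List[str] = field(default_factory=list)
    skills: List[str] = field(default_factory=list)
    responsibilities: List[str] = field(default_factory=list)
    organizational_role: str = ""

@dataclass
class Space:
    name: str
    location: str
    subspaces: List["Space"] = field(default_factory=list)
    devices: List[str] = field(default_factory=list)   # resources contained in the space

@dataclass
class Event:
    subject: str            # the object whose representation changed
    change: Dict[str, str]  # e.g. {"position": "near whiteboard-1"}
    certainty: float = 1.0

sally = Person("Sally", skills=["systems wizard"], organizational_role="wizard")
room = Space("Room 818", "MIT AI Lab, 8th floor", devices=["projector-1", "whiteboard-1"])
ev = Event("Sally", {"position": "near whiteboard-1"}, certainty=0.8)
```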

Abstracting Away from Specific Devices
Until recently, applications were written in terms of specific resources without context (e.g. "the left projector"). This conflicts with:
–Portability across physical contexts
–Changes in equipment availability across time
–Multiple applications demanding similar resources
–The need to take advantage of new resources
–The need to integrate mobile devices as they migrate into a space
–The need to link two or more spaces
What is required is a more abstract approach, a framework for resource management, in which no application needs to be tied to a specific device.

Framework 1: Service Mapping and Resource Management
–Users request abstract services from the Service Mapper
  "I want to get a textual message to a system wizard"
–The Service Mapper has many plans for how to render each service
  "Locate a wizard, project on a wall near her"
  "Locate a wizard, use a voice synthesizer and a speaker near her"
  "Print the message and page the wizards to go to the printer"
–Each plan requires certain resources (and other abstract services)
  Some resources are more valuable, scarce, or heavily utilized than others
  The Resource Manager places a "price" on each resource
–Each plan provides the service with different qualities
  Some of these are more desired by the user (higher benefit)
–The Service Mapper picks a plan which is (nearly) optimal: maximum net benefit

Service Mapping and Resource Management
(Diagram: a user requests an abstract service with certain control parameters. Each service can be provided by several methods; each method binds the control parameters in a different way and requires different resources (Resource 1,1 through Resource 1,j). The user's utility function assigns a value to each parameter binding, and a resource cost function assigns a cost to the resources a method uses. The system selects the method that maximizes net benefit, i.e. utility minus resource cost.)

A Service Mapping Example
Request: I need to ask a question of a Systems Wizard
–Plan 1: Locate a Systems Wizard, project on a wall near her
  Costs: projector (in use) high, person location (high)
  Benefits: fast, clear
–Plan 2: Locate a Systems Wizard, voice-synthesize on a nearby speaker
  Costs: loudspeaker (unused) mid, person location (high)
  Benefits: fast, catches attention
–Plan 3: Print on a printer, send a page on the Wizard line ("Go to printer")
  Costs: printer (busy) mid, pager (mid)
  Benefit: mid
The best plan under the circumstances is Plan 3!
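The selection rule behind this example is simply "maximize net benefit = user benefit minus total resource cost." The sketch below makes that concrete, with invented cost and benefit numbers chosen so that Plan 3 wins, as in the slide.

```python
# Illustrative net-benefit plan selection for the "message a Systems Wizard" example.
# Cost and benefit numbers are made up; the point is the selection rule:
#   net benefit = user benefit - total resource cost, pick the maximum.

plans = {
    "project-on-wall":  {"resources": {"projector": 8, "person-location": 8}, "benefit": 10},
    "speech-synthesis": {"resources": {"speaker": 4,   "person-location": 8}, "benefit": 9},
    "print-and-page":   {"resources": {"printer": 4,   "pager": 4},           "benefit": 6},
}

def net_benefit(plan):
    return plan["benefit"] - sum(plan["resources"].values())

best = max(plans, key=lambda name: net_benefit(plans[name]))
for name, plan in plans.items():
    print(f"{name}: net benefit = {net_benefit(plan)}")
print("selected:", best)   # -> print-and-page
```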

Framework 2: Recovery From Failures
–The Service Mapping framework renders services by translating them into plans involving physical resources
–Physical resources have known (and unknown) failure modes
–Each plan step accomplishes sub-goal conditions needed by succeeding steps
  Each condition has some way of monitoring whether it has been accomplished
  These monitoring steps are generated into the code implementing the plan
–If a sub-goal fails to be accomplished, the diagnostic infrastructure is invoked

Making the System Responsible for Achieving Its Goals
(Diagram: plan steps A and B are linked by Condition-1, which A achieves and B requires as a prerequisite; a monitor on the condition raises alerts to a diagnostic service (localization & characterization), which feeds a repair plan selector (selection of an alternative, scope of recovery), a resource allocator producing a concrete repair plan and resource plan, and a rollback designer driving enactment.)

Diagnosis and Recovery Framework
–Model-based diagnosis isolates and characterizes the failure
  Driven by detected discrepancies between expectations and observations
  Each component has models of both normal and abnormal modes of operation
  Selects the set of component models consistent with the observations
–A recovery is chosen based on the diagnosis
  It might be as simple as "try it again; we had a network glitch"
  It might be "try it again, but with a different selection of resources"
  It might be as complex as "clean up and try a different plan"

Example of Recovering from Failures
Plan (with generated monitors):
–Locate a wizard by the screen; monitor: check that the wizard is still there
–Turn on the selected projector; monitor: check that the projector is on
–Project the message; monitor: check that the person noticed the message
Breakdown: the monitors report "I see a wizard by the screen" but "I don't see light on the screen", so projector-1 must be broken. We'll try again, but use projector-3.
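A minimal sketch of the monitor/diagnose/retry loop illustrated above; the step, the monitor, and the simple "retry with a different resource" recovery are hypothetical stand-ins for the code the framework would generate.

```python
# Hypothetical sketch of a monitored plan step with a simple recovery policy:
# if the generated monitor reports that the sub-goal was not achieved, diagnose the
# resource as broken and retry the step with an alternative resource.

BROKEN = {"projector-1"}   # toy world: projector-1 happens to be broken

def turn_on_projector(projector):
    """Pretend actuator; returns whether light is seen on the screen afterwards."""
    return projector not in BROKEN

def execute_with_recovery(step, monitor, resources):
    for resource in resources:                 # try alternatives in preference order
        step(resource)
        if monitor(resource):
            print(f"sub-goal achieved with {resource}")
            return resource
        print(f"monitor failed: diagnosing {resource} as broken, selecting alternative")
    raise RuntimeError("no working resource found; escalate to a different plan")

execute_with_recovery(
    step=lambda r: turn_on_projector(r),
    monitor=lambda r: turn_on_projector(r),    # stand-in for "I see light on the screen"
    resources=["projector-1", "projector-3"],
)
```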

Framework 3: Coordination of Perceptual Information
–We wish to separate the implementation of perceptual tasks from the uses to which perception is put
–Communication between modules focuses on "behavioral events"
  An event is signaled whenever the state of any object in the model is perceived to change
  E.g. a person moves, a person is identified, a device dies
–Communication is controlled by a publish-subscribe interface
  Each module publishes in a central registry which events it can notice
  Each module subscribes with the registry for events of interest
  The registry distributes to each signaler the list of consumers (more exactly, code to test the list of consumers)
  There is no centralized communications hub
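A minimal sketch of this publish-subscribe arrangement, assuming (as the bullets state) that the registry only matches publishers to subscribers and hands each signaler its consumer list, so events flow directly from signaler to consumer rather than through a central hub. All names are illustrative.

```python
# Illustrative publish-subscribe registry: the registry matches event types to
# consumers and gives each signaler its own consumer list, so the events themselves
# are delivered directly, not routed through a central hub.

from collections import defaultdict

class Registry:
    def __init__(self):
        self.subscribers = defaultdict(list)      # event type -> callbacks

    def subscribe(self, event_type, callback):
        self.subscribers[event_type].append(callback)

    def consumers_for(self, event_type):
        """Handed to the signaling module, which then delivers events itself."""
        return list(self.subscribers[event_type])

class VisualTracker:
    def __init__(self, registry):
        self.consumers = registry.consumers_for("body-motion")

    def signal(self, event):
        for deliver in self.consumers:            # direct delivery to consumers
            deliver(event)

registry = Registry()
registry.subscribe("body-motion", lambda e: print("face spotter sees:", e))
registry.subscribe("body-motion", lambda e: print("voice id sees:", e))

tracker = VisualTracker(registry)
tracker.signal({"type": "body-motion", "location": "near whiteboard-1", "certainty": 0.9})
```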

The Event "Bus" for Perceptual Coordination
(Diagram of modules on the event bus:)
–Visual Tracker: signals body motion
–Face Spotter: interested in body motion; signals face location
–Face Recognition: interested in face location; signals the location of individuals
–Voice Identification: interested in body motion; signals the location of individuals
–White Board Context Manager: interested in the location of individuals; signals people approaching the whiteboard

Coordination of Perceptual Information
–"Behavioral events" are organized into a taxonomy
  Some behaviors are "close to the physics"
  Some behaviors are more abstract (e.g. a person is near the whiteboard)
–Events are signaled with a certainty estimate
–Behaviors abstract over time; recognizers take the form of Markov chains
–The same event can be signaled by different perceptual modules (both face and voice recognition can identify a person)
–Recognizer code is synthesized from logical descriptions

Interaction of Frameworks: Perceptual Integration through Service Requests
–For each behavior there is a corresponding service
–This can be requested through the Service Mapping framework
  Requested when the initial perception lacks confidence
  Driven by diagnosis of the perceptual breakdown
–The request causes the system to marshal the resources necessary to gather the additional information needed for disambiguation
(Example flow: "there is a body" (certainty: high), "there is a face" (certainty: high), "it's Sally" (certainty: very low); diagnosis: bad pose; recovery: get a good pose; service request: get a good pose; plan: synthesize a good pose (get another view, visual hull, interpolate); result: "it's Sue" (certainty: high).)

Framework 4: Contextual Management of Perception
–Context is a combination of information from all perceptual modalities and the task structure
–Context biases perceptual processing: speech now, vision eventually
(Diagram: at time i, perception and task structure combine into a context that biases perception at time i+1.)

Using Context to Guide Speech Recognition
–Based on IBM's ViaVoice
–A large assembly (~50) of software agents
–Many small grammars, each appropriate to a specific context
–Grammars are dynamically activated and deactivated based on the interaction
–Together they simulate a very large set of supported utterances while rejecting utterances inappropriate to the context

An Example of Context Management
–The attention of the system should be focused by what people do and say
–Speech recognition should be biased in favor of things going on at that time
  The speech system is made up of many "grammar fragments"
  Grammar fragments are activated (and deactivated) when perceptual events (visual or speech) suggest they should be
Rules and facts in the example:
–If person ?P approaches space ?x and grammar ?y is relevant to ?x, then activate grammar ?y
–If a space ?x contains a drawing device ?y, then the Drawing Grammar is relevant to space ?x
–Space Whiteboard-1 contains device mimeo-1; mimeo devices are drawing devices; therefore the Drawing Grammar is relevant to space Whiteboard-1
–The White Board Place Manager signals "Sally approaches space Whiteboard-1", so the Drawing Grammar is activated and utterances in that grammar (the slide's example: "Frobulate the sidetracker") can now be recognized
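A minimal sketch encoding the two rules and the facts of this example; the tiny bit of forward chaining here is ad hoc Python for illustration, not the project's actual rule machinery.

```python
# Illustrative encoding of the context rules above.  The facts and the small
# hand-written inference are hard-coded for the example; this is not the
# project's actual rule machinery.

contains = {"whiteboard-1": ["mimeo-1"]}          # space -> devices it contains
drawing_devices = {"mimeo-1"}                     # fact: mimeo devices are drawing devices
active_grammars = set()

def relevant_grammars(space):
    """Rule: if a space contains a drawing device, the Drawing Grammar is relevant to it."""
    if any(d in drawing_devices for d in contains.get(space, [])):
        yield "drawing-grammar"

def on_person_approaches(person, space):
    """Rule: if person ?P approaches space ?x and grammar ?y is relevant to ?x, activate ?y."""
    for grammar in relevant_grammars(space):
        active_grammars.add(grammar)
        print(f"{person} approached {space}: activated {grammar}")

on_person_approaches("Sally", "whiteboard-1")
print(active_grammars)      # {'drawing-grammar'}
```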

Framework 5: Application Frameworks for Display and Command Management
An application manages:
–A set of displays
–A set of perceptual capabilities
–An underlying set of state-variables (the root nodes of its representation)
Application loop (sketched below):
–Accept a service request (a command)
–Perform the service
–Update the displays to reflect the new state-variables
This gives a sense of synchronicity at the high level even though at the lower level there are many distributed agents at work.
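A minimal sketch of that application loop; the commands, state-variables, and display callback are all hypothetical.

```python
# Hypothetical sketch of the application loop: accept a command, perform the
# service, then refresh the displays from the state-variables.

state = {"current_map": None, "messages": []}     # the application's state-variables
displays = []                                     # registered display-update callbacks

def register_display(update_fn):
    displays.append(update_fn)

def perform(command, argument):
    """Stand-in for performing the requested service by updating state-variables."""
    if command == "show-map":
        state["current_map"] = argument
    elif command == "post-message":
        state["messages"].append(argument)

def application_loop(commands):
    for command, argument in commands:            # stand-in for accepting service requests
        perform(command, argument)
        for update in displays:                   # refresh every display from the state
            update(state)

register_display(lambda s: print("display:", s["current_map"], s["messages"]))
application_loop([("show-map", "Cambridge"), ("post-message", "Wizard needed in room 818")])
```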

Elements of the Application Framework
(Diagram: a Display Manager mediates between the application's state-variables and its displays.)

Summary
–First focus is on model-driven integration of autonomous vehicles
  New modeling and mode-identification techniques are being developed
–Second focus is on perceptually guided environments
  Several frameworks are being developed to support specific issues: service mapping, perceptual coordination, diagnosis and recovery, contextual biasing
–Common theme of structuring into frameworks along the lines of Dynamic Domain Architectures
–Will soon investigate OEP domains as well