
1. Dynamic Domain Architectures for Model-based Autonomy
Bob Laddaga, Howard Shrobe, Brian C. Williams (PI)
MIT Artificial Intelligence Lab / Space Systems Lab

2. Structure of the MIT Project
Two Major Design Foci
– Autonomous Vehicles
  Building on Brian Williams' work at NASA Ames
  1st-generation software flown on Deep Space 1
  Extending the modeling framework to support hybrid systems
  New mode-identification and mode-control algorithms
– Perceptually Enabled Spaces
  Building on the MIT AI Lab's Intelligent Room
  Emphasis on self-adaptivity and recovery from faults and attacks
  Use of machine vision, speech, and NLP
  New emphasis on modeling and frameworks
Common Themes
– Self-diagnosis, recovery, "domain architecture frameworks"
– Integration through model-driven online inference
– Multiplicity of methods for common abstract tasks
– Extension to the MOBIES OEPs

3. Model-based Integration of System Interactions Through Online Deduction
Capabilities: monitoring, tracking goals, confirming commands, isolating faults, diagnosing faults, reconfiguring hardware, coordinating control policies, recovering from faults, avoiding failures, allocating resources, selecting execution times, selecting procedures, ordering actions, generating successors.
[Diagram: model-based programs (a software component plus control templates and models of the physical constituents) and goals feed a model-based deductive executive. Its mode-identification side estimates the plant state s(t) from observations; its mode-control side issues commands to the controller and plant. Both are driven by online propositional deduction over a TMS and a conflict database.]

4. Model-Based Troubleshooting (GDE)
[Diagram: a network of "Times" and "Plus" components with observed and predicted values at each output; discrepancies yield conflict sets.]
Conflicts rule out candidate sets of working components; the surviving diagnoses include:
– Blue or Violet broken
– Green broken, Red with a compensating fault
– Green broken, Yellow with a masking fault
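The conflict-to-diagnosis step on this slide can be sketched as follows. This is a minimal illustration of GDE-style candidate generation (diagnoses as minimal hitting sets of the conflict sets), not the actual GDE implementation; the conflict sets below are invented to echo the slide's color-named components.

```python
# Hedged sketch: a diagnosis must contain at least one suspect from
# every conflict set, and we keep only the minimal such sets.
from itertools import chain, combinations

def minimal_hitting_sets(conflicts):
    """All minimal hitting sets of the given conflict sets."""
    components = sorted(set(chain.from_iterable(conflicts)))
    hits = []
    for r in range(1, len(components) + 1):        # smallest candidates first
        for cand in combinations(components, r):
            s = set(cand)
            if all(s & c for c in conflicts):      # hits every conflict
                if not any(h < s for h in hits):   # keep only minimal sets
                    hits.append(s)
    return hits

# Illustrative conflicts, not the slide's exact ones.
conflicts = [{"blue", "green"}, {"violet", "green"}]
result = minimal_hitting_sets(conflicts)
# Minimal diagnoses: {green} alone, or {blue, violet} together.
```

With these two conflicts, the generator reproduces the slide's pattern: either Green is broken by itself, or Blue and Violet are jointly broken.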

5. Consistent Diagnoses (Applying Failure Models)

A       B       C       Low   High  Prob     Explanation
Normal  Normal  Slow    3     3     .04410   C is delayed
Slow    Fast    Normal  7     12    .00640   A Slow, B Masks (runs negative!)
Fast    Normal  Slow    1     2     .00630   A Fast, C Slower
Normal  Fast    Slow    4     6     .00196   B not too fast, C slow
Fast    Slow    Slow    -30   0     .00042   A Fast, B Masks, C slow
Slow    Fast    Fast    13    30    .00024   A Slow, B Masks, C not masking fast

[Diagram: components A, B, C feed a MID node (Low = 3, High = 6); each carries per-mode LHP scores (Normal/Fast/Slow), with observed vs. predicted values at the outputs, e.g. OUT1 observed 5, predicted Low = 5, High = 10; OUT2 observed 17, predicted Low = 8, High = 16.]

6. Modeling Reactive, Fallible Systems
– Create a new reactive modeling and programming language to describe the behavior of the combined hardware-software system
– The underlying semantics is that of a (partially observable) Markov model
– Mode identification (diagnosis) then becomes the problem of state estimation in a hidden Markov model (HMM)
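The last bullet can be made concrete with a minimal sketch of mode identification as HMM state estimation. This is an illustration under assumed numbers, not the project's actual algorithm: modes of a hypothetical valve are the hidden states, sensor readings are the observations, and one forward-filtering step maintains a belief over modes.

```python
# A minimal sketch: mode identification as forward filtering in an HMM.
# States are component modes; observations are discretized sensor readings.

def forward_step(belief, transition, emission, obs):
    """One step of HMM forward filtering: predict, then condition on obs."""
    modes = list(belief)
    # Predict: propagate belief through the mode-transition model.
    predicted = {m: sum(belief[p] * transition[p][m] for p in modes)
                 for m in modes}
    # Update: weight by the likelihood of the observation in each mode.
    unnorm = {m: predicted[m] * emission[m][obs] for m in modes}
    z = sum(unnorm.values())
    return {m: unnorm[m] / z for m in modes}

# Hypothetical valve with modes "ok" and "stuck"; numbers are invented.
transition = {"ok":    {"ok": 0.99, "stuck": 0.01},
              "stuck": {"ok": 0.0,  "stuck": 1.0}}
emission = {"ok":    {"flow": 0.9,  "no_flow": 0.1},
            "stuck": {"flow": 0.05, "no_flow": 0.95}}

belief = {"ok": 0.95, "stuck": 0.05}
for obs in ["flow", "no_flow", "no_flow"]:
    belief = forward_step(belief, transition, emission, obs)
# After two "no_flow" readings, "stuck" becomes the most likely mode.
```

The point of the sketch: diagnosis is not a separate mechanism but simply the posterior over modes that the filter maintains as observations arrive.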

7. Moving to a Multi-Tiered Bayesian Framework
– The model has two levels of detail, specifying computations, the underlying resources, and the mapping of computations to resources
– Each resource has models of its state of compromise (e.g. Normal: probability 90%; Parasite: probability 9%; Other: probability 1%)
– The modes of the resource models are linked to the modes of the computational models by conditional probabilities (e.g. .2, .3, .4)
– The model can be viewed as a Bayesian network
[Diagram: Component 1 has timing models (Normal: delay 2-4; Delayed: delay 4 to +inf; Accelerated: delay -inf to 2), linked by conditional probabilities to the compromise modes of the node (Node17) it is located on.]
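The link between the two tiers can be sketched as one Bayes update. This is a toy illustration: the priors echo the slide's compromise model (Normal 90%, Parasite 9%, Other 1%), but the likelihoods of observing a delayed computation under each resource mode are assumed values.

```python
# Hedged sketch: posterior over a resource's compromise modes given
# that the computation running on it was observed to be delayed.
priors = {"normal": 0.90, "parasite": 0.09, "other": 0.01}

# P(computation delayed | resource mode) -- assumed, illustrative values.
p_delayed = {"normal": 0.05, "parasite": 0.60, "other": 0.30}

def posterior(priors, likelihood):
    """Bayes rule over a discrete set of modes."""
    unnorm = {m: priors[m] * likelihood[m] for m in priors}
    z = sum(unnorm.values())
    return {m: unnorm[m] / z for m in unnorm}

post = posterior(priors, p_delayed)
# Observing a delayed computation shifts belief toward "parasite",
# even though "normal" starts with a 90% prior.
```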

8. Summary of Autonomous Vehicle Work
– To survive decades, autonomous systems must orchestrate complex regulatory and immune systems.
– Future systems will be programmed with models describing themselves and their environments.
– Future runtime kernels will be agile, deducing and planning from these models within the reactive loop.
– This requires a new foundation for embedded computation, replacing discrete automata with partially observable Markov decision processes.
– We propose to extend model-based programming to dynamic domain-specific languages with distributed model-based executives that reason over complex, functionally redundant behaviors.

9. Software Frameworks for Embedded Systems
A framework reifies a model in code, in an API, and in the constraints and guarantees the model provides. It includes:
– A set of properties and a formal ontology of the domain of concern
– An axiomatization of the core domain theory
– Analytic (e.g. proof) techniques tailored to these properties and their domain theory
– A run-time infrastructure providing a rich set of layered services
– Models describing the goal-directed structure of these software services
– A protocol specifying the rules by which other software interacts with the infrastructure provided by the framework
– A domain-specific extension language for coupling an application to the services rendered by the framework

10. Frameworks Support Analysis
– The model is specified in a domain-specific extension language that captures the terms and concepts used by application domain experts.
– The ontology provides a language in which to state annotations of the program (e.g. goals, alternative strategies and methods for achieving goals, sub-goal structure, state-variables, declarations, assertions, and requirements).
– Annotations inform program analysis, both logical and probabilistic.
– Annotations facilitate the writing of high-level generators, synthesizing the wrapper code that integrates multiple frameworks.

11. Dynamic Domain Architecture Frameworks
A DDA framework:
– Structures the procedural knowledge of the domain into layers of services; services at one layer invoke services from lower layers to achieve their sub-goals.
– Gives each service many implementations, corresponding to the variability and parameterization of the domain.
– Chooses which implementation to invoke at runtime, in light of runtime conditions, with the goal of maximizing expected utility.
– Exposes its models, goal structure, state-variables, API, protocol of use, and constraints on those subsystems that interact with it.
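The runtime choice among implementations can be sketched in a few lines. This is a minimal illustration of utility-maximizing dispatch, with invented method names and utility numbers; the real framework would reason over richer models of runtime conditions.

```python
# Hedged sketch: each service has several implementations, and the
# dispatcher picks the one with highest expected utility right now.

def select_method(methods, conditions):
    """Pick the implementation maximizing expected utility under conditions."""
    return max(methods, key=lambda m: m["utility"](conditions))

# Hypothetical service with two implementations and toy utility functions.
render_methods = [
    {"name": "gpu_path",
     "utility": lambda c: 10.0 if c["gpu_free"] else -1.0},
    {"name": "cpu_fallback",
     "utility": lambda c: 4.0},
]

chosen = select_method(render_methods, {"gpu_free": False})
# With the GPU busy, the CPU fallback wins; when it frees up,
# the same call selects the GPU path instead.
```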

12. DDA Framework
[Diagram: the development environment holds a component asset base and layered super-routines (Layers 1-3, with components A, B, C and methods 1-3 per component). Plan structures record dependencies, e.g. "post-condition 1 of Foo because post-condition 2 of B and post-condition 1 of C; prerequisite 1 of B because post-condition 1 of A". Synthesized sentinels carry these into the runtime environment, where rational selection ("Method 3 is most attractive to execute Foo") and self-monitoring raise alerts to a diagnostic service; a repair-plan selector, resource allocator, and rollback designer support diagnosis, recovery, and enactment.]

13. Integration of DDA Frameworks
– Frameworks interact at runtime by observing and reasoning about one another's state and by posting goals and constraints to guide each other's behaviors.
– The posting and observation of state is facilitated by wrapper code inserted into each framework by model-based generators of the interacting frameworks.
– Generated observation and control points, together with novel, fast propositional reasoning techniques, allow this to happen within reactive time frames.
– The composite system behaves as if it is goal-directed while avoiding the overhead normally associated with generalized reasoning.

14. Perceptually Enabled Spaces
The Intelligent Room is an integrated environment for multi-modal HCI. It has eyes and ears:
– The room provides speech input
– The room has deep understanding of natural-language utterances
– The room has a variety of machine-vision systems that enable it to:
  Track motion and maintain the position of people
  Recognize gestures
  Recognize body postures
  Identify faces (eventually)
  Track pointing devices (e.g. a laser pointer)
  Select the optimal camera for remote viewers
  Steer cameras to track the focus of attention
– Perceptually enabled environments are good surrogates for sensor-driven DoD applications (e.g. missile defense).

15. Command Post Demo (2 years old)

16. MetaGlue: A Platform for Perceptually Enabled Environments
Naming and Discovery
– Society, role within society, required properties
– Societies are collections of agents that act on behalf of a common entity (a person or a space)
Central Registry
– Discovery
– Environment-specific info
– Storage of agent state
Communication
– Direct method call (using RMI)
Robustness
– Freezing of state, and thawing on automatic restart of dead agents
– Dynamic reloading of agents during system execution
Dynamic Collaboration Between Agents
– Publish-and-subscribe-driven event interfaces
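The naming/discovery and freeze/thaw ideas can be sketched together. This is a hedged illustration only: MetaGlue itself is a Java/RMI system, and the API names below (`register`, `discover`, `freeze`, `thaw`) are invented for exposition, not MetaGlue's actual interface.

```python
# Hedged sketch: agents are looked up by (society, role), and the
# registry can freeze an agent's state so a restarted agent can thaw it.

class CentralRegistry:
    def __init__(self):
        self.agents = {}   # (society, role) -> agent object
        self.frozen = {}   # (society, role) -> saved state

    def register(self, society, role, agent):
        self.agents[(society, role)] = agent

    def discover(self, society, role):
        """Naming and discovery: find the agent playing a role in a society."""
        return self.agents.get((society, role))

    def freeze(self, society, role, state):
        """Save an agent's state before (or at) death."""
        self.frozen[(society, role)] = state

    def thaw(self, society, role):
        """Hand a restarted agent its saved state, if any."""
        return self.frozen.pop((society, role), None)

reg = CentralRegistry()
reg.register("hal-room", "lamp-manager", object())
reg.freeze("hal-room", "lamp-manager", {"lamps_on": 2})
# After the agent dies and is automatically restarted, it recovers state:
restored = reg.thaw("hal-room", "lamp-manager")
```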

17. Agents Currently Provided in MetaGlue
Control of devices
– X10 and similar simple sensors and effectors
– Audio-visual equipment and multiplexers
Display and screen management
Speech-recognition components
– Contextual grammars and command processing
Visual processing
– Laser-pointer tracking
– Face tracking for video conferencing
Natural-language processing
– Interfaces to the START system

18. [Diagram: the MetaGlue agent society, spanning Command Post and Living Room spaces. Agents include: START, Info Retrieval, Doc Retrieval, Map Display, Eliza, Room Tutor, Room State; device agents for blinds, drapes, lamps (window lamp, door lamp, lamp manager), VCR, CD player, TV, tuner, muxes, X10, IR, and RS-232; vision agents (cameras above the door, above the couch, on the table, on the TV), person tracker, and laser pointers 1-2; grammar agents (max-probability grammar); audio manager, display manager, music selector, preference learner, cluster learner, event cluster space, logger, notifier, AgentTester, and demo manager; with WWW and the room "Hal" as external connections.]

19. The Need for Frameworks
– MetaGlue is a lightweight, distributed object infrastructure for perceptually enabled systems.
– MetaGlue provides tools for integrating and dynamically connecting components, for extensibility, and for saving and restoring the state of components.
– The current incarnation of the system deals with the key modeling challenges with ad hoc techniques.
– In MOBIES we are developing principled frameworks for these modeling challenges. Each framework provides languages, interfaces, and guarantees for a specific set of concerns.
– These frameworks deal with semantics, context, resource management, and robustness.

20. Five Key Modeling Challenges
– How to model people, processes, and perceptions so as to enable group interactions in multiple spaces
– How to model services so as to enable reasonably optimal use of resources
– How to model processes so as to recover from failures (e.g. equipment breakdown, failed assumptions)
– How to model perceptual events so as to coordinate and fuse information from many sensors
– How to model and exploit context

21. Grounding in Semantics
We want to build applications that simultaneously serve multiple people, organizations, physical spaces, sensors, and effectors:
– Individuals move within and among many physical spaces
– The roles and responsibilities of individuals change over time
– The devices and resources they use change as time progresses
– The context shifts during interactions
– The relevant information base evolves over time

22. Models and Knowledge Representations
– People: interests, skills, responsibilities, organizational role
– Organizations: members, structure, roles, processes, and procedures
– Spaces: location, subspaces, and the devices and resources contained in the space
– Services: methods, parameter bindings, resource requirements
– Agents: capabilities, interfaces, society
– Resources: interfaces, capabilities, cost, reliability
– Information nodes: topic area, place in the ontology, format
– Events: changes in any of the above representations, or in the system's knowledge about any of the above (e.g. identification of a person, motion in a space)

23. Abstracting Away from Specific Devices
Until recently, applications were written in terms of specific resources without context (e.g. "the left projector"). This conflicts with:
– Portability across physical contexts
– Changes in equipment availability across time
– Multiple applications demanding similar resources
– The need to take advantage of new resources
– The need to integrate mobile devices as they migrate into a space
– The need to link two or more spaces
What is required is a more abstract approach: a framework for resource management in which no application needs to be tied to a specific device.

24. Framework 1: Service Mapping and Resource Management
– Users request abstract services from the Service Mapper
  "I want to get a textual message to a system wizard"
– The Service Mapper has many plans for how to render each service
  "Locate a wizard; project on a wall near her"
  "Locate a wizard; use a voice synthesizer and a speaker near her"
  "Print the message and page the wizards to go to the printer"
– Each plan requires certain resources (and other abstract services)
  Some resources are more valuable, scarce, or heavily utilized than others
  The Resource Manager places a "price" on each resource
– Each plan provides the service with different qualities
  Some of these are more desired by the user (higher benefit)
– The Service Mapper picks a plan that is (nearly) optimal: maximum net benefit
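The net-benefit calculation can be sketched directly from the bullets above. This is an illustration only: the plans loosely mirror the slides, but all benefit values and resource prices are invented, and the real Service Mapper would get prices from the Resource Manager at runtime.

```python
# Hedged sketch: net benefit = user benefit of a plan's qualities
# minus the priced cost of the resources it consumes.

# Invented prices; "person_location" is expensive because locating a
# person ties up perceptual resources.
resource_price = {"projector": 9.0, "speaker": 4.0, "printer": 5.0,
                  "pager": 5.0, "person_location": 15.0}

plans = [
    {"name": "project_on_wall", "benefit": 20.0,
     "resources": ["projector", "person_location"]},
    {"name": "voice_synthesis", "benefit": 18.0,
     "resources": ["speaker", "person_location"]},
    {"name": "print_and_page", "benefit": 12.0,
     "resources": ["printer", "pager"]},
]

def net_benefit(plan):
    return plan["benefit"] - sum(resource_price[r] for r in plan["resources"])

best = max(plans, key=net_benefit)
# Under these prices the humble print-and-page plan wins, echoing the
# outcome of the worked example on a later slide.
```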

25. Service Mapping and Resource Management
[Diagram: the user requests a service with certain control parameters. Each service can be provided by several methods (Method 1 through Method n); each method binds the settings of the control parameters in a different way and requires different resources (Resource 1,1 through Resource 1,j). The user's utility function values the parameter bindings; the resource cost function prices the resources a method uses; the system selects the method that maximizes net benefit.]

26. A Service Mapping Example
"I need to ask a question of a Systems Wizard."
– Plan 1: Locate a Systems Wizard; project on a wall near her
  Costs: projector (in use) high, person location high
  Benefits: fast, clear
– Plan 2: Locate a Systems Wizard; voice-synthesize on a nearby speaker
  Costs: loudspeaker (unused) mid, person location high
  Benefits: fast, catches attention
– Plan 3: Print on a printer; send a page on the Wizard line, "Go to the printer"
  Costs: printer (busy) mid, pager mid
  Benefit: mid
The best plan under the circumstances is Plan 3!

27. Framework 2: Recovery from Failures
– The Service Mapping framework renders services by translating them into plans involving physical resources
– Physical resources have known (and unknown) failure modes
– Each plan step accomplishes sub-goal conditions needed by succeeding steps
  Each condition has some way of monitoring whether it has been accomplished
  These monitoring steps are generated into the code implementing the plan
– If a sub-goal fails to be accomplished, the diagnostic infrastructure is invoked
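The monitored-plan idea can be sketched as follows. This is a minimal illustration with hypothetical step names: each step carries the monitor that checks its sub-goal condition, and a failed check hands control to diagnosis, as the bullets describe. In the real framework the monitors are generated into the plan code rather than written by hand.

```python
# Hedged sketch: execute plan steps in order; after each step, verify
# its sub-goal condition; on failure, escalate to diagnosis/recovery.

class SubgoalFailed(Exception):
    def __init__(self, step):
        super().__init__(f"sub-goal of {step} not achieved")
        self.step = step

def run_plan(steps, world):
    for name, action, monitor in steps:
        action(world)
        if not monitor(world):            # generated monitoring check
            raise SubgoalFailed(name)     # invoke diagnostic infrastructure

world = {"projector_on": False, "light_on_screen": False}
steps = [
    ("turn_on_projector",
     lambda w: w.update(projector_on=True),
     lambda w: w["projector_on"]),
    ("project_message",
     lambda w: None,                      # broken projector: no light appears
     lambda w: w["light_on_screen"]),
]

try:
    run_plan(steps, world)
    failed = None
except SubgoalFailed as e:
    failed = e.step                       # "project_message"
```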

28. Making the System Responsible for Achieving Its Goals
[Diagram: monitors watch the condition (Condition-1) that step A "achieves" and step B "requires" as a prerequisite, and raise alerts to the diagnostic service, which localizes and characterizes the failure. The repair-plan selector scopes the recovery and selects an alternative; the resource allocator and rollback designer produce a concrete repair plan and resource plan for enactment.]

29. Diagnosis and Recovery Framework
– Model-based diagnosis isolates and characterizes the failure
  Driven by detected discrepancies between expectations and observations
  Each component has models of both normal and abnormal modes of operation
  The diagnosis selects a set of component models consistent with the observations
– A recovery is chosen based on the diagnosis
  It might be as simple as "try it again" (we had a network glitch)
  It might be "try it again, but with a different selection of resources"
  It might be as complex as "clean up and try a different plan"
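The "selects component models consistent with the observations" step can be sketched in miniature. This is an illustration of consistency-based diagnosis with invented components and a single hand-written consistency test; the real framework derives consistency from the component models themselves.

```python
# Hedged sketch: enumerate mode assignments and keep only those
# consistent with the observations.
from itertools import product

components = ["projector", "network"]
modes = ["ok", "broken"]

def consistent(assignment, obs):
    # Observation: a display command was sent, but no light on the screen.
    # If every component were ok, light would appear -- inconsistent.
    if obs["no_light"] and all(m == "ok" for m in assignment.values()):
        return False
    return True

obs = {"no_light": True}
candidates = [dict(zip(components, ms))
              for ms in product(modes, repeat=len(components))
              if consistent(dict(zip(components, ms)), obs)]
# The all-ok assignment is ruled out; every surviving diagnosis
# blames at least one component.
```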

30. Example of Recovering from Failures
Plan:
– Locate a wizard by the screen (monitoring: check that the wizard is still there)
– Turn on the selected projector (monitoring: check that the projector is on)
– Project the message (monitoring: check that the person noticed the message)
Breakdown: "I see a wizard by the screen, but I don't see light on the screen."
Diagnosis: Projector-1 must be broken.
Recovery: We'll try again, but use Projector-3.

31. Framework 3: Coordination of Perceptual Information
– We wish to separate the implementation of perceptual tasks from the uses to which perception is put
– Communication between modules focuses on "behavioral events"
  An event is signaled whenever the state of any object in the model is perceived to change
  E.g. a person moves, a person is identified, a device dies
– Communication is controlled by a publish-subscribe interface
  Each module publishes in a central registry which events it can notice
  Each module subscribes with the registry for events of interest
  The registry distributes to each signaler its list of consumers (more exactly, code to test the list of consumers)
  There is no centralized communications hub
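The publish-subscribe arrangement can be sketched as follows. This is a minimal illustration with invented names: subscribers register interest with the registry, and a signaler fetches its own consumer list so that delivery happens point-to-point, with no central hub in the data path.

```python
# Hedged sketch: a registry that matches signalers to subscribers,
# then gets out of the way of the actual event traffic.
from collections import defaultdict

class Registry:
    def __init__(self):
        self.subscribers = defaultdict(list)

    def subscribe(self, event_type, callback):
        """A module registers interest in an event type."""
        self.subscribers[event_type].append(callback)

    def consumers(self, event_type):
        """A signaler fetches its consumer list and delivers directly."""
        return list(self.subscribers[event_type])

registry = Registry()
seen = []
registry.subscribe("body_motion", lambda ev: seen.append(("face_spotter", ev)))
registry.subscribe("body_motion", lambda ev: seen.append(("voice_id", ev)))

# The visual tracker signals body motion directly to its consumers:
for deliver in registry.consumers("body_motion"):
    deliver({"type": "body_motion", "x": 3, "y": 7})
```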

32. The Event "Bus" for Perceptual Coordination
[Diagram: modules connected by the event bus.]
– Visual Tracker: signals body motion
– Face Spotter: interested in body motion; signals face location
– Face Recognition: interested in face location; signals the location of individuals
– Voice Identification: interested in body motion; signals the location of individuals
– Whiteboard Context Manager: interested in the location of individuals; signals people approaching the whiteboard

33. Coordination of Perceptual Information
– "Behavioral events" are organized into a taxonomy
  Some behaviors are "close to the physics"
  Some behaviors are more abstract (e.g. a person is near the whiteboard)
– Events are signaled with a certainty estimate
– Behaviors abstract over time; recognizers take the form of Markov chains
– The same event can be signaled by different perceptual modules (both face and voice recognition can identify a person)
– Recognizer code is synthesized from logical descriptions

34. Interaction of Frameworks: Perceptual Integration through Service Requests
– For each behavior there is a corresponding service
– This can be requested through the Service Mapping framework
  Requested when the initial perception lacks confidence
  Driven by diagnosis of perceptual breakdown
– The request causes the system to marshal the resources necessary to gather the additional information needed for disambiguation
[Diagram: "There is a body (certainty: high)"; "There is a face (certainty: high)"; "It's Sally (certainty: very low)". Diagnosis: bad pose. Recovery: get a good pose. Service request: get a good pose. Plan: synthesize a good pose (get another view, build a visual hull, interpolate). Result: "It's Sue (certainty: high)".]

35. Framework 4: Contextual Management of Perception
Context is a combination of:
– Information from all perceptual modalities: speech, vision, and (eventually) other perception
– Task structure
Context biases perceptual processing.
[Diagram: perception and task structure at time i combine into context, which feeds back to bias perception at time i+1.]

36. Using Context to Guide Speech Recognition
– Based on IBM's ViaVoice
– A large assembly (~50) of software agents
– Many small grammars, each appropriate to a specific context
– Grammars are dynamically activated and deactivated based on the interaction
– Together they simulate a very large set of supported utterances while rejecting utterances inappropriate to the context

37. An Example of Context Management
– The attention of the system should be focused by what people do and say
– Speech recognition should be biased in favor of things going on at that time
  The speech system is made up of many "grammar fragments"
  Grammar fragments are activated (and deactivated) when perceptual events (visual or speech) suggest they should be
[Example: Sally approaches space Whiteboard-1. Known facts: space Whiteboard-1 contains device Mimeo-1; Mimeo devices are drawing devices. Rule: if a space ?x contains a drawing device ?y, then the Drawing Grammar is relevant to space ?x. Rule: if person ?p approaches space ?x and grammar ?g is relevant to ?x, then activate grammar ?g. Result: the Whiteboard Place Manager activates the Drawing Grammar, so an utterance like "Frobulate the sidetracker" is interpreted against the drawing context.]
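The two rules on this slide can be sketched as one tiny forward-chaining step. This is an illustration only: the fact base and the hard-coded rule below stand in for a general pattern-matching rule engine, and the names follow the slide's example.

```python
# Hedged sketch: when a person approaches a space that contains a
# drawing device, the Drawing Grammar becomes relevant and is activated.

contains = {("Whiteboard-1", "mimeo-1")}   # space -> device facts
drawing_devices = {"mimeo-1"}              # Mimeo devices are drawing devices
active_grammars = set()

def on_approach(person, space):
    # Rule 1: a space containing a drawing device makes the Drawing
    #         Grammar relevant to that space.
    # Rule 2: a person approaching a space activates the grammars
    #         relevant to it.
    for (s, dev) in contains:
        if s == space and dev in drawing_devices:
            active_grammars.add("Drawing Grammar")

on_approach("Sally", "Whiteboard-1")
# The Drawing Grammar is now active for the interaction.
```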

38. Framework 5: Application Frameworks for Display and Command Management
An application manages:
– A set of displays
– A set of perceptual capabilities
– An underlying set of state-variables (the root nodes of its representation)
Application loop:
– Accept a service request (a command)
– Perform the service
– Update the displays to reflect the new state-variables
This gives a sense of synchronicity at the high level, even though at the lower level there are many distributed agents at work.
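The application loop above can be sketched in a few lines. This is a minimal, hypothetical illustration (the class and command names are invented): each command mutates the state-variables and then every registered display is refreshed, which is what produces the appearance of synchronous operation.

```python
# Hedged sketch of the accept / perform / update-displays loop.

class Application:
    def __init__(self):
        self.state = {"volume": 5}   # the application's state-variables
        self.displays = []           # callables that render current state
        self.rendered = []           # what the displays last showed (for demo)

    def handle(self, command, arg):
        if command == "set_volume":          # 1. accept + 2. perform
            self.state["volume"] = arg
        for display in self.displays:        # 3. update all displays
            display(self.state)

app = Application()
app.displays.append(lambda s: app.rendered.append(dict(s)))
app.handle("set_volume", 9)
# Every display now reflects the new state-variable values.
```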

39. Elements of the Application Framework
[Diagram: the Display Manager mediates between the state variables and the displays.]

40. Summary
– First focus is on model-driven integration of autonomous vehicles
  New modeling and mode-identification techniques are being developed
– Second focus is on perceptually guided environments
  Several frameworks are being developed to support specific issues: service mapping, perceptual coordination, diagnosis and recovery, contextual biasing
– Common theme of structuring into frameworks along the lines of Dynamic Domain Architectures
– Will soon investigate OEP domains as well

