Synthetic Teammate Project March 2009 Jerry Ball Air Force Research Laboratory

Synthetic Teammate Project
Project Goal: Develop a Synthetic Teammate capable of functioning as the Air Vehicle Operator (AVO), or pilot, in a 3-person simulation of an Unmanned Air Vehicle (UAV) performing reconnaissance missions
Cognitively Plausible (using ACT-R)
Functional
Large-scale
Empirically Validated
Not valid if it's not functional!
Few research teams are attempting to do all of these at once!

Synthetic Teammate Project
Guiding principle: Don't use any computational techniques which are obviously cognitively implausible
Key Assumption: Adhering to well-established cognitive constraints may actually facilitate development by pushing it in directions that are more likely to be successful
Short-term costs associated with adherence to cognitive constraints may ultimately yield long-term benefits
You don't know what you're giving up when you adopt cognitively implausible techniques

Synthetic Teammate Project
Collaborative project between the Air Force Research Laboratory (AFRL) and the Cognitive Engineering Research Institute (CERI)
Applied research funds from AFRL/RHA
Basic research funds from AFOSR
Basic research funds from ONR
Using the Cognitive Engineering Research on Team Tasks (CERTT) Synthetic Task Environment (STE)
Developed with funds from AFOSR

CERTT Synthetic Task Environment
Team Goal: Fly UAV Reconnaissance Missions
PLO (takes pics)
DEMPC (plans route)
AVO (flies UAV)

UAV Reconnaissance Missions
AVO, DEMPC and PLO collaborate to complete a 40-minute reconnaissance mission
AVO must fly the UAV past a sequence of waypoints, which are determined by the DEMPC and communicated to the AVO as a flight plan
Waypoints may have altitude and airspeed restrictions and have an effective radius for fly-by
Route-based restrictions, waypoint type and effective radius must be communicated from DEMPC to AVO
Photo restrictions must be communicated from PLO to AVO
PLO must take pictures of target waypoints within the effective radius, but does not take pictures of entry and exit waypoints
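To make the flight plan information concrete, here is a minimal sketch of how a waypoint and its restrictions might be represented; the field names and example values are hypothetical, not taken from the CERTT STE.

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Waypoint:
        """One waypoint in the DEMPC's flight plan (hypothetical fields)."""
        name: str                    # e.g. "LVN", "H-AREA"
        kind: str                    # "entry", "target" or "exit"
        effective_radius: float      # miles within which the waypoint counts as flown by
        min_speed: Optional[float] = None   # knots; None means no airspeed restriction
        max_speed: Optional[float] = None
        min_alt: Optional[float] = None     # feet; None means no altitude restriction
        max_alt: Optional[float] = None

    def needs_photo(wp: Waypoint) -> bool:
        # Only target waypoints are photographed; entry and exit points are not.
        return wp.kind == "target"

    # Example: the second waypoint from the scripted dialog later in the briefing.
    h_area = Waypoint("H-AREA", "target", effective_radius=5.0,
                      min_speed=50, max_speed=200)
    assert needs_photo(h_area)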

Importance of Communication
Communication is critical to the success of reconnaissance missions
PLO and DEMPC must communicate restrictions to the AVO
DEMPC must communicate the flight plan to the AVO
When the unexpected happens (e.g. an unplanned waypoint is added to the mission, or a photo is missed), teammates must develop workarounds and communicate adjustments

AVO Workstation
[Screenshot of the AVO workstation, with instruments, warnings and a text chat window]
Example text chat:
DEMPC to AVO: LVN is our first waypoint
AVO to INTEL: Copy
INTEL to all: OK team, mission 1, good luck.
Are there any restrictions for LVN?

Synthetic Teammate Integration
[Diagram: Synthetic AVO Teammate taking the place of the human AVO]

Synthetic Teammate Integration
Standalone Mode
Using an agent development framework to provide "light-weight" implementations of the DEMPC and PLO for development purposes
Low cognitive fidelity, scripted agents
Eliminates the need to have humans acting as DEMPC and PLO during development
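A low-fidelity scripted agent of this kind can be little more than a lookup from recognized requests to canned replies. The sketch below (reusing the hypothetical Waypoint records from the earlier sketch) is illustrative only and is not the agent framework actually used on the project.

    class ScriptedDEMPC:
        """Minimal scripted stand-in for the DEMPC during development (hypothetical)."""

        def __init__(self, flight_plan):
            self.flight_plan = flight_plan   # list of Waypoint-like records
            self.next_index = 0

        def respond(self, message: str) -> str:
            text = message.lower()
            if "waypoint" in text:
                if self.next_index >= len(self.flight_plan):
                    return "That was the last waypoint."
                wp = self.flight_plan[self.next_index]
                self.next_index += 1
                return (f"The next waypoint is {wp.name}. "
                        f"The effective radius is {wp.effective_radius} miles.")
            if "restriction" in text:
                wp = self.flight_plan[max(self.next_index - 1, 0)]
                if wp.min_speed is None and wp.min_alt is None:
                    return "There are no airspeed or altitude restrictions."
                return f"The airspeed restriction is between {wp.min_speed} and {wp.max_speed} knots."
            return "Say again?"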

System Overview
[Block diagram with components: Text Chat Input and Output, Dialog Manager, Language Comprehension, Language Generation, Situation Model, Task Behavior Model, Visual Input, Motor Actions]

System Overview
[The same block diagram, repeated]
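To make the data flow in the diagram concrete, here is a rough sketch of how the components might be wired together; the class and method names are invented for illustration and do not correspond to the actual ACT-R implementation.

    class SyntheticAVO:
        """Illustrative wiring of the components named in the overview diagram."""

        def __init__(self, comprehension, generation, dialog_manager,
                     situation_model, task_model):
            self.comprehension = comprehension     # text chat input -> linguistic representation
            self.generation = generation           # message intention -> text chat output
            self.dialog_manager = dialog_manager   # decides what, if anything, to say next
            self.situation_model = situation_model # objects, situations, spatial content
            self.task_model = task_model           # flies the UAV: visual input -> motor actions

        def on_text_chat(self, message):
            parse = self.comprehension.process(message)
            self.situation_model.update_from_language(parse)
            intention = self.dialog_manager.decide_response(self.situation_model)
            if intention is not None:
                return self.generation.produce(intention)      # outgoing text chat
            return None

        def on_simulation_tick(self, visual_input):
            self.situation_model.update_from_environment(visual_input)
            return self.task_model.act(self.situation_model)   # motor actions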

Language Comprehension
Theory of Language Processing (Ball 2007…1991)
Activation, selection and integration of constructions corresponding to the linguistic input
Nearly deterministic, serial processing mechanism (integration) operating over a parallel, probabilistic (constraint-based) substrate (activation & selection)
Theory of Linguistic Representation (Ball 2007)
Focus on encoding of referential and relational meaning
Implemented in a Computational Cognitive Model
Using the ACT-R Cognitive Architecture
Adheres to well-established Cognitive Constraints

Cognitive Constraints
Incremental processing: word by word
Interactive processing: lexical, syntactic, semantic, pragmatic and task environment information used simultaneously to guide processing
Highly context sensitive, but limited to preceding context (no access to subsequent context)
Word recognition and part-of-speech determination integrated with higher-level syntactic, semantic and discourse processing (single pass)
Robust processing
Must handle ungrammatical input, incorrectly spelled words and non-sentential input
Minimize the number of "hard constraints" (e.g. whole-word matching) which can lead to failure when they aren't satisfied

Cognitive Constraints → Processing Mechanisms
Serial, nearly deterministic (controlled) processing operating over a parallel, probabilistic (automatic) substrate
The parallel, probabilistic substrate interactively integrates all contextual information, leading to selection of the best choice given the available local context at each incremental choice point
Soft constraints or biases
Once a choice is made, the processor proceeds serially and deterministically forward in real time
When a locally preferred choice turns out to be dispreferred in the wider context, a context-sensitive accommodation mechanism kicks in
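In code, the two-layer mechanism might look roughly like the loop below: a parallel, soft-constraint scoring step selects the best construction for each word, a serial step integrates it, and an accommodation step repairs the structure when a later word shows the earlier choice was dispreferred. This is an illustrative sketch, not the ACT-R model itself.

    def comprehend(words, lexicon, score, integrate, accommodate):
        """Serial, nearly deterministic processing over a parallel, probabilistic substrate.

        score(candidate, context)      -> activation given all soft constraints (parallel step)
        integrate(structure, choice)   -> new structure, or None if the choice cannot attach
        accommodate(structure, choice) -> repaired structure (e.g. function override or shift)
        """
        structure = None
        for word in words:
            candidates = lexicon.get(word, [])
            if not candidates:
                continue  # robust processing: skip unrecognized input rather than fail
            best = max(candidates, key=lambda c: score(c, structure))   # selection
            attached = integrate(structure, best)                       # serial integration
            structure = attached if attached is not None else accommodate(structure, best)
        return structure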

Language Processing in the Model
The following example is from the Language Processing Model:
"no airspeed or altitude restrictions"

“no”  object specifier  object referring expression = nominal construction

“airspeed”  object head no airspeed integration Tree structures created from output of model automatically with a tool for dynamic visualization of ACT-R declarative memory (Heiberg, Harris & Ball 2007)

“airspeed or altitude”  object head no airspeed or altitude override Accommodation of conjunction via function overriding

“airspeed or altitude”  modifier “restrictions”  object head no airspeed or altitude restrictions shift Appearance of parallel processing! airspeed or altitude = head vs. airspeed or altitude = mod Accommodation of new head via function shift

Computational Constraints
The processor needs to operate in near real-time to be functional
Large-scale systems that can't handle non-determinism efficiently (e.g. Context-Free Grammars) typically collapse under their own weight
Deterministic processing is computationally efficient
Probabilistic and parallel processing, often combined with a limited "spotlight", are alternative mechanisms for dealing with non-determinism
Parallel processing can be computationally explosive on serial hardware
Forced to use some "hard constraints" (e.g. first-letter match) in the word recognition subcomponent

Computational Constraints
No limited-domain assumption to simplify the model
CERTT text chat shows a broad range of grammatical constructions and thousands of lexical items
A relational database is integrated with ACT-R to support scaling up the model to a full mental lexicon
Plan to integrate a sizeable subset (> 15,000 lexical items) of the most common words in the WordNet lexicon (> 100,000 lexical items)
Can't ignore lexical ambiguity!
Study underway to compare performance of the model when Declarative Memory (DM) is stored in an external database vs. an internal Lisp process
The internal Lisp process is faster for a small DM, but can only handle 30% of WordNet before running out of memory!
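The trade-off described here (ACT-R declarative memory held in an external relational database rather than in the Lisp image) can be sketched with an ordinary embedded database; the schema below is hypothetical and far simpler than a WordNet-scale lexicon.

    import sqlite3

    # Hypothetical minimal lexicon table: one row per word sense.
    conn = sqlite3.connect("lexicon.db")
    conn.execute("""CREATE TABLE IF NOT EXISTS lexicon (
                        word TEXT, pos TEXT, sense TEXT, frequency INTEGER)""")
    conn.execute("CREATE INDEX IF NOT EXISTS idx_word ON lexicon(word)")
    conn.execute("INSERT INTO lexicon VALUES (?, ?, ?, ?)",
                 ("restriction", "noun", "a limiting rule or condition", 120))
    conn.commit()

    def lookup(word: str):
        """Retrieve all senses of a word from the external DB instead of holding
        the whole mental lexicon in the host process's memory."""
        return conn.execute(
            "SELECT word, pos, sense, frequency FROM lexicon WHERE word = ?",
            (word,)).fetchall()   # lexical ambiguity: a word may come back with many senses

    print(lookup("restriction"))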

Start with a Domain General Language Processing System
Contains the 2000 most common words in English and 2500 words in total
Handles a broad range of construction types:
Declarative, Imperative, Yes-No Question, Wh-Question
Intransitive, Transitive & Ditransitive Verbs, Verbs with Clausal Complements, Predicate Nominals, Predicate Adjectives and Predicate Prepositions
Specifier, Head, Complement, Pre- and Post-Head Modifier
Conjunctions of numerous functional categories
Relative Clauses, Wh-Clauses, Infinitive, -ing, -en & Bare Verb Clauses
Long-distance dependencies
Passive constructions

Start with a Domain General Language Processing System
Representations are in the spirit of the "Simpler Syntax" of Culicover & Jackendoff (2005), except that there are no purely syntactic representations
[Diagram: representation of "He is eager to please.", labeled with referring expressions, functional categories, predicates, semantic features, and a trace bound to the subject]

Extend to Handle Scripted Comm
AVO: DEMPC, please let me know the first waypoint!
DEMPC: The first waypoint is LVN. It's an entry point. There are no airspeed or altitude restrictions. The effective radius is 2.5 miles.
AVO: PLO, I'm heading towards LVN.
DEMPC: We're within the effective radius so go to the second waypoint.
AVO: Are there any altitude or airspeed restrictions for the second waypoint?
DEMPC: The second waypoint is H-AREA. It's a target. The airspeed restriction is between 50 and 200 knots. There is no altitude restriction. The effective radius is 5 miles.
PLO: AVO, please keep the altitude over 3000 feet for the photo!
PLO: I have a good photo of H-AREA.

Scripted Comm
Full sentences
Correct spelling
Explicit discourse acts
Still lots of variability:
Declarative sentences
Imperative sentences
Questions
Conjunctions

Extend to Handle Text Chat for a 40 Minute Mission – without editing!
PLO to AVO: avo-don't ever proceed from a target if i haven't taken the picture
AVO to PLO: ok -- keep me in the loop!
INTEL to all: ok team, mission 2
PLO to AVO: effective radiu
PLO to AVO: avo i need to be below 3000
AVO to PLO: copy, will 2000 do?
DEMPC to AVO: LVN is our 1st entry point with a radius of 2.5
AVO to PLO: speed?
AVO to DEMPC, PLO: 1 mile out/ 30 seconds
PLO to AVO: i don't have a speed for lvn so go faster
AVO to DEMPC, PLO: speed 340
PLO to AVO: avo i'll need to be above 3000 for h area
AVO to PLO: above 3000 copy -- can we proceed to h-area yet?

Extend to Handle Text Chat for a 40 Minute Mission – without editing!
PLO to AVO: lets get out of effective zone
DEMPC to AVO: Speed=50-200, Altitude=500-2000
AVO to DEMPC, PLO: wait -- my flight plan changed -- are we going to Z1?
PLO to AVO: can yougo faster yet or is it stll 200
DEMPC to AVO: no speed or alt. restrictions
PLO to AVO: avo i need to be above 3000 for s ste- go there when you think it would be most effective
PLO to AVO: avo 3000
DEMPC to AVO: YES to S-StE=Target
PLO to AVO: `avo get back within 5 miles of s ste
PLO to AVO: aavo dont slow down

Handle Communication with Unscripted Human DEMPC and PLO
Language varies significantly from team to team
Can't predict vocabulary requirements in advance
Teams adopt particular ways of communicating which can't be predicted in advance
Text becomes more cryptic as the mission continues
Discourse acts are often implicit

Word Recognition Subcomponent
The word recognition subcomponent is largely compatible with the E-Z Reader model of reading (cf. Reichle, Warren & McConnell 2009), with extensions to support higher-level language processing
A perceptual window is used for low-level processing of linguistic input
The model can "see" the space-delimited "word" in the focus of attention
The model can "see" up to the first 3 letters of the word in the right periphery following the space
The retrieved word is verified against the actual input
Consistent with the Activation-Verification model of word recognition (Paap et al. 1982)

Word Recognition
Word recognition is an interaction between low-level perceptual and higher-level cognitive processing
Perceptually identified letters, trigrams and space-delimited "words" spread activation to words (and multi-word units) in DM
The most highly activated word or multi-word unit consistent with the retrieval template is retrieved
It need not be a space-delimited "word"
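A toy version of the retrieval-plus-verification cycle is sketched below: perceived letters and trigrams spread activation to lexical entries, a hard first-letter constraint filters the candidates, the most active entry is retrieved, and the retrieved form is then checked against the actual input. The weights and names are invented for illustration.

    from typing import Optional

    def trigrams(s):
        return {s[i:i + 3] for i in range(len(s) - 2)}

    def recognize(perceived: str, lexicon: dict) -> Optional[str]:
        """lexicon maps word -> baseline activation (e.g. from frequency)."""
        candidates = {w: act for w, act in lexicon.items()
                      if w[0] == perceived[0]}        # hard constraint: first-letter match
        scores = {}
        for word, baseline in candidates.items():
            overlap = len(trigrams(word) & trigrams(perceived))       # perceptual evidence
            letters = len(set(word) & set(perceived))
            scores[word] = baseline + 1.0 * overlap + 0.2 * letters   # invented weights
        if not scores:
            return None
        best = max(scores, key=scores.get)            # most highly activated entry
        # Verification: accept an inexact match only if it is close to the actual input.
        return best if abs(len(best) - len(perceived)) <= 2 else None

    lexicon = {"restriction": 1.0, "restrictions": 0.9, "radius": 0.8}
    print(recognize("restrictons", lexicon))   # misspelled input still retrieves "restrictions"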

Generating Linguistic Representations
Incremental, interactive generation of linguistic representations which encode referential and relational meaning
[Diagram: referring expressions and relations in the representation of "He is eager to please."]

Mapping into the Situation Model
Referring expressions in the linguistic representation get mapped to objects and situations in the situation model
An indefinite object referring expression typically introduces a new object into the situation model
A definite object referring expression typically identifies an existing object either in the situation model or salient in the context
Situation referring expressions typically introduce a new relation into the situation model
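A bare-bones version of this mapping, with an invented record format, might look like the following: indefinite nominals create a new object in the situation model, definite nominals try to resolve against an existing (or contextually salient) object, and situation referring expressions would add relations in the same way.

    class SituationModel:
        """Toy situation model: objects plus relations among them (hypothetical format)."""

        def __init__(self):
            self.objects = []      # e.g. {"type": "waypoint", "name": "LVN"}
            self.relations = []    # e.g. ("restriction", obj, "altitude", 3000)

        def resolve(self, description):
            # Definite reference: find an existing object matching the description.
            for obj in reversed(self.objects):           # most recent first = most salient
                if all(obj.get(k) == v for k, v in description.items()):
                    return obj
            return None

        def map_referring_expression(self, refexpr):
            if refexpr["definiteness"] == "indefinite":
                obj = dict(refexpr["description"])       # introduce a new object
                self.objects.append(obj)
                return obj
            obj = self.resolve(refexpr["description"])   # identify an existing object
            if obj is None:                              # accommodation: not yet in the model
                obj = dict(refexpr["description"])
                self.objects.append(obj)
            return obj

    sm = SituationModel()
    sm.map_referring_expression({"definiteness": "indefinite",
                                 "description": {"type": "waypoint", "name": "LVN"}})
    print(sm.map_referring_expression({"definiteness": "definite",
                                       "description": {"type": "waypoint"}}))
    # resolves to the waypoint LVN introduced by the indefinite expression above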

System Overview
[The same block diagram as before, repeated]

Centrality of Situation Model
[Diagram: the Situation Model at the center, linked to Language Input, Language Output, Language Knowledge, Task Input, Task Behavior, World Knowledge and Domain Knowledge]

Situation Model
Situation Model (Zwaan & Radvansky, 1998)
Spatial-imaginal (and temporal) representation of the objects and situations described by linguistic expressions and encoded directly from the environment
Non-propositional (at least in part)
Non-textual
No available computational implementations
Provides grounding for linguistic representations
Integrates task environment and linguistic information

Abstract Concepts vs. Perceptually Grounded Language
[Side-by-side diagrams contrasting "The Prevailing View" and "An Emerging View" of how the word "pilot" in the real world maps into the mental box: via perception to an abstract Language of Thought symbol PILOT, vs. grounding in explicit (perceptual) and implicit (abstract) representations]

Abstract Concepts vs. Perceptually Grounded Language
[The same diagram, repeated]

Situation Model
Propositional Content
Planning to use Hobbs' theory of "ontological promiscuity" and his well-developed logical notation (translated into ACT-R chunks) to represent propositional content
The logical notation should be as close to English as possible
The logical notation should be syntactically simple to support inferencing
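In Hobbs' flat, "ontologically promiscuous" style every predication is reified, which maps naturally onto simple chunk-like records. A hypothetical rendering of "The airspeed restriction is between 50 and 200 knots" might look like this (all names invented for illustration):

    # Each proposition is a small, flat record: a predicate, an id for the
    # eventuality it reifies, and argument ids. Close to English, easy to chain.
    propositions = [
        {"pred": "restriction", "id": "e1", "args": ["x1"]},            # x1 is a restriction
        {"pred": "airspeed",    "id": "e2", "args": ["x1"]},            # ... an airspeed one
        {"pred": "between",     "id": "e3", "args": ["x1", "n50", "n200"]},
        {"pred": "knots",       "id": "e4", "args": ["n50"]},
        {"pred": "knots",       "id": "e5", "args": ["n200"]},
    ]

    def holds(pred, arg, props):
        """Trivial 'inference': does any proposition assert pred of arg?"""
        return any(p["pred"] == pred and arg in p["args"] for p in props)

    print(holds("airspeed", "x1", propositions))   # True: x1 is an airspeed restriction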

Situation Model
Spatial Content
Planning to use Scott Douglass' spatial module extension to ACT-R, which implements a matrix-like representation of spatial information
Discourse Content
Working on identification and representation of Discourse Acts, which are often only implied in the linguistic input
"I need to be above 3000 feet for the photo"
This is a request to increase the altitude of the UAV (the human is not actually in the UAV)
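A crude illustration of recovering an implicit discourse act: the PLO's statement about what they "need" is mapped onto an explicit request for the AVO to change altitude. The pattern and the act labels below are invented and far simpler than what the model actually requires.

    import re

    def interpret_discourse_act(speaker: str, utterance: str):
        """Map an implicit statement onto an explicit discourse act (toy pattern matching)."""
        m = re.search(r"\bneed to be (above|below) (\d+)", utterance.lower())
        if m and speaker == "PLO":
            direction, feet = m.group(1), int(m.group(2))
            # The PLO is not in the UAV, so this is really a request to the AVO.
            return {"act": "request",
                    "addressee": "AVO",
                    "content": ("increase" if direction == "above" else "decrease",
                                "altitude", feet)}
        return {"act": "statement", "content": utterance}

    print(interpret_discourse_act("PLO", "I need to be above 3000 feet for the photo"))
    # {'act': 'request', 'addressee': 'AVO', 'content': ('increase', 'altitude', 3000)}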

Empirical Validation
Experiment conducted with human subjects in conditions using 1) spoken language and 2) text chat to provide data for model development
AVO station moved into a separate room so the DEMPC and PLO don't see the AVO
The text chat condition showed a team performance effect similar to the spoken language condition
Goal is to conduct an experiment with the Synthetic AVO Teammate interacting with a human DEMPC and PLO

Questions?