1  A Common Ground for Virtual Humans: Using an Ontology in a Natural Language Oriented Virtual Human Architecture
   Arno Hartholt (ICT), Thomas Russ (ISI), David Traum (ICT), Eduard Hovy (ISI), Susan Robinson (ICT)
   05/30/2008
2  Overview
   – Virtual Humans Project
   – Challenge
   – Virtual Humans Architecture
   – Ontology
   – Conclusion
   – Future Work
3  Virtual Humans Project
4  Virtual Human Architecture
   – Reasoning and Emotion
   – Behavior Animation
   – Non-Verbal Behavior Sensing
   – Natural Language Dialogue
   – Non-Verbal Behavior Understanding
   – Natural Language Understanding
   – Knowledge Representation
   – Speech Processing
   – Non-Verbal Behavior Generation
   – Natural Language Generation
   – Conversational Minor Characters
5  Challenge – Knowledge Representation
   Module-specific representation:
   – How to communicate between modules?
   – How to make sure translations are correct?
   Uniform representation:
   – Common understanding
   – Reuse
   – Impoverished or rich?
   (Modules involved: NLU, Dialogue, NLG, Gesture, Task Modeling, Speech)
6  Our Process
   – Multi-phase project life cycle
   – First, choose individual representations suited to the state of the art in each area
   – Next, bring the languages closer together
7  Solution – Old and New Architecture
   Old architecture: each module (NL Generation, NL Understanding, Dialogue Management, Action Planning, Emotion Reasoning, Gesture Generation, Speech Recognition, Speech Synthesis) has its own ontology and its own interface.
   New architecture: the same modules share a common ontology and a central interface.
8  Knowledge Representation
   Static knowledge:
   – Offline authoring
   – Social / psychological behavior
   – Domain knowledge
   Dynamic knowledge:
   – Runtime
   – Current state / events
   – Communication through message protocol
9  Virtual Humans Architecture
   A human trainee says "I want to move the clinic" to a virtual human, an intelligent cognitive agent embedded in a real environment and rendered by a visual game engine.
   – Body: Speech Recognition, Vision Recognition, Smartbody Procedural Animation Planner, Speech Generation
   – Mind / Cognition: Natural Language Understanding, Vision Understanding, Dialog and Discourse Management, Task Planner, Emotion Model, Body and Affective State Management, Natural Language Generation, Non-Verbal Behavior Generator
   – Knowledge Management: Domain-Specific Knowledge, Domain-Independent Knowledge, World State Protocol
   Example input frame ("I want to move the clinic"): speech-act: statement; event: move; agent: captain; theme: clinic
   Example output frame ("We have no medical supplies"): speech-act: statement; polarity: negative; object: market; attribute: resource; value: medical-supplies
10  Focus
    – Task model and NLU
    – Domain-dependent knowledge
    – Strong interaction
    – Big authoring effort
11  Task Model – Tasks & States
12  Task Model – Plans
13  Task Model – Whole
14  Task Model – Task

    defTask doctor-moves {
      :agent       doctor
      :theme       clinic
      :event       move
      :source      here
      :destination there
      :instrument  locals
      :pre  {have-transport clinic-here}
      :add  {clinic-there}
      :del  {clinic-here}
    }
15  Task Model – State

    defState clinic-here location clinic here
      :belief      true
      :initialize  here
      :probability 0.8
      :concern     {doctor 20}
      :sim-object  *none*
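The defTask/defState entries above can be read as STRIPS-style operators: a task is applicable when its preconditions hold in the current world state, and executing it retracts the del-list and asserts the add-list. A minimal Python sketch, assuming a world state modeled as a set of state names; the `Task` class and its field names are illustrative, not the actual ICT code:

```python
# Hypothetical sketch of the task machinery the slides describe.
# State names (doctor-moves, clinic-here, ...) come from the slides;
# everything else is an illustrative assumption.
from dataclasses import dataclass, field

@dataclass
class Task:
    name: str
    pre: set = field(default_factory=set)   # states required before execution
    add: set = field(default_factory=set)   # states asserted afterwards
    dele: set = field(default_factory=set)  # states retracted ("del" is a Python keyword)

    def applicable(self, world: set) -> bool:
        return self.pre <= world

    def apply(self, world: set) -> set:
        assert self.applicable(world)
        return (world - self.dele) | self.add

doctor_moves = Task(
    name="doctor-moves",
    pre={"have-transport", "clinic-here"},
    add={"clinic-there"},
    dele={"clinic-here"},
)

world = {"have-transport", "clinic-here"}
world = doctor_moves.apply(world)
print(sorted(world))  # ['clinic-there', 'have-transport']
```

Once applied, the task is no longer applicable, since its del-list removed one of its own preconditions.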
16  Natural Language Understanding (NLU)

    "i want to move the clinic"

    .meta.id             mcsw
    .mood                declarative
    .sem.speechact.type  statement
    .sem.modal.desire    want
    .sem.type            event
    .sem.event           move
    .sem.theme           clinic
    .sem.source          here
    .sem.destination     there
17  Natural Language Understanding (NLU)
    – Framebank: a set of semantic-frame / utterance pairs
    – The NLU trains on the framebank
    – At runtime it either builds frames or retrieves them
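A toy sketch of the "retrieving frames" path: return the frame of the framebank entry whose utterance best overlaps the input. The real NLU trains a statistical model on the framebank; the bag-of-words overlap score and the framebank entries below are simplified assumptions adapted from the slides:

```python
# Hypothetical frame retrieval over a framebank of (utterance, frame) pairs.
# Entries are adapted from the slides; the scoring function is an assumption.
framebank = [
    ("i want to move the clinic",
     {"mood": "declarative", "sem.event": "move", "sem.theme": "clinic"}),
    ("we have no medical supplies",
     {"mood": "declarative", "sem.polarity": "negative",
      "sem.attribute": "resource"}),
]

def retrieve(utterance: str) -> dict:
    """Return the frame whose stored utterance shares the most words."""
    words = set(utterance.lower().split())
    def score(pair):
        text, _ = pair
        return len(words & set(text.split()))
    _, frame = max(framebank, key=score)
    return frame

print(retrieve("please move the clinic now")["sem.event"])  # move
```

A novel utterance close to a training one retrieves the right frame; utterances far from every entry would need the frame-building path instead.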
18  Frames Ontology
    The ontology's core concepts are used by module-specific knowledge; exporters feed each module (Module 1, Module 2, ...).
19  Frames Ontology – Core Concepts

    Task:
      event        move
      agent        captain-kirk
      theme        clinic
      source       market
      destination  downtown

    State:
      object     clinic
      attribute  location
      value      downtown
20  Frames Ontology – Task Model

    Task:
      event        move
      agent        captain-kirk
      theme        clinic
      source       market
      destination  downtown
      pre:  clinic-location-market
      del:  clinic-location-market
      add:  clinic-location-downtown

    State:
      object     clinic
      attribute  location
      value      downtown
      belief     false
      concern    {doctor-perez 10}
21  Frames Ontology – Natural Language

    mood                  declarative
    sem.speechact.type    statement
    sem.modality.deontic  must
    sem.polarity          positive
    sem.type              event
    sem.event             move
    sem.agent             captain-kirk
    sem.theme             clinic
    sem.source            market
    sem.destination       downtown
22  OWL
    – Hierarchy
    – Assertions on all levels
    – Automatic classification
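A toy illustration of what classification over the hierarchy buys: given subclass assertions, every ancestor of a class can be inferred by following the links transitively. A real OWL reasoner handles far richer axioms (property restrictions, equivalences, automatic placement of new classes); this sketch only walks asserted subclass links, and the class names are assumptions:

```python
# Hypothetical subclass assertions; a chain of links stands in for the
# OWL hierarchy the slides mention. Class names are illustrative.
subclass_of = {
    "move-clinic-action": "move-action",
    "move-action": "action",
    "action": "event",
}

def ancestors(cls: str) -> list:
    """All superclasses of cls, nearest first, via transitive closure."""
    out = []
    while cls in subclass_of:
        cls = subclass_of[cls]
        out.append(cls)
    return out

print(ancestors("move-clinic-action"))  # ['move-action', 'action', 'event']
```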
23  OWL Ontology
    Scenario-independent ontology:
    – General world knowledge (generic actions, objects, ...)
    – Linguistic structures
    – Generic dialogue items
    Scenario families (1, 2, 3):
    – Scenario actor classes
    – Scenario action templates (generic preconditions, effects, ...)
    – Propositions for the scenario
    Individual scenarios (1-1, 1-2, 2-1, 2-2, 3-1):
    – Specific actors, locations, etc.
    – Sentences for the scenario
    – Actor positions, attitudes, etc.
    Configuration file:
    – Simulation initialization
24  Ontology Action Templates
    – Templates define the semantics of an action
    – Example: one effect of a move action is that the destination will contain the thing being moved (the 'theme')
    – The hierarchy allows inheritance of semantics
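The inheritance of semantics described above can be sketched with an ordinary class hierarchy: a generic move template contributes the effect that the destination contains the theme, and more specific actions inherit and extend it. The class and predicate names below are illustrative assumptions, not the ontology's actual vocabulary:

```python
# Hypothetical action-template hierarchy; effects are (predicate, ...) tuples.
class MoveAction:
    def __init__(self, agent, theme, source, destination):
        self.agent, self.theme = agent, theme
        self.source, self.destination = source, destination

    def effects(self):
        # Generic move semantics: the destination gains the theme,
        # the source loses it.
        return [("contains", self.destination, self.theme),
                ("not-contains", self.source, self.theme)]

class MoveByVehicle(MoveAction):
    def __init__(self, agent, theme, source, destination, vehicle):
        super().__init__(agent, theme, source, destination)
        self.vehicle = vehicle

    def effects(self):
        # Inherit the generic move semantics, then specialize.
        return super().effects() + [
            ("located-at", self.vehicle, self.destination)]

m = MoveByVehicle("captain-kirk", "clinic", "market", "downtown", "truck-1")
print(("contains", "downtown", "clinic") in m.effects())  # True
```

The specialized action keeps every inherited effect, so authoring a new action type only requires stating what it adds.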
25  Creating a New Move Action
    1. Select class
    2. Create instance
    3. Add details: agent, theme, source, destination
    4. Inherit details: preconditions, effects, etc. are added automatically (and can be edited)
26  Creating a New Move Utterance
    1. Select class
    2. Create instance
    3. Add details: semantic object, speech act, modality, addressee
    4. Add utterance(s)
27  Conclusion
    – Multi-phase life cycle allows quick prototyping
    Ontology pros:
    – Module synchronization
    – Formal specification
    – Reuse of knowledge
    – Reference to data, rather than copies
    – Common user interface
    Ontology cons:
    – Extra learning curve
    – Some changes are more difficult
    – User interface not ideal
28  Future Work
    – Extend natural language capabilities (NLG, lexicon)
    – Integrate other modules
    – Complete consistency and integrity testing routines
    – Explore using existing resources
    – We want to enable non-computer-science people to author new scenarios, so we are investigating the optimal point in the trade-off between:
      – Programming for non-experts: reduced, simple scripting languages, but limited in functionality
      – Power tools for experts: high functionality, but requiring considerable expertise
29  Questions
    Thank you