
Toward a Unified Scripting Language: Lessons Learned from Developing CML and AML
Soft Computing Laboratory, Yonsei University
October 1, 2004

1. Introduction
Two approaches to specifying scripting languages for character animation:
– Character Markup Language (CML): a top-down approach that defines high-level attributes for character personality, emotion, and behavior, which are integrated to form the specification of a synchronized animation script.
– Avatar Markup Language (AML): a bottom-up approach in which the language provides a generic mechanism for the selection and synchronized merging of animations.
Powerful yet generic scripting languages are needed to bridge the gap between behavior generation and animation tools.

2. Scripting with the Character Markup Language
Visual Behavior Definition
– Factors that govern character behavior:
  - The actions an agent needs to perform in a session to achieve given tasks
  - The agent's personality and current mental state
  - The role the agent is given
– The behaviors are defined as XML tags
[Figure: high-level behavior tags pass through the CML Processor, which maps them to the appropriate action point parameters and produces an animation script.]
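
The XML examples from the original slide are not reproduced here; as a rough illustration only, a CML-style high-level behavior tag might look like the sketch below. The behavior element and its attributes are assumptions, while move-to and point-to are base motions named on the next slide.

    <!-- Hypothetical CML-style fragment; the behavior element and its
         attributes are illustrative assumptions. -->
    <behavior character="agent1">
      <move-to object="door"/>    <!-- high-level request the CML Processor maps -->
      <point-to object="phone"/>  <!-- onto low-level action point parameters -->
    </behavior>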

Classification of Motion
– Derived from Blumberg and Russell's architecture (a three-layer structure: geometry, motor, and behavior systems)
– The initial set of CML base motions is classified by the goal of the motion as follows (a sketch using these names appears after the list):
  - Movement defines motions that require the rotation or movement of a character from one position to another. Positions are defined by exact coordinates, an object position, or a character position. (move-to, turn-to)
  - Pointing defines a pointing gesture toward a coordinate, object, or character. (point-to)
  - Grasping defines motions that require the character to hold, throw, or come in contact with an object or another character. (grasp, throw, touch)
  - Gaze defines movements related to the head and eyes. (gaze, track, blink, look-to, look-at)
  - Gesture includes motions that represent known gestures, such as a hand movement to convey an acknowledgment, a wave, etc. (gesture-at)
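
A minimal sketch grouping these base motions is given below; only the motion names come from the slide, while the surrounding element and attribute syntax is assumed.

    <!-- Illustrative grouping of CML base motions by goal; attributes are assumed. -->
    <character name="guide">
      <move-to x="2.0" y="0.0" z="1.5"/>   <!-- Movement: exact coordinates -->
      <turn-to object="kiosk"/>            <!-- Movement: object position -->
      <point-to character="visitor"/>      <!-- Pointing -->
      <grasp object="phone"/>              <!-- Grasping -->
      <look-at character="visitor"/>       <!-- Gaze -->
      <gesture-at type="wave"/>            <!-- Gesture -->
    </character>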

CML Specification
– Animated character behavior is expressed through the interpretation of XML Schema structures
– The language contains:
  - Low-level tags (specific character gesture representations defining movements, intensities, and explicit expression)
  - High-level tags (commonly used combinations of these low-level tags)
– Synchronization between the audio and visual modalities is achieved through the use of SMIL
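
Assuming the SMIL constructs referred to are the standard par (parallel) and seq (sequential) timing containers, audio-visual synchronization could be expressed roughly as follows; utterance is a hypothetical tag for the spoken text.

    <!-- Sketch assuming SMIL-style par/seq timing containers;
         utterance is a hypothetical tag. -->
    <seq>
      <par>                              <!-- speech and gesture play together -->
        <utterance>Hello, welcome!</utterance>
        <gesture-at type="wave"/>
      </par>
      <move-to object="exhibit"/>        <!-- then the character walks over -->
    </seq>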

CML Representation Language
– Head Gesture Taxonomy: Symbolic gestures, Iconic gestures, Deictic gestures
– Hand Gesture Taxonomy: Posture, Motion, Orientation, 'Gestlets', Fingers
– Body Gesture Taxonomy: Natural, Relax, Tense, Iconic, Incline
– Emotions (based on the OCC theory of emotion): Class, Valence, Subject, Target, Intensity, Time-stamp, Origin
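
As a rough illustration, the emotion attributes listed above might appear in markup as in the hypothetical fragment below; only the attribute names come from the slide, the element name and value formats are assumed.

    <!-- Hypothetical emotion annotation; element name and value formats
         are assumed, attribute names are from the slide. -->
    <emotion class="joy" valence="positive" subject="agent1" target="user"
             intensity="0.8" time-stamp="00:00:02" origin="dialogue-event"/>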

CML Scripting Language
– Face Animation Scripting
  - Head Movement:
    - tilt: a slanting movement, often with subtle or superficial neck movement
    - turn: requires a more pronounced movement of the neck

– Head Gesture: Deictic, Symbolic
– Face Movement and Gesture: defines the elements and behaviors for specific parts of the face, including the Brow, Gaze, and Mouth
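
A purely illustrative sketch of how head movement (tilt, turn) and the face parts above might be scripted is shown below; only those words come from the slides, everything else is assumed.

    <!-- Illustrative face animation fragment; only tilt, turn, brow, gaze,
         and mouth come from the slides, the remaining syntax is assumed. -->
    <face-animation>
      <head movement="tilt" direction="left"/>
      <head movement="turn" direction="right"/>
      <brow action="raise" intensity="0.5"/>
      <gaze target="user"/>
      <mouth expression="smile"/>
    </face-animation>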

– Body Animation Scripting
  - A set of defined body elements, which are low-level tags based on MPEG-4 BAPs
  - A set of high-level tags representing body parts, each grouped from a set of respective low-level tags
  - The base elements are:
    - Movement (moving, bending, turning)
    - Gesture: defines body postures that include motions representing common Iconic, Symbolic, or Deictic body gestures
    - Posture (expression): defines a set of high-level tags representing general body gestures (Natural, Relax, Tense, Incline)
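
A hypothetical body fragment in the same spirit, using the slide's Movement, Gesture, and Posture vocabulary with all other syntax assumed, might read:

    <!-- Hypothetical body animation fragment echoing the slide's
         Movement/Gesture/Posture vocabulary; attributes are assumed. -->
    <body-animation>
      <movement type="turning" angle="90"/>
      <gesture class="deictic" target="exhibit"/>
      <posture expression="relax"/>
    </body-animation>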

[Figure: the CML generation and animation pipeline. Inputs (state and context, an emotion signal, behavior/action, and speech text), together with the PDTD, EDTD, BDB, and FDB resources, feed the CML Processor, which performs action composition, utterance composition, and synchronisation to generate a CML script (CML generation); the CML Decoder then turns the script into FAP, BAP, and audio/TTS streams that drive the face and body (animation generation).]

3. Avatar Markup Language
Objective: to design and develop a full end-to-end MPEG-4 multimedia framework supporting, amongst other features, 3D avatar-based multi-user chat rooms and autonomous synthetic characters.
Three components:
– A database of basic facial and body animation units (which can be extended by third parties)
– A rendering system capable of merging multiple face and body animation units and text-to-speech input in real time
– A high-level scripting language designed to allow animators to specify which animations to use, together with timing, priority, and decay information

AML Specification
– Flexibility: as many Expression Tracks as required, each containing as many Expressions as required
[Slide excerpt (markup elided): each Expression carries a start time in mm:ss:mmm format and a "name".]

Toward a Unified Scripting Language Avatar Markup Language … mm:ss:mmm|autosync|autoafter normal|slow|fast float, 0 to fn integer, 0 to n … … float, target’s X coordinate in meters float, target’s Y coordinate in meters float, target’s Z coordinate in meters … …

4. CML and AML Applied
CML
– Sample CML: Happy Move and Point script
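
As an illustrative guess only (not the original sample), a happy move-and-point script in CML's vocabulary could combine the base motions and emotion attributes named earlier, with the remaining syntax assumed:

    <!-- Illustrative "happy move and point" fragment, not the original sample;
         move-to, point-to, and the emotion attributes come from earlier slides,
         everything else is assumed. -->
    <character name="agent1">
      <emotion class="joy" valence="positive" intensity="0.8"/>
      <move-to object="table"/>
      <point-to object="gift"/>
    </character>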

AML
– Sample AML: Walk and Point
[Slide excerpt (markup elided): the sample speaks "Let me show you another phone over here" and includes the values 25 and 00:06:000, the resource paths .\Expressions\ and .\Speech\, the start time 00:00:800, and the expression file smile.ex.]
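
A speculative reassembly of this sample, keeping only the values visible on the slide and assuming all element names, might read:

    <!-- Speculative reconstruction; element names are assumed, the literal
         values are those that survive on the slide. -->
    <AML>
      <Settings>
        <ExpressionPath>.\Expressions\</ExpressionPath>
        <SpeechPath>.\Speech\</SpeechPath>
      </Settings>
      <Speech>
        <Text>Let me show you another phone over here</Text>
        <Duration>00:06:000</Duration>
      </Speech>
      <ExpressionTrack>
        <Expression name="smile.ex">
          <StartTime>00:00:800</StartTime>
        </Expression>
      </ExpressionTrack>
    </AML>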

5. Discussion and Lessons Learned
– A comparison of CML and AML

Towards a Unified Language
– Objectives of a Unified Language:
  - Define a framework to decouple embodied agent animation tools from the underlying affect and planning engines
  - Establish a formal specification for unified and consistent interpretation
  - Provide for modular development
  - Create a markup language based on XML (providing semantic and scripting annotations)
– Language Requirements:
  - High level
  - Usability
  - Extensibility
  - Parameterized
  - Synchronization support
  - Consistency
  - Domain

Conclusion
– There is a trade-off to be made between higher levels of control (high granularity) and the complexity of the resulting language.
– Striving toward a unified scripting and representation language may be the catalyst for much-needed agreement.