
1 Expressive Gestures for NAO. NAO TechDay, 13/06/2012, Paris. Le Quoc Anh - Catherine Pelachaud, CNRS, LTCI, Telecom-ParisTech, France

2 Objectives
Generate communicative gestures for the Nao robot:
- Integrated within an existing platform for virtual agents
- Nonverbal behaviors described symbolically
- Synchronization of gestures and speech
- Expressivity of gestures
GVLEX project (Gesture & Voice for Expressive Reading): the robot tells a story expressively.
Partners: LIMSI (linguistic aspects), Aldebaran (robotics), Acapela (speech synthesis), Telecom ParisTech (expressive gestures)

3 State of the art
Several recent initiatives:
- Salem and Kopp (2012): ASIMO robot, the MAX virtual agent framework, gesture description with MURML.
- Aaron Holroyd and Charles Rich (2011): Melvin robot, motion scripts with BML, simple gestures, feedback to synchronize gestures and speech.
- Ng-Thow-Hing et al. (2010): ASIMO robot, gesture selection, synchronization between gestures and speech.
- Nozawa et al. (2006): motion scripts with MPML-HP, HOAP-1 robot.
Our system focuses on expressivity and on the synchronization of gestures with speech, using a common platform for Greta and for Nao.

4 Steps
1. Build a library of gestures from a corpus of storytelling videos: the gesture shapes need not be identical between the human, the virtual agent, and the robot, but they have to convey the same meaning.
2. Use the GRETA system to generate gestures for Nao, following the SAIBA framework:
- Two representation languages: FML (Function Markup Language) and BML (Behavior Markup Language)
- Three separate modules: plan communicative intents, select and plan gestures, and realize gestures
(Diagram of the GRETA system: Text → Intent Planning → FML → Behavior Planning → BML → Behavior Realizer; a minimal pipeline sketch follows below.)
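The following is a minimal sketch of the three-module SAIBA-style pipeline described above. The function names (plan_intents, plan_behaviors, realize) are hypothetical placeholders, not the actual GRETA API; only the overall data flow (text → FML → BML → keyframes) comes from the slides.

```python
# Sketch of the SAIBA-style pipeline; names are illustrative, not GRETA's API.

def plan_intents(text):
    """Intent Planning: annotate the input text with communicative
    intentions and return them as an FML document (XML string)."""
    ...

def plan_behaviors(fml):
    """Behavior Planning: select gestures from the lexicon for each
    intention and schedule them, producing a BML document."""
    ...

def realize(bml):
    """Behavior Realizer: turn BML into timed keyframes that a
    robot-specific animation layer can execute."""
    ...

def generate(text):
    fml = plan_intents(text)    # FML: what to communicate
    bml = plan_behaviors(fml)   # BML: which gestures, and when
    return realize(bml)         # keyframes for the Nao realizer
```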

5 Global diagram
(Diagram: FML → gesture selection from the LEXICON → BML → planning of gesture durations, synchronization with speech, and modification of gesture expressivity → KEYFRAMES. A hypothetical keyframe sketch follows below.)
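To make the diagram concrete, here is a hypothetical illustration of what a keyframe produced at the end of this stage might carry: a time offset, a symbolic hand description as found in the lexicon, and the expressivity values applied to the gesture. The class and field names are assumptions for illustration, not the actual GRETA/Nao data model.

```python
# Illustrative keyframe/gesture containers; field names are assumptions.

from dataclasses import dataclass, field

@dataclass
class Keyframe:
    time: float                  # seconds from the start of the gesture
    hand_position: str           # symbolic, e.g. "YCC XCenter ZMiddle"
    hand_shape: str              # e.g. "OPENHAND"
    palm_orientation: str        # e.g. "INWARD"

@dataclass
class PlannedGesture:
    gesture_id: str                                   # lexicon entry name
    start: float                                      # start time from speech timing (s)
    stroke: float                                     # stroke time, aligned to the stressed word
    expressivity: dict = field(default_factory=dict)  # SPC, TMP, PWR, REP, FLD, STF
    keyframes: list = field(default_factory=list)     # list of Keyframe
```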

6 Gesture Animation Planning
Synchronization with speech:
- The stroke phase coincides with or precedes the emphasized words of the speech (McNeill, 1992).
- Gesture stroke timing is specified by synchronization points.
Expressivity of gestures: the same gesture prototype can yield different animations. Parameters:
- Spatial Extent (SPC): amplitude of movement
- Temporal Extent (TMP): speed of movement
- Power (PWR): acceleration of movement
- Repetition (REP): number of stroke repetitions
- Fluidity (FLD): smoothness and continuity
- Stiffness (STF): tension/flexibility
(A sketch of how such parameters can be applied is shown below.)
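As a rough sketch of the idea, the snippet below shows how two of the parameters listed above, SPC and TMP, could modulate an already-planned gesture: SPC scales the spatial amplitude of each keyframe around a rest posture, and TMP compresses or stretches its timing. It assumes keyframes already converted to joint angles; the scaling scheme, value ranges, and function name are assumptions, not the model actually used in GRETA/Nao.

```python
# Illustrative expressivity modulation; not the actual GRETA/Nao model.

def apply_expressivity(keyframes, rest_pose, spc=0.0, tmp=0.0):
    """keyframes: list of (time, {joint: angle}) pairs; spc and tmp in [-1, 1]."""
    amp = 1.0 + spc               # SPC > 0 widens the movement, SPC < 0 narrows it
    speed = max(0.1, 1.0 + tmp)   # TMP > 0 speeds the gesture up (shorter durations)
    result = []
    for t, pose in keyframes:
        scaled = {joint: rest_pose[joint] + amp * (angle - rest_pose[joint])
                  for joint, angle in pose.items()}
        result.append((t / speed, scaled))
    return result

# Example: a wider and slightly faster variant of a one-joint gesture
wide_fast = apply_expressivity(
    [(0.0, {"RElbowRoll": 0.5}), (0.6, {"RElbowRoll": 1.2})],
    rest_pose={"RElbowRoll": 0.4}, spc=0.5, tmp=0.3)
```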

7 Example
A BML fragment with Acapela voice and speed tags (the French text means "And the third one said sadly: I'm very hungry!"):
<speech id="s1" start="0.0"> \vce=speaker=Antoine\ \spd=180\ Et le troisième dit tristement: \vce=speaker=AntoineSad\ \spd=90\ \pau=200\ J'ai très faim! </speech>
<gesture id="beat_hungry" start="s1:tm1" end="start+1.5" stroke="0.5">
The gesture is expanded into timed keyframes (keyframe[1], keyframe[2], keyframe[3]) carrying joint values (0 0 -1.0 0 -0.3 -0.2) and symbolic hand descriptions (YCC XCenter Zmiddle OPENHAND INWARD; YLowerEP XCenter ZNear OPEN INWARD).

8 Compilation
The BML Realizer sends timed key positions to the robot using the available APIs, e.g. angleInterpolation(joints, values, times). The animation is obtained by interpolating between joint values with the robot's built-in proprietary procedures.
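For reference, this is what sending timed key positions looks like with the NAOqi Python SDK's ALMotion.angleInterpolation call mentioned above; the robot address, joint choice, and angle/time values are placeholders, and this is only a usage sketch, not the project's realizer code.

```python
# Sending timed key positions with NAOqi's ALMotion; values are placeholders.

from naoqi import ALProxy

motion = ALProxy("ALMotion", "nao.local", 9559)

names      = ["RShoulderPitch", "RElbowRoll"]
angleLists = [[1.2, 0.6, 1.4],           # target angles in radians, one list per joint
              [0.5, 1.0, 0.4]]
timeLists  = [[0.5, 1.0, 1.8],           # keyframe times in seconds, one per angle
              [0.5, 1.0, 1.8]]

# Blocking call: the robot interpolates between the key positions itself.
motion.angleInterpolation(names, angleLists, timeLists, True)
```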

9 Demo: « Trois petits morceaux de nuit » ("Three little pieces of night")

10 Conclusion
- A gesture model has been designed and implemented for Nao, taking into account the physical constraints of the robot.
- Common platform for both the virtual agent and the robot
- Expressivity model
Future work:
- Create gestures with different emotional colours and personal styles
- Validate the model through perceptual evaluations

11 Acknowledgment
This work has been funded by the ANR GVLEX project and is supported by members of the TSI laboratory, Telecom-ParisTech.

