1
Full-body motion analysis for animating expressive, socially-attuned agents
Elisabetta Bevacqua (Paris8), Ginevra Castellano (DIST), Maurizio Mancini (Paris8), Chris Peters (Paris8)
2
People involved
- DIST: full-body movement and gesture analysis
- Paris8: agent processing and behaviour
3
Overview
Scenario: an agent that senses, interprets and copies a range of full-body movements from a person in the real world.
System able to:
- acquire input from a video camera
- process information related to the expressivity of human movement
- generate copying behaviours
Towards a system that recognizes users' emotions from their movement, and an expressive agent that shows empathy towards them.
4
General framework
E. Bevacqua, A. Raouzaiou, C. Peters, G. Caridakis, K. Karpouzis, C. Pelachaud, M. Mancini, "Multimodal sensing, interpretation and copying of movements by a virtual agent", PIT 2006.
Encompasses the domains of:
– Sensing
– Interpretation
– Planning
– Generation
5
The application
From human motion to behaviour generation for expressive agents:
- full-body motion analysis of a dancer, in both the real and the virtual world
- the agent's response to expressive human motion descriptors: quantity of motion, contraction/expansion
- copying behaviour
6
Part 1. Sensing and analysis
Analysis in the real world:
– computer vision techniques
– facial analysis
– gesture analysis
– full-body analysis
Ambition: 'switchable' sensing
– real-world and virtual environments
– bridging the gap between ECAs and embedded virtual agents
7
Full-body analysis
Expressive cues from human full-body movement:
– real motion
– virtual motion
Global indicators
EyesWeb Expressive Gesture Processing Library*:
– MotionAnalysis: motion trackers (e.g., Lucas-Kanade), movement expressive cues (QoM, CI, ...)
– TrajectoryProcessing: processing of 2D (physical or abstract) trajectories (e.g., kinematics, directness, ...)
– SpaceAnalysis
* Camurri, A., Mazzarino, B. and Volpe, G., "Analysis of Expressive Gesture: The EyesWeb Expressive Gesture Processing Library", in A. Camurri, G. Volpe (Eds.), Gesture-based Communication in Human-Computer Interaction, LNAI 2915, Springer Verlag, 2004.
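For a concrete idea of the kind of point tracking the MotionAnalysis block relies on, here is a minimal sketch of pyramidal Lucas-Kanade tracking between two frames. OpenCV and the helper name track_lk are assumptions chosen for illustration; this is not the EyesWeb implementation.

import cv2

def track_lk(prev_gray, next_gray):
    """Track feature points between two grayscale frames with pyramidal
    Lucas-Kanade optical flow (illustrative stand-in for EyesWeb's trackers)."""
    # Pick corner-like points in the previous frame
    prev_pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=200,
                                       qualityLevel=0.01, minDistance=7)
    if prev_pts is None:
        return [], []
    # Estimate where those points moved in the next frame
    next_pts, status, _err = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray,
                                                      prev_pts, None)
    ok = status.flatten() == 1  # keep only successfully tracked points
    return prev_pts[ok], next_pts[ok]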
8
SMI and Quantity of Motion
Quantity of Motion (QoM) is an approximation of the amount of detected movement, based on Silhouette Motion Images (SMIs):
QoM = Area(SMI[t, n]) / Area(Silhouette[t])
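A minimal sketch of this formula on binary silhouette masks, assuming the SMI at frame t is the union of the previous n silhouettes with the pixels of the current silhouette removed; NumPy and the function name are illustrative choices, not the EyesWeb implementation.

import numpy as np

def quantity_of_motion(silhouettes, t, n):
    """QoM at frame t from a sequence of binary silhouette masks."""
    # Silhouette Motion Image: union of the last n silhouettes...
    accumulated = np.zeros_like(silhouettes[t], dtype=bool)
    for k in range(1, n + 1):
        accumulated |= silhouettes[t - k].astype(bool)
    # ...minus the area covered by the current silhouette
    smi = accumulated & ~silhouettes[t].astype(bool)
    silhouette_area = int(silhouettes[t].astype(bool).sum())
    # QoM = Area(SMI[t, n]) / Area(Silhouette[t])
    return smi.sum() / silhouette_area if silhouette_area else 0.0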
9
Contraction Index
A measure, ranging from 0 to 1, of how the dancer's body uses the space surrounding it.
It can be calculated using a technique based on the bounding region, i.e., the minimum rectangle surrounding the dancer's body: the algorithm compares the area covered by this rectangle with the area currently covered by the silhouette.
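A minimal sketch of this comparison on a binary silhouette mask, using the axis-aligned bounding rectangle; the function name and the silhouette-area / bounding-box-area ratio used here are assumptions, and the exact EyesWeb formulation may differ.

import numpy as np

def contraction_index(silhouette):
    """Contraction Index in [0, 1]: how tightly the body fills its bounding box.
    A contracted posture (limbs close to the torso) yields a high value."""
    ys, xs = np.nonzero(silhouette)
    if ys.size == 0:
        return 0.0
    # Minimum axis-aligned rectangle surrounding the body
    box_area = (ys.max() - ys.min() + 1) * (xs.max() - xs.min() + 1)
    # Compare the silhouette area with the bounding-rectangle area
    return float(ys.size) / float(box_area)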
10
Full-body analysis: examples in the real world and in the virtual environment (I)
Analysis of quantity of motion and contraction index with EyesWeb (G. Castellano, C. Peters, "Full-body analysis of real and virtual human motion for animating expressive agents", HUMAINE Presentation, Athens 2006)
Real world and virtual environment
Switchable sensing: analysis algorithms capable of
- handling input both from a real-world video stream and from virtual data
- providing similar results
(one possible structure for such a switchable source is sketched below)
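One way such switchable sensing could be structured: two silhouette providers sharing the same interface, one based on background subtraction of a real video stream and one fed by masks rendered from the virtual environment, both driving identical analysis code. The class names and the next_silhouette interface are hypothetical, not the system's actual architecture.

import cv2
import numpy as np

class CameraSilhouetteSource:
    """Binary silhouettes from a real-world video stream (background subtraction)."""
    def __init__(self, video_path):
        self.capture = cv2.VideoCapture(video_path)
        self.subtractor = cv2.createBackgroundSubtractorMOG2()

    def next_silhouette(self):
        ok, frame = self.capture.read()
        if not ok:
            return None
        mask = self.subtractor.apply(frame)
        return (mask > 127).astype(np.uint8)  # drop shadow pixels

class VirtualSilhouetteSource:
    """Binary silhouettes rendered directly by the virtual environment."""
    def __init__(self, rendered_masks):
        self._frames = iter(rendered_masks)

    def next_silhouette(self):
        return next(self._frames, None)

def analyse(source, analyser):
    """Run the same per-frame analysis (e.g. the QoM/CI helpers sketched above)
    whatever the silhouette provider is; only the source is switched."""
    results = []
    while (silhouette := source.next_silhouette()) is not None:
        results.append(analyser(silhouette))
    return results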
11
Full-body analysis: examples in the real world and in the virtual environment (II)
12
Comparison of metrics: contraction index
13
Comparison of metrics: quantity of movement
14
Part 2. Interpretation and Behaviour
What do we use the expressive cues for?
– Ideal goal: planning how to behave according to the user's quality of gesture
– In this work: copying the dancer's quality of gesture
15
Analysis of gesture data
- Full-body analysis of a dancer
- Manual segmentation of the dancer's gestures
- Mean value of the dancer's quantity of motion and contraction index for each gesture (see the sketch below)
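A small sketch of this aggregation step, assuming per-frame QoM and CI arrays and a list of manually annotated (start, end) frame indices for each gesture; all names are hypothetical.

import numpy as np

def per_gesture_means(qom, ci, segments):
    """Mean QoM and CI for each manually segmented gesture.
    `segments` is a list of (start_frame, end_frame) pairs."""
    means = []
    for start, end in segments:
        means.append((float(np.mean(qom[start:end])),
                      float(np.mean(ci[start:end]))))
    return means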
16
CI & QoM Copying
- Greta performs a single gesture type (same shape) but copies the dancer's quality of movement
- Greta uses expressivity parameters to modulate the quality of her gestures
- Mapping expressive cues to expressivity parameters:
  » CI → Spatial extent
  » QoM → Temporal extent
17
Parameters scaling
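The scaling itself is not reproduced in this transcript. Below is only a hedged sketch of one plausible scheme, linearly normalising each per-gesture mean cue into an expressivity parameter range assumed here to be [-1, 1]; the cue ranges, the linear form and the example numbers are assumptions, not the values from the slide.

def scale_to_expressivity(value, lo, hi, out_min=-1.0, out_max=1.0):
    """Linearly map a cue observed in [lo, hi] to an expressivity parameter,
    clamping values that fall outside [lo, hi]. Illustrative only."""
    value = min(max(value, lo), hi)
    return out_min + (value - lo) * (out_max - out_min) / (hi - lo)

# Hypothetical per-gesture means from the analysis step
mean_ci, mean_qom = 0.62, 0.081
spatial_extent = scale_to_expressivity(mean_ci, lo=0.0, hi=1.0)    # CI -> spatial extent
temporal_extent = scale_to_expressivity(mean_qom, lo=0.0, hi=0.2)  # QoM -> temporal extent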
18
Copying: an example
Video of the dancer moving and the virtual agent performing gestures that copy the quality of the dancer's motion.
DEMO!
19
Facial expressions (1)
- Show emotional facial expressions depending on the user's quality of movement
- Study the relation between quality of movement and emotion
- Example: linking QoM and CI to threat
20
Facial expressions (2)
Example: linking QoM and CI to empathy
21
Future
- Preliminary work
- Validation of both analysis and synthesis:
  – perceptive tests to study how users associate an emotional label with an expressive behaviour
- Towards a virtual agent able to recognize users' emotions from their movement and to show empathy
- Real-time system with continuous input