1
Movement Imitation: Linking Perception and Action
Advanced Topics in Computer Vision, 2004
Lior Noy
Department of Computer Science and Applied Mathematics, Weizmann Institute of Science
2
Movement Imitation - Example
3
Imitation: Linking Perception and Action
Perception maps from the realm of raw data (pixels) up to the semantic world (objects, actions); action maps back down from semantics to raw data (muscle activations); imitation links the two.
4
Outline
1. Movement Imitation
2. Programming by Demonstration
3. Robotic Movement Imitation: Primitives-Based Approach (Mataric'); Real-Time Tracking ("mirror game") (Ude et al.)
4. Direct Perception and Imitation
5
A Variety of Probes into Imitation
Imitation is probed from many directions: developmental psychology, ethology, cognitive psychology, neurophysiology, human brain imaging, and robotics.
6
Evaluating Imitation: Robot Following in a Hilly Environment
7
Evaluating Imitation
8
Evaluating Imitation
9
Programming By Demonstration (PbD)
Methods to program a robot:
Human programming
Reinforcement learning
Programming by Demonstration
10
Programming By Demonstration (PbD) Applications
Navigation
Locomotion
Playing air hockey
Manipulating blocks
Balancing a pole
Hitting a tennis serve
Grasping unfamiliar objects
Imitating dancing movements
Making the problem easier:
A specific task
Limited visual input (e.g., only one model)
A helpful teacher
11
PbD – Application Example
The “Golden Maze”
12
PbD – Application Example
Playing Air-Hockey
13
PbD – Application Example
Box Manipulations
14
Three Approaches for PbD
Symbolic; control-based; statistical
15
Symbolic Approach for PbD
Analyze observed actions in terms of sub-goals.
Match the actions needed to fulfill these sub-goals.
Create a symbolic description of the environment ("object A is above object B").
Learn a series of symbolic if-then rules ("if object A is above object B then grasp-object[object B]").
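A minimal sketch of this idea, with hypothetical predicate and action names (not taken from any specific paper): the scene is summarized by symbolic relations, and learned if-then rules fire on those relations.

```python
# Sketch of the symbolic-PbD idea: describe the scene with predicates,
# then replay learned if-then rules. Predicate and action names
# ("above", "grasp-object") are illustrative assumptions.

def describe_scene(scene):
    """Turn raw object poses into symbolic relations like ('above', 'A', 'B')."""
    facts = set()
    for a in scene:
        for b in scene:
            if a is not b and abs(a["x"] - b["x"]) < 0.05 and a["z"] > b["z"]:
                facts.add(("above", a["name"], b["name"]))
    return facts

# Rules learned from a demonstration: if a relation holds, apply an action.
rules = [
    (("above", "A", "B"), ("grasp-object", "B")),
]

def plan(scene):
    facts = describe_scene(scene)
    return [action for condition, action in rules if condition in facts]

scene = [{"name": "A", "x": 0.10, "z": 0.30},
         {"name": "B", "x": 0.11, "z": 0.10}]
print(plan(scene))   # [('grasp-object', 'B')]
```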
16
Example: Symbolic Approach for PbD
(Kuniyoshi et al., 1994) … but how do you symbolically describe "hitting a tennis serve"?
17
Control-Based Approach for PbD
No symbolic parsing of perceived actions.
Assume a pre-defined control policy.
Acquire the needed parameters from observation.
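A minimal sketch of the control-based idea, under assumptions of my own (a critically damped spring-like policy pulling toward a goal; made-up gain values): the policy's form is fixed in advance, and only its parameters (goal, duration) are read off the demonstration.

```python
# Sketch: the policy form is fixed, only its parameters come from the
# demonstration. Not the method of any specific paper, just an illustration.
import numpy as np

def fit_policy_params(times, positions):
    """Estimate goal and movement duration from an observed trajectory."""
    g = positions[-1]                  # goal = where the teacher ended up
    tau = times[-1] - times[0]         # duration of the demonstration
    return g, tau

def run_policy(x0, g, tau, k=25.0, dt=0.01):
    """Replay: integrate x'' = k*(g - x)/tau**2 - 2*sqrt(k)*x'/tau."""
    x, v, traj = x0, 0.0, [x0]
    for _ in range(int(2 * tau / dt)):
        a = k * (g - x) / tau**2 - 2.0 * np.sqrt(k) * v / tau
        v += a * dt
        x += v * dt
        traj.append(x)
    return np.array(traj)

demo_t = np.linspace(0.0, 1.0, 50)
demo_x = 0.4 * (1 - np.cos(np.pi * demo_t)) / 2     # observed hand position
g, tau = fit_policy_params(demo_t, demo_x)
print(run_policy(demo_x[0], g, tau)[-1])            # ends near the goal 0.4
```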
18
Control-Based Approach Inverse Models
Sometimes assumes known inverse models (converting a desired effect into the needed commands).
Forward models map motor commands to joint angles to end-effector position; inverse models map a desired end-effector position back to joint angles and motor commands.
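As a concrete illustration (a toy planar 2-link arm with made-up link lengths, not any robot from the papers), a forward model and a simple numerically inverted model could look like this:

```python
# Forward vs. inverse models for a planar 2-link arm (illustrative values).
import numpy as np

L1, L2 = 0.3, 0.25   # link lengths in meters (assumed)

def forward(q):
    """Joint angles (q1, q2) -> end-effector position (x, y)."""
    q1, q2 = q
    x = L1 * np.cos(q1) + L2 * np.cos(q1 + q2)
    y = L1 * np.sin(q1) + L2 * np.sin(q1 + q2)
    return np.array([x, y])

def inverse(target, q0=np.array([0.3, 0.3]), iters=200, step=0.5):
    """Find joint angles reaching `target` via an iterative pseudo-inverse of a numerical Jacobian."""
    q = q0.astype(float)
    for _ in range(iters):
        err = target - forward(q)
        J = np.column_stack([(forward(q + d) - forward(q)) / 1e-5
                             for d in (np.array([1e-5, 0]), np.array([0, 1e-5]))])
        q += step * np.linalg.pinv(J) @ err
    return q

q = inverse(np.array([0.35, 0.25]))
print(forward(q))   # close to the requested (0.35, 0.25)
```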
19
Example: Control-Based Approach for PbD
(Schaal, 2003) Tennis movie
20
Statistical Approach for PbD
No prior assumption about the control policy used.
Statistically match perception and action.
Can this be done? More on this later…
21
Example: Statistical Approach for PbD
PCA (Asada, 1995)
22
Example: Statistical Approach for PbD
Learning:
1. Perform a random action A(i).
2. Record the resulting optical flow f(i).
3. Compute its principal components p1(i), p2(i).
4. Learn the connection A(i) ↔ {p1(i), p2(i)}.
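A minimal sketch of this loop, assuming a simulated flow field in place of real camera data and a plain nearest-neighbour lookup as the learned connection (both are my simplifications, not Asada's implementation):

```python
# Pair random actions with the leading principal components of the flow
# they produce, then act by nearest-neighbour lookup in PC space.
import numpy as np

rng = np.random.default_rng(0)

def observe_flow(action, n_pixels=100):
    """Stand-in for the optical-flow field produced by an action."""
    direction = np.array([np.cos(action), np.sin(action)])
    return np.outer(np.ones(n_pixels), direction).ravel() + 0.05 * rng.normal(size=2 * n_pixels)

# 1-2. perform random actions and record the resulting flow fields
actions = rng.uniform(0, 2 * np.pi, size=50)
flows = np.stack([observe_flow(a) for a in actions])

# 3. PCA: project each flow field onto its two leading principal components
flows_centered = flows - flows.mean(axis=0)
_, _, Vt = np.linalg.svd(flows_centered, full_matrices=False)
codes = flows_centered @ Vt[:2].T            # (p1(i), p2(i)) per trial

# 4. learn the connection A(i) <-> (p1(i), p2(i)); here a nearest neighbour
def action_for_flow(flow):
    code = (flow - flows.mean(axis=0)) @ Vt[:2].T
    return actions[np.argmin(np.linalg.norm(codes - code, axis=1))]

print(action_for_flow(observe_flow(1.0)))    # roughly 1.0
```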
23
Outline
1. Movement Imitation
2. Programming by Demonstration
3. Robotic Movement Imitation: Primitives-Based Approach (Mataric'); Real-Time Tracking ("mirror game") (Ude et al.)
4. Direct Perception and Imitation
24
PbD for Movement Imitation, Precursor 1: Cartoon Retargeting
(Bregler et al., 2002)
25
Cartoon Retargeting: Two Types of Deformations
Affine deformation; key-shape deformation.
26
Cartoon Retargeting - Results
More on: "Animating Human Motion", speakers: Simon Adar, Yoram Atir
27
PbD for Movement Imitation, Precursor 2: Guided Movement Synthesis
(Zelnik-Manor, Hassner & Irani, 2004)
28
Event-Based Analysis of Video
(Zelnik-Manor & Irani, 2001)
29
Guided Movement Synthesis (a.k.a. “Movement Imitation”?)
30
PbD for Movement Imitation, Precursor 2: Movement Synthesis
31
PbD for Movement Imitation Case Study: Primitive-Based Approach
The problem: how to convert visual input (X1, …, Xn) into motor output (J1, …, Jm)?
A possible solution: use a common, sparse representation, sensory-motor primitives (movement primitive 1, movement primitive 2, …, movement primitive K), as an intermediate layer between the two.
… but what primitives to use?
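A minimal sketch of the architecture, with made-up primitive names and a toy joint mapping (illustration only): visual input is compressed into a primitive label plus a few parameters, and each primitive knows how to generate joint motion.

```python
# Primitives as the shared bottleneck between vision and motor control.
import numpy as np

def recognize_primitive(endpoint_traj):
    """Very crude visual front-end: straight-ish trajectories are 'reach'."""
    chord = np.linalg.norm(endpoint_traj[-1] - endpoint_traj[0])
    path = np.sum(np.linalg.norm(np.diff(endpoint_traj, axis=0), axis=1))
    return ("reach", endpoint_traj[-1]) if path < 1.1 * chord else ("wave", endpoint_traj[-1])

def execute_primitive(name, target):
    """Motor back-end: each primitive maps its parameters to joint angles."""
    if name == "reach":
        return np.array([np.arctan2(target[1], target[0]), 0.2])   # toy joint command
    return np.array([0.0, np.pi / 4])

traj = np.linspace([0, 0], [0.3, 0.2], 20)      # observed end-point path
print(execute_primitive(*recognize_primitive(traj)))
```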
32
Movement Imitation Using Sensory-Motor Primitives
Sequences of actions that accomplish a complete goal-directed behavior. Examples: 1. Move the hand in a "straight line" or "parabola" (Felix…). 2. Perform "grasping" or "a tennis serve".
33
Imitation Learning Using Sensory-Motor Primitives
(Schaal, Ijspeert & Billard, 2003)
34
Inspiration for Using Sensory-Motor Primitives
Evidence for: coding of goal-directed actions; shared representations of perception and action. Example: mirror neurons (Rizzolatti et al., 2002; Gallese et al., 1996).
35
Movement Imitation Using Sensory-Motor Primitives
General principles (Mataric', 1998):
Selective attention focusing on end-point movements.
Sensory-motor primitives as an integrative representation.
Learning new skills as compositions of primitives.
Experimental test-beds.
Inverse models transform end-point movements into joint movements; eye-tracking experiments show that people focus on end-points during imitation.
36
What Sensory-Motor Primitives to Use?
Innate: pre-defined control policies (e.g., central pattern generators).
Learned: unsupervised clustering (using PCA, Isomap).
Primitives mediate between the end-point space ("visual space") and the joint space ("motor space"): inverse models transform end-point movements into joint movements, and eye-tracking experiments show that people focus on end-points during imitation.
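A minimal sketch of the "learned" route, under assumptions of my own (toy trajectory data, PCA instead of Isomap, two clusters): trajectory segments are reduced in dimension and clustered, and each cluster centre serves as a discovered primitive.

```python
# Learning primitives by PCA + clustering of trajectory segments.
import numpy as np

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 30)

# toy dataset: noisy "reach" segments and noisy "circle" segments
reaches = [np.column_stack([t, t]) + 0.02 * rng.normal(size=(30, 2)) for _ in range(20)]
circles = [np.column_stack([np.cos(2 * np.pi * t), np.sin(2 * np.pi * t)])
           + 0.02 * rng.normal(size=(30, 2)) for _ in range(20)]
segments = np.stack([s.ravel() for s in reaches + circles])      # (40, 60)

# PCA down to 3 dimensions
X = segments - segments.mean(axis=0)
_, _, Vt = np.linalg.svd(X, full_matrices=False)
Z = X @ Vt[:3].T

# plain k-means with K = 2 primitives
centers = Z[rng.choice(len(Z), 2, replace=False)]
for _ in range(20):
    labels = np.argmin(np.linalg.norm(Z[:, None] - centers[None], axis=2), axis=1)
    centers = np.stack([Z[labels == k].mean(axis=0) if np.any(labels == k) else centers[k]
                        for k in range(2)])

print(labels[:20], labels[20:])   # the two movement types separate into two clusters
```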
37
"Experiment in Imitation Using Perceptuo-Motor Primitives" (Weber, Jenkins & Mataric', 2001)
Extract hand (end-point) movements. Perform vector quantization to get an invariant representation.
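A minimal sketch of the vector-quantization step, assuming a fixed 8-direction codebook (my own illustrative choice, not the paper's codebook): each hand-velocity sample is replaced by the index of its nearest codeword, so the trajectory becomes a short symbol sequence that tolerates small variations.

```python
# Quantize hand velocities against a small direction codebook.
import numpy as np

angles = np.arange(8) * np.pi / 4
codebook = np.column_stack([np.cos(angles), np.sin(angles)])    # 8 unit directions

def quantize(hand_positions):
    v = np.diff(hand_positions, axis=0)
    v = v / (np.linalg.norm(v, axis=1, keepdims=True) + 1e-9)   # keep direction only
    return np.argmin(np.linalg.norm(v[:, None] - codebook[None], axis=2), axis=1)

traj = np.column_stack([np.linspace(0, 1, 10), np.linspace(0, 1, 10)])  # diagonal stroke
print(quantize(traj))   # mostly codeword 1 (45-degree direction)
```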
38
“Experiment in Imitation Using Perceptuo-Motor Primitives”
Classify movements into primitives (line, arc, circle). Group adjacent, similar primitives.
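One way such a classification could be done (a sketch with my own thresholds and a two-way line vs. arc/circle split; the paper's scheme may differ): fit both a straight line and a circle to the segment and keep whichever has the smaller residual.

```python
# Classify a trajectory segment by comparing least-squares fit residuals.
import numpy as np

def line_residual(pts):
    """RMS distance of the points from their best-fit straight line."""
    c = pts - pts.mean(axis=0)
    _, s, _ = np.linalg.svd(c, full_matrices=False)
    return s[1] / np.sqrt(len(pts))          # energy off the principal axis

def circle_residual(pts):
    """RMS error of an algebraic (Kasa) circle fit."""
    A = np.column_stack([2 * pts, np.ones(len(pts))])
    b = (pts ** 2).sum(axis=1)
    (cx, cy, c0), *_ = np.linalg.lstsq(A, b, rcond=None)
    r = np.sqrt(c0 + cx ** 2 + cy ** 2)
    return np.sqrt(np.mean((np.linalg.norm(pts - [cx, cy], axis=1) - r) ** 2))

def classify(pts):
    return "line" if line_residual(pts) < circle_residual(pts) else "arc/circle"

theta = np.linspace(0, np.pi, 30)
print(classify(np.column_stack([theta, 0.3 * theta])))              # line
print(classify(np.column_stack([np.cos(theta), np.sin(theta)])))    # arc/circle
```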
39
“Experiment in Imitation Using Perceptuo-Motor Primitives”
Determine the primitives' parameters. Project into ego-centric space.
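A minimal sketch of the ego-centric projection, assuming a known, made-up camera-to-robot calibration (the real system would obtain this from calibration): observed end-points are mapped from the camera frame into the robot-centred frame with a fixed rigid transform.

```python
# Map camera-frame end-points into the robot's ego-centric frame.
import numpy as np

yaw = np.deg2rad(30.0)                  # assumed camera-to-robot rotation
R = np.array([[np.cos(yaw), -np.sin(yaw), 0],
              [np.sin(yaw),  np.cos(yaw), 0],
              [0,            0,           1]])
t = np.array([0.5, 0.0, 0.2])           # camera origin expressed in the robot frame

def to_egocentric(points_cam):
    """Map Nx3 camera-frame points into the robot-centred frame."""
    return points_cam @ R.T + t

hand_path_cam = np.array([[0.1, 0.0, 0.3], [0.2, 0.1, 0.3]])
print(to_egocentric(hand_path_cam))
```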
40
“Experiment in Imitation Using Perceptuo-Motor Primitives”
(Weber, Jenkins & Mataric', 2001)
41
PbD for Movement Imitation Case Study: Real-Time Tracker
The goal: mimic movements in real time.
The problem: a large amount of data to process (6 MB/sec), and the need for "continuous success".
The solution: a probabilistic approach that avoids excessive data interactions (Ude et al., 2001).
42
“Real-Time Visual System for Interaction with Humanoid Robot” (Ude, Shibata & Atkeson, 2001)
1. Estimate positions of tracked "blobs" in the image.
2. Compute 3D coordinates of the tracked objects using stereo.
3. Transform into via-points for the robot hand trajectory.
4. Compute motor commands from the desired trajectory.
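For step 2, a minimal sketch of stereo triangulation for a rectified camera pair, using the standard disparity relation Z = f·B/d (focal length and baseline values here are illustrative, not the robot's actual calibration):

```python
# Recover a blob's 3D position from a rectified stereo pair.
import numpy as np

f_px = 600.0        # focal length in pixels (assumed)
B = 0.10            # stereo baseline in meters (assumed)
cx, cy = 320.0, 240.0

def triangulate(u_left, v_left, u_right):
    d = u_left - u_right                 # disparity in pixels
    Z = f_px * B / d
    X = (u_left - cx) * Z / f_px
    Y = (v_left - cy) * Z / f_px
    return np.array([X, Y, Z])

print(triangulate(350.0, 250.0, 330.0))  # blob roughly 3 m in front of the cameras
```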
43
Real-Time Tracker Tracking “Blobs” In a Bayesian Setting
Each pixel is modeled as generated by one of several processes (tracked blobs plus background): $p(I_u \mid k)$ is the probability that the pixel at location $u$ has intensity $I_u$ given process $k$, and $\omega_k$ is the a-priori probability of process $k$.
44
Real-Time Tracking: Minimize the Negative Log-Likelihood
The overall probability of observing image $I$ is $p(I \mid \Theta, \omega) = \prod_u \sum_k \omega_k\, p(I_u \mid k, \Theta)$.
Goal: determine the parameters that are most likely to produce this image (a maximum-likelihood problem). It is computationally easier to minimize the negative log-likelihood $-\log p(I \mid \Theta, \omega) = -\sum_u \log \sum_k \omega_k\, p(I_u \mid k, \Theta)$.
45
Real-Time Tracking: Minimize the Negative Log-Likelihood
Find the minimum (using Lagrange multipliers) and get $P_{u,l} = \dfrac{\omega_l\, p(I_u \mid l, \Theta)}{\sum_k \omega_k\, p(I_u \mid k, \Theta)}$, the probability that pixel $u$ stems from process $l$.
46
Real-Time Tracking: Find the Probability Parameters
The above equations are solved iteratively with the Expectation-Maximization (EM) algorithm, here phrased as minimizing the negative log-likelihood.
Expectation stage: compute $P_{u,l}$ using the current estimates of $\Theta$ and $\omega$.
Minimization stage: compute new $\Theta$ and $\omega$ assuming the $P_{u,l}$ are constant.
From probabilities of pixels belonging to a certain process (e.g., the human hand) …
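Before moving on to object locations, here is a minimal sketch of this EM loop for a one-dimensional Gaussian mixture over pixel intensities (a deliberate simplification: the real tracker models colour and spatial extent per blob, and the data here are simulated):

```python
# EM for a 1-D Gaussian mixture: E-step computes P[u, l], the
# M/minimization step re-estimates means, variances and weights.
import numpy as np

rng = np.random.default_rng(0)
I = np.concatenate([rng.normal(50, 5, 500), rng.normal(200, 10, 200)])  # background + hand pixels

mu, var, w = np.array([40.0, 150.0]), np.array([100.0, 100.0]), np.array([0.5, 0.5])

for _ in range(30):
    # E-step: P[u, l] proportional to w_l * N(I_u | mu_l, var_l)
    lik = np.exp(-(I[:, None] - mu) ** 2 / (2 * var)) / np.sqrt(2 * np.pi * var)
    P = w * lik
    P /= P.sum(axis=1, keepdims=True)
    # M-step: update parameters with P held fixed
    Nk = P.sum(axis=0)
    mu = (P * I[:, None]).sum(axis=0) / Nk
    var = (P * (I[:, None] - mu) ** 2).sum(axis=0) / Nk
    w = Nk / len(I)

print(mu, w)   # roughly (50, 200) and (0.71, 0.29)
```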
47
“Real-Time Visual System for Interaction with Humanoid Robot”
… to object locations
48
Real-Time Tracking General Stages
1. Estimate positions of tracked "blobs" in the image.
2. Compute 3D coordinates of the tracked objects using stereo.
3. Transform into via-points for the robot hand trajectory.
4. Compute motor commands from the desired trajectory.
49
Real-Time Tracking Estimate Trajectories with B-splines
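A minimal sketch of this step, assuming simulated tracked positions and SciPy's parametric spline routines (the noise level and smoothing factor are illustrative choices): fit a smoothing B-spline through the tracked 3D hand positions and resample it as via-points for the robot hand.

```python
# Fit a smoothing B-spline through tracked 3-D positions and resample it.
import numpy as np
from scipy.interpolate import splev, splprep

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 40)
hand = np.stack([0.3 * t, 0.2 * np.sin(2 * np.pi * t), 0.1 * t ** 2])  # noisy tracked path
hand += 0.005 * rng.normal(size=hand.shape)

tck, u = splprep(hand, s=0.002)              # cubic smoothing B-spline
via_points = np.stack(splev(np.linspace(0, 1, 10), tck), axis=1)
print(via_points.shape)                      # (10, 3) smoothed via-points
```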
50
Real-Time Tracking - Results
Robot Compliance Movie
51
References
M. Asada, T. Nakamura, and K. Hosoda, "Vision-Based Robot Learning for Behavior Acquisition", Proc. IEEE International Conference on Intelligent Robots and Systems (IROS '95), Workshop on Vision for Robots, 1995.
C. Bregler, L. Loeb, E. Chuang, and H. Deshpande, "Turning to the Masters: Motion Capturing Cartoons", ACM Transactions on Graphics, 21(3), July 2002.
A. F. Bobick, "Movement, Activity and Action: The Role of Knowledge in the Perception of Motion", Philosophical Transactions of the Royal Society of London, Series B, 352(1358), Aug.
C. Breazeal and B. Scassellati, "Challenges in Building Robots That Imitate People", in "Imitation in Animals and Artifacts", K. Dautenhahn and C. Nehaniv (eds.), MIT Press, 2002.
V. Gallese, L. Fadiga, L. Fogassi, and G. Rizzolatti, "Action Recognition in the Premotor Cortex", Brain, 119(2), April 1996.
Y. Kuniyoshi, M. Inaba, and H. Inoue, "Learning by Watching: Extracting Reusable Task Knowledge from Visual Observation of Human Performance", IEEE Transactions on Robotics and Automation, 10(6), December 1994.
M. J. Mataric', "Sensory-Motor Primitives as a Basis for Learning by Imitation: Linking Perception to Action and Biology to Robotics", in "Imitation in Animals and Artifacts", K. Dautenhahn and C. Nehaniv (eds.), MIT Press, 2002.
52
References
G. Rizzolatti, L. Fadiga, L. Fogassi, and V. Gallese, "From Mirror Neurons to Imitation: Facts and Speculations", in A. N. Meltzoff and W. Prinz (eds.), "The Imitative Mind: Development, Evolution, and Brain Bases", New York: Cambridge University Press, 2002.
S. Schaal, A. Ijspeert, and A. Billard, "Computational Approaches to Motor Learning by Imitation", Philosophical Transactions of the Royal Society of London, Series B, 358(1431), March 2003.
S. Schaal, "Movement Planning and Imitation by Shaping Nonlinear Attractors", Proceedings of the 12th Yale Workshop on Adaptive and Learning Systems, 2003.
B. Scassellati and C. Breazeal, "Robots That Imitate Humans", Trends in Cognitive Sciences, 6(11), November 2002.
A. Ude, T. Shibata, and C. G. Atkeson, "Real-Time Visual System for Interaction with a Humanoid Robot", Robotics and Autonomous Systems, 37, 2001.
S. Weber, O. C. Jenkins, and M. J. Mataric', "Imitation Using Perceptual and Motor Primitives", International Conference on Autonomous Agents, Barcelona, Spain, June 2000.
L. Zelnik-Manor and M. Irani, "Event-Based Analysis of Video", IEEE Conference on Computer Vision and Pattern Recognition (CVPR '01), December 2001.
53
Perception? Action? “The great end of life is not knowledge but action.” (Thomas H. Huxley)