Slide 1/33: Robot Learning From Human Demonstration
MARS PI Meeting, 9/2003
Maja J Matarić
Chad Jenkins, Marcelo Kallmann, Evan Drumwright, Nathan Miller, and Chi-Wei Chu
University of Southern California, Interaction Lab / Robotics Research Lab
Center for Robotics and Embedded Systems (CRES)
http://robotics.usc.edu/~agents/Mars2020/mars2020.html
Slide 2/33: Motivation & Approach
Goals:
– Natural human-robot interaction in various domains
– Automated robot programming & learning by imitation
General approach:
– Use an intrinsic behavior repertoire to facilitate control, human-robot interaction, and learning
– Use a human interactive training method (past work)
– Use human data-driven programming & training methods
Slide 3/33: Recent Progress
Getting more & better training data:
– a light-weight, low-cost motion-capture mechanism
Real-world validation of the method:
– application of the method to Robonaut data
Application of the method I:
– synthesis of novel humanoid motion from automatically derived movement primitives
Application of the method II:
– movement classification, prediction, and imitation
The next big problem:
– humanoid motion planning around (dynamic) obstacles & validation on Robonaut
Slide 4/33: IMU Motion Capture Suit (Getting more & better training data)
Goal: Develop a low-cost motion-capture device capable of logging high-DOF human motion in an unstructured environment.
Solution: Use filtered Inertial Measurement Units (IMUs) for 3-DOF tracking of each joint. Each sensor costs ~$300.00 to build, yielding a suit cost of ~$4200.00 to track 14 DOF.
Advantages:
1) Motion tracking is not coupled to off-person emitters/detectors, so the suit can be used outdoors, anywhere
2) Sensors are small and networked, allowing various configurations
3) High bandwidth allows real-time interaction with visualization and simulation tools
Slide 5/33: IMU Motion Capture Suit (Getting more & better training data)
[Diagram: sensor network overlaid on a human body model, showing sensor locations, the onboard computer & battery, and the wireless connection to the host.]
Slide 6/33: REV 1 Suit Details (Getting more & better training data)
Specifications:
– Atmel 8-bit microcontroller w/ 10-bit ADC, 8 MHz
– (3) 300 deg/sec gyroscopes
– (3) 2-g accelerometers
– ~$200.00/sensor
– 2 DOF filtered
Next revision:
– (3) Honeywell magnetometers
– 12-bit ADC, 16 MHz CPU
– ~$260.00/sensor
– full 3 DOF filtered
Sensor size: ~1.5"
Filter by Eric Bachmann @ MOVES Institute, Naval Postgraduate School
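The deck credits Bachmann's filter but does not include an implementation. As a loose illustration of the kind of gyro/accelerometer fusion such a suit performs per joint axis, here is a minimal complementary-filter sketch; all names and constants are assumptions, and this is not the Bachmann filter itself:

```python
import math

def complementary_filter(theta, gyro_rate, accel_x, accel_z, dt, alpha=0.98):
    """Fuse one gyro axis with a gravity-referenced accelerometer angle.

    theta      -- previous angle estimate (rad)
    gyro_rate  -- angular rate from the gyroscope (rad/s)
    accel_x/z  -- accelerometer components in the joint's plane (g)
    dt         -- sample period (s)
    alpha      -- blend factor: higher values trust the gyro short-term
    """
    # Integrate the gyro: smooth short-term estimate, but it drifts.
    theta_gyro = theta + gyro_rate * dt
    # Tilt from gravity: noisy, but drift-free long-term reference.
    theta_accel = math.atan2(accel_x, accel_z)
    # Blend: gyro dominates at high frequency, accel corrects drift.
    return alpha * theta_gyro + (1.0 - alpha) * theta_accel
```

A real suit would filter all three axes (quaternion-based in Bachmann's work) and, per the next-revision specs above, add magnetometers for heading.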
Slide 7/33: Automatically Deriving Behaviors (Recap of the method)
Input: kinematic motion, i.e., a time series of joint angles
Motion segmentation:
– partition the input motion into conceptually indivisible motion segments
Grouping of behavior exemplars:
– spatio-temporal Isomap dimension reduction and clustering (illustrated below)
Generalizing behaviors into forward models:
– interpolation of a dense sampling for each behavior
Meta-level exemplar grouping:
– additional embedding iterations for higher-level behaviors
O. C. Jenkins, M. J Matarić, "Automated Derivation of Behavior Vocabularies for Autonomous Humanoid Motion", Autonomous Agents and Multiagent Systems, Melbourne, Australia, July 14-16, 2003.
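The authors' spatio-temporal Isomap is not reproduced here, but standard Isomap (available in scikit-learn) conveys the embedding idea; ST-Isomap additionally shrinks distances between temporal neighbors and between corresponding frames of similar segments before the same geodesic embedding. A minimal sketch on placeholder joint-angle data:

```python
import numpy as np
from sklearn.manifold import Isomap

# Hypothetical input: T frames of D joint angles, standing in for mocap data.
T, D = 2000, 14
motion = np.random.rand(T, D)  # placeholder for real joint-angle time series

# Standard Isomap embedding of the frames into 3 dimensions.
embedding = Isomap(n_neighbors=10, n_components=3).fit_transform(motion)
print(embedding.shape)  # (2000, 3)
```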
Slide 8/33: Applying the Method to Robonaut (Validation of the method on Robonaut)
Work with Alan Peters (more tomorrow):
– 80-D data from tactile and force sensors
– 5 tele-op grasps of a horizontal wrench: 460 frames each, 2300 total
– Applied sequentially continuous ST-Isomap
[Figures: PCA embedding (not informative); ST-Isomap embedding; ST-Isomap distance matrix]
Slide 9/33: Uncovering Structure in the Data (Validation of the method on Robonaut)
Mapping a new grasp motion onto the derived embedding retains its structure.
Useful for monitoring performance, data analysis, generating controllers, etc.
[Figure: ST-Isomap embedding]
Slide 10/33: Using Derived Behaviors
We now have a method for deriving vocabularies of behaviors from kinematic time series of human motion:
– each primitive is a nonparametric, exemplar-based motion model
– each primitive can be eagerly evaluated to encode nonlinear dynamics in joint-angle space
We can use the derived behaviors for motion synthesis, prediction, and classification. Our recent work applied them toward:
– indexing individual primitives to provide state prediction
– providing desireds (target poses) for control
– matching against observed motion for classification and learning
Slide 11/33: Forward Model Motion Synthesis (Use of the method: generating movement)
– The controller has a set of primitive behaviors
– An arbitrator decides which primitive to activate (e.g., based on transition probabilities)
– The active primitive incrementally updates the robot's current pose (i.e., sets the desireds)
– The controller can generate motion indefinitely; a sketch of this loop follows below
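A minimal sketch of that loop under assumed interfaces; the primitive callables, transition matrix, and fixed horizon are illustrative, not taken from the paper:

```python
import numpy as np

def synthesize(primitives, transitions, pose, n_segments, horizon=30, seed=0):
    """Arbitrated forward-model synthesis (illustrative sketch).

    primitives  -- list of callables mapping a pose to the next pose
    transitions -- row-stochastic matrix of primitive-to-primitive probabilities
    pose        -- initial joint-angle vector
    """
    rng = np.random.default_rng(seed)
    active = 0
    trajectory = [pose]
    for _ in range(n_segments):
        # The active primitive incrementally updates the pose (the desireds).
        for _ in range(horizon):
            pose = primitives[active](pose)
            trajectory.append(pose)
        # Arbitrator: choose the next primitive by transition probability.
        active = rng.choice(len(primitives), p=transitions[active])
    return np.asarray(trajectory)
```

The fixed horizon is a stand-in; an actual arbitrator could instead transition when the active primitive's motion completes.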
Slide 12/33: Representation of the Behaviors (Use of the method: generating movement)
– Behavior primitives are manifold-like flow fields in joint-angle space; the temporal ordering of the exemplars creates the flow-field gradient
– This representation is a forward model, allowing motion to be indexed, predicted, and synthesized dynamically, in real time
– The model can generalize the exemplars to create novel motion (one possible realization is sketched below)
[Figure: blue = exemplar trajectories; black to red = interpolated motion creating the temporal gradient flow field; right panel = 3 main PCs of a primitive flow field in joint-angle space]
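One plausible realization of such an eagerly evaluated flow field is a dense set of (pose, successor) samples queried by nearest neighbor. The following sketch is an assumption for illustration, not the paper's implementation; an instance of this class could serve as one of the primitive callables in the synthesis loop above:

```python
import numpy as np
from scipy.spatial import cKDTree

class FlowFieldPrimitive:
    """Nearest-neighbor flow field over densely interpolated exemplar poses."""

    def __init__(self, poses, next_poses):
        # poses: (N, D) sampled joint-angle configurations
        # next_poses: (N, D) temporal successor of each sample
        self.tree = cKDTree(poses)
        self.flow = next_poses - poses

    def __call__(self, pose):
        # Index the field at the current pose and follow the local gradient.
        _, i = self.tree.query(pose)
        return pose + self.flow[i]
```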
Slide 13/33: Example: 1-Primitive-Based Synthesis (Use of the method: generating movement)
Three motions generated from the same primitive behavior, using different starting poses, showing flow and variation.
[Figures: PCA view of the primitive flow field in joint-angle space; resulting kinematic motion]
Slide 14/33: Example: 2-Primitive-Based Synthesis (Use of the method: generating movement)
Arm waving: motion generated by combining two primitives (wave-in and wave-out) with a high-level arbitrator that sequences their activation.
Slide 15/33: Examples: 78-Behavior-Based Synthesis (Use of the method: generating movement)
[Videos: single-activity reaching (no root info); multi-activity, take 1: cabbage patch → twist; multi-activity, take 2: cabbage patch → twist]
Slide 16/33: Synthesis from Isolated Activities (Use of the method: generating movement)
[Videos: cabbage patch (20000 frames); jab punching (5000 frames); jab punching (view 2); combined punching (3400 frames)]
Slide 17/33: Behavior Classification & Imitation (Use of the method: classifying movement)
Goal: use the primitive behaviors to recognize, classify, predict, and imitate/reconstruct observed movement.
Compare observed motion with predictions from the behavior primitives (see the sketch after this list):
– use Euclidean distance between end-effector positions as the metric
– use a Bayesian classifier
Reconstruct/imitate the observed movement:
– concatenate the best-matching trajectories from the classified primitives
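A minimal sketch of the end-effector matching step, assuming each primitive can predict an end-effector trajectory over the observation window; the interfaces are illustrative:

```python
import numpy as np

def classify_segment(observed_ee, predictions):
    """Pick the primitive whose predicted end-effector path is nearest.

    observed_ee -- (T, 3) observed end-effector positions
    predictions -- dict: primitive name -> (T, 3) predicted positions
    """
    def cost(predicted):
        # Mean Euclidean distance between corresponding positions.
        return np.linalg.norm(observed_ee - predicted, axis=1).mean()
    return min(predictions, key=lambda name: cost(predictions[name]))
```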
Slide 18/33: Classification & Imitation Schematic (Use of the method: classifying movement)
[Figure: schematic of the classification and imitation pipeline]
Slide 19/33: Example: "Yo-yo" Imitation (Use of the method: imitating movement)
[Videos: observed "yo-yo" motion (from MegaMocap V2); "yo-yo" reconstruction from waving; from punching; from the twist; from cabbage patch]
Slide 20/33: Details of "Yo-yo" Reconstruction (Use of the method: classifying movement)
– The waving vocabulary contains 2 primitives: dark red (wave down) and light red (wave up)
– The predicted end-effector location is matched against the observed end-effector location; green marks the current trajectory horizon
Slide 21/33: Probabilistic Behavior Classification (Use of the method: classifying movement)
– Behavior primitives are models: the flow-field representation can be used or, as in this case, radial basis functions
– We can apply a Bayesian classifier: P(C|X) ∝ P(X|C)·P(C), where C is a class (behavior) and X is an observation (joint angles); P(X|C) is determined from the primitives
– The classifier operates in real time on joint-angle data
– Applications: human avoidance, interactive tasks with human operators, collaborators, and/or other robots
E. Drumwright, M. J Matarić, "Generating and Recognizing Free-Space Movements in Humanoid Robots", IEEE/RSJ Int. Conf. on Intelligent Robots and Systems (IROS-2003), Las Vegas, Nevada, Oct 25-30, 2003.
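A toy version of this classifier, substituting a single Gaussian per behavior for the RBF models actually used; purely illustrative:

```python
import numpy as np
from scipy.stats import multivariate_normal

def fit_behavior_models(exemplars):
    """Fit one Gaussian over joint angles per behavior (toy stand-in for RBFs).

    exemplars -- dict: behavior name -> (N, D) array of joint-angle samples
    """
    return {
        name: multivariate_normal(
            mean=x.mean(axis=0),
            # Small ridge keeps the covariance well-conditioned.
            cov=np.cov(x, rowvar=False) + 1e-6 * np.eye(x.shape[1]))
        for name, x in exemplars.items()
    }

def classify(x, models, priors):
    """Return argmax over C of P(X|C) * P(C) for observation x."""
    return max(models, key=lambda c: models[c].pdf(x) * priors[c])
```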
Slide 22/33: Bayesian Behavior Classification (Use of the method: classifying movement)
– The model is a distribution of joint angles over time
– The actual distribution is multivariate (one variable per DOF used by the primitive behaviors)
[Figure: mixture spaces between 2 exemplars of the jab primitive, for (left) one shoulder DOF and (right) a 2nd shoulder DOF]
Slide 23/33: Bayesian Classification Results (Use of the method: classifying movement)
Classification of novel movement is highly accurate:
– Primitive movements: 50 non-exemplar instances of primitives executed on a physically simulated humanoid; 3.39% error
– Motion capture and animation data: 550 movements from animation and motion capture; 0.03% error
Slide 24/33: Humanoid Motion Planning (Next problem: humanoid motion planning)
Goal:
– synthesize real-time, collision-free humanoid motion in dynamic environments
Approach:
– use demonstrated motion data to compute a meaningful representation of valid motions
– this enables fast determination of collision-free paths and adaptation to new obstacles
Slide 25/33: Humanoid Motion Planning (Next problem: humanoid motion planning)
Approach:
– use pre-computed probabilistic roadmaps to represent the valid motion space of the humanoid
– temporarily disable the parts of the roadmap that become invalid when obstacles are perceived; if the remaining part is not enough, perform on-line planning
Contribution:
– introduction of dynamic roadmaps for motion planning, joining the advantages of multi-query methods (PRMs, PRTs, VGs) and single-query methods (RRTs, Expansive Spaces, SBLs)
– solutions for the humanoid case, e.g., the use of demonstrated motions to construct suitable roadmaps
Slide 26/33: Details of the Approach (1/3) (Next problem: humanoid motion planning)
Roadmap computation:
– in a high-dimensional configuration space comprising both arms and the torso (figures show 17-DOF and 22-DOF roadmaps)
– pre-computed using PRM sampling
– density limits are used to achieve uniform sampling of end-effector positions in the reachable workspace
– postures are sampled in the subspace covered by the demonstrated data (current work)
– even without considering obstacles, the roadmap is useful for deriving motions without self-collisions
Slide 27/33: Details of the Approach (2/3) (Next problem: humanoid motion planning)
On-line roadmap maintenance (sketched below):
– when obstacles are detected, invalid edges and nodes are disabled
– a workspace cell decomposition is used for fast localization of invalid nodes and edges
– the time required to update the roadmap depends on the complexity of the environment and of the robot (collision detection)
– trade-off with on-line planning: roadmap updating is not suitable for highly dynamic environments, but is fine for pick-and-place applications
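A hypothetical sketch of the cell-based bookkeeping; the data structures are assumptions, as the deck does not show the actual implementation:

```python
from collections import defaultdict

class DynamicRoadmap:
    """Sketch of a roadmap whose edges are disabled per workspace cell."""

    def __init__(self):
        self.cell_to_edges = defaultdict(set)  # cell -> edges sweeping it
        self.edge_to_cells = {}                # edge -> cells it sweeps
        self.disabled = set()

    def register_edge(self, edge, swept_cells):
        # Pre-computation: record which workspace cells each edge sweeps,
        # so invalidation never touches geometry at query time.
        self.edge_to_cells[edge] = set(swept_cells)
        for cell in swept_cells:
            self.cell_to_edges[cell].add(edge)

    def update(self, occupied_cells):
        # On-line maintenance: an edge is usable iff none of its swept
        # cells is occupied by a perceived obstacle.
        occupied = set(occupied_cells)
        self.disabled = {e for e, cells in self.edge_to_cells.items()
                         if cells & occupied}
```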
Slide 28/33: Details of the Approach (3/3) (Next problem: humanoid motion planning)
On-line query:
– an A*-like graph search quickly finds a path to the roadmap node nearest the goal posture
– if that node cannot be directly connected to the goal posture, on-line single-query planning is used (currently bi-directional RRTs)
– results are better when few portions of the roadmap are invalidated
– in the worst case, results are similar to using single-query planning alone
Slide 29/33: Validation: Results With Robonaut (Next problem: humanoid motion planning)
Example motions:
– visualization geometry: 23930 triangles
– collision geometry: 1016 triangles
– optimization (smoothing) takes about 0.3 s (Pentium III 2.8 GHz)
Slide 30/33: Path Optimization (1/2) (Next problem: humanoid motion planning)
Incremental path linearization:
– simple and efficient in most cases
– may be time-consuming, as collision detection must be invoked before each local linearization
Slide 31/33: Path Optimization (2/2) (Next problem: humanoid motion planning)
Incremental path linearization (continued from the previous slide):
– sub-configuration linearization may be required, e.g., to decouple arm motions
A sketch of the linearization step follows below.
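Incremental path linearization is a shortcut-style smoothing: repeatedly replace sub-paths with straight segments whenever the collision checker allows it. A runnable toy sketch in 2D with a stand-in collision checker; the circular obstacle is illustrative:

```python
import numpy as np

def collision_free(a, b, n_checks=20):
    """Stand-in validity test: sample the straight segment a-b and reject
    points inside a toy circular obstacle of radius 1 at the origin."""
    for t in np.linspace(0.0, 1.0, n_checks):
        p = (1 - t) * a + t * b
        if np.linalg.norm(p) < 1.0:
            return False
    return True

def linearize(path):
    """Greedy incremental linearization of a waypoint path."""
    path = [np.asarray(p, dtype=float) for p in path]
    out = [path[0]]
    i = 0
    while i < len(path) - 1:
        # Find the farthest waypoint reachable in a straight line; each
        # candidate shortcut requires a collision-detection call.
        j = len(path) - 1
        while j > i + 1 and not collision_free(path[i], path[j]):
            j -= 1
        out.append(path[j])
        i = j
    return out
```

The same idea applies in joint space; the slide's point about decoupling arm motions corresponds to running this per sub-configuration (e.g., one arm at a time) when whole-body shortcuts fail.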
Slide 32/33: Summary
Getting more & better data:
– a light-weight, low-cost motion-capture mechanism
Real-world validation of the method:
– successful application of the method to Robonaut data
Application of the method I:
– successful synthesis of novel humanoid motion from automatically derived movement primitives
Application of the method II:
– efficient movement classification, prediction, and imitation
The next big problem:
– humanoid motion planning around obstacles, validated on Robonaut; dynamic obstacles to be addressed next
Slide 33/33: Contributors and More Info
Additional info, papers, videos: http://robotics.usc.edu/~agents/Mars2020/mars2020.html
Contributors: Chad Jenkins & Nathan Miller, Evan Drumwright, Marcelo Kallmann, Chi-Wei Chu