Proprioceptive Perception for Object Weight Classification
Jivko Sinapov, Kaijen Hsiao and Radu Bogdan Rusu
What is Proprioception? “It is the sense that indicates whether the body is moving with required effort, as well as where the various parts of the body are located in relation to each other.” - Wikipedia
Why Proprioception?
Full vs. Empty; Hard vs. Soft
Lifting: gravity, effort, etc.
Pushing: friction, mass, etc.
Squeezing: compliance, flexibility
Power, “Play and Exploration in Children and Animals”, 2000
Related Work (Proprioception): "Learning Haptic Representations of Objects" [Natale et al. (2004)]
Related Work (Proprioception): Proprioceptive Object Recognition [Bergquist et al. (2009)]
Perception Problem for PR2: Is the bottle full or empty?
General Approach:
- Let the robot experience what full and empty bottles "feel" like
- Use prior experience to classify new bottles as either full or empty
Behavior: Power, “Play and Exploration in Children and Animals”, 2000
Behaviors: 1) Unsupported Holding, 2) Lifting
Data Representation
Behavior Execution: [J_i, E_i, C_i]
Recorded Data: Joint Positions J_i, Efforts E_i, Class Label C_i ∈ {full, empty}
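A minimal sketch of one recorded behavior execution [J_i, E_i, C_i]. The array shapes and field names are illustrative assumptions, not the actual PR2/ROS message layout used in the work.

```python
from dataclasses import dataclass
import numpy as np

@dataclass
class BehaviorExecution:
    joints: np.ndarray   # J_i: joint positions over the behavior, shape (T, num_joints)
    efforts: np.ndarray  # E_i: measured joint efforts, shape (T, num_joints)
    label: str           # C_i: class label, "full" or "empty"

# e.g., 200 timesteps recorded from the PR2's 7-DOF left arm
sample = BehaviorExecution(joints=np.zeros((200, 7)),
                           efforts=np.zeros((200, 7)),
                           label="full")
```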
Example Recorded Joint Efforts of Left Arm:
Classification Procedure: [J_i, E_i, ?] → Feature Extraction → Recognition Model → Pr('full'), Pr('empty')
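The slides do not spell out the feature extraction step; the sketch below assumes one simple option: average each joint-effort trace into a fixed number of temporal bins and concatenate the bin means into a single feature vector. The same binning applied to the joint positions would yield the joint-feature space used by the recognition model.

```python
import numpy as np

def effort_features(efforts: np.ndarray, num_bins: int = 10) -> np.ndarray:
    """Reduce a (T, num_joints) effort recording to a fixed-length feature vector."""
    bins = np.array_split(efforts, num_bins, axis=0)    # split the time axis into bins
    binned = np.stack([b.mean(axis=0) for b in bins])   # (num_bins, num_joints) bin means
    return binned.flatten()                             # (num_bins * num_joints,)
```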
Recognition Model: given a query X = [J_i, E_i, ?]
1) Find the N closest neighbors to X in joint-feature space
2) Train a classifier C on those N neighbors that maps effort features to class labels
3) Use the trained classifier C to label X
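A minimal sketch of the recognition model above: a lazy, local classifier that finds the training executions closest to the query in joint-feature space, fits a classifier on their effort features, and labels the query with it. Function names and the choice of plain k-NN for the local classifier are assumptions for illustration.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors, KNeighborsClassifier

def classify(query_joint_feats, query_effort_feats,
             train_joint_feats, train_effort_feats, train_labels,
             n_neighbors=20):
    # 1) Find the N closest neighbors to the query in joint-feature space.
    nn = NearestNeighbors(n_neighbors=n_neighbors).fit(train_joint_feats)
    _, idx = nn.kneighbors(np.asarray(query_joint_feats).reshape(1, -1))
    idx = idx[0]

    # 2) Train a classifier C on those neighbors, mapping effort features to labels.
    #    (The work compares k-NN, an SVM, and a C4.5 tree; plain k-NN is used here.)
    clf = KNeighborsClassifier(n_neighbors=3)
    clf.fit(np.asarray(train_effort_feats)[idx], np.asarray(train_labels)[idx])

    # 3) Use the trained classifier C to label the query.
    return clf.predict(np.asarray(query_effort_feats).reshape(1, -1))[0]
```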
Training Procedure
Objects: five bottles, each tested full and empty
Procedure:
- Place object on the table
- Robot grasps it and performs the current behavior (either hold or lift) at a random position in space
- Robot puts the object back down on the table at a random position; repeat
Each behavior was performed 100 times on each bottle in both full and empty states: a total of 2 x 5 x 100 x 2 = 2000 behavior executions
Evaluation
- 5-fold cross-validation: in each iteration, data from 4 of the five bottles is used for training, and the remaining bottle is used for testing
- Three classification algorithms evaluated: k-Nearest Neighbors, Support Vector Machine (quadratic kernel), C4.5 Decision Tree
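A sketch of this evaluation scheme: each fold holds out every execution of one bottle, so the model is always tested on a bottle it has never felt. The `executions`, `bottle_ids`, and `classify_fn` names (and the `label` field from the earlier data-representation sketch) are assumptions.

```python
import numpy as np

def cross_validate_by_bottle(executions, bottle_ids, classify_fn):
    """5-fold CV where each fold holds out all executions of one bottle."""
    accuracies = []
    for held_out in np.unique(bottle_ids):
        train = [e for e, b in zip(executions, bottle_ids) if b != held_out]
        test  = [e for e, b in zip(executions, bottle_ids) if b == held_out]
        correct = sum(classify_fn(e, train) == e.label for e in test)
        accuracies.append(correct / len(test))
    return float(np.mean(accuracies))
```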
Chance Accuracy: 50%
Can the robot boost recognition rate by applying a behavior multiple times?
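One common way to combine the outputs of repeated behavior executions, shown below as an assumption rather than the rule necessarily used in the work, is to average the per-trial class probabilities and pick the class with the highest average.

```python
import numpy as np

def combine_trials(per_trial_probs):
    """per_trial_probs: one dict like {'full': 0.7, 'empty': 0.3} per behavior execution."""
    avg = {c: np.mean([p[c] for p in per_trial_probs]) for c in ('full', 'empty')}
    return max(avg, key=avg.get)

# e.g., three lifts of the same bottle
print(combine_trials([{'full': 0.6, 'empty': 0.4},
                      {'full': 0.4, 'empty': 0.6},
                      {'full': 0.8, 'empty': 0.2}]))   # -> full
```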
How much training data is necessary?
(lift behavior)
Application to Regression: given a query X = [J_i, E_i, ?]
1) Find the N closest neighbors to X in joint-feature space
2) Train a regression model C on those N neighbors that maps effort features to a continuous weight estimate
3) Use the trained regression model C to predict the weight of X
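The same local scheme with the classifier swapped for a regressor. A k-NN regressor is used here as a stand-in; the actual regression model and feature names are assumptions.

```python
import numpy as np
from sklearn.neighbors import NearestNeighbors, KNeighborsRegressor

def estimate_weight(query_joint_feats, query_effort_feats,
                    train_joint_feats, train_effort_feats, train_weights,
                    n_neighbors=20):
    # Find the N closest neighbors in joint-feature space, fit a local regressor
    # from effort features to weight, and evaluate it on the query.
    nn = NearestNeighbors(n_neighbors=n_neighbors).fit(train_joint_feats)
    _, idx = nn.kneighbors(np.asarray(query_joint_feats).reshape(1, -1))
    reg = KNeighborsRegressor(n_neighbors=3)
    reg.fit(np.asarray(train_effort_feats)[idx[0]], np.asarray(train_weights)[idx[0]])
    return float(reg.predict(np.asarray(query_effort_feats).reshape(1, -1))[0])
```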
Regression Results: Mean Abs. Error = lbs; Chance error = lbs
Application to Sorting Task
- Place empty bottles in the trash
- Move full bottles to the other side of the table
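A minimal sketch of how the classifier's output drives the sorting decision. The placement targets are placeholder strings; the real system would send the corresponding arm goals through ROS.

```python
def target_for(predicted_label: str) -> str:
    # Route the bottle based on the classifier's output.
    return "trash" if predicted_label == "empty" else "other_side_of_table"

print(target_for("empty"))  # -> trash
```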
Sorting Task: video
Application to a New Recognition Task: Full or empty?
Behavior: slide object across the table
40 trials with a full box and 40 trials with an empty box
Recognition Accuracy: % (all three algorithms)
Sliding task: video
Conclusion
- Behavior-grounded approach to proprioceptive perception
- Implemented as a ROS package
- This work has been submitted to ICRA 2011
Future Work
- More advanced proprioceptive feature extraction
- Multi-modal object perception: auditory, 3D, tactile