Manipulation in Human Environments


Manipulation in Human Environments
Aaron Edsinger & Charlie Kemp
Humanoid Robotics Group, MIT CSAIL

Domo
- 29 DOF total
- 6 DOF Series Elastic Actuator (SEA) arms
- 4 DOF SEA hands
- 2 DOF SEA neck
- Active vision head with stereo cameras and a gyroscope
- Senses joint angle and torque
- 15-node Linux cluster
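Domo's Series Elastic Actuators put a spring between the motor and the joint, so joint torque can be read from the spring's deflection. The following is a minimal illustrative sketch, not Domo's actual code; the spring constant and controller gain are hypothetical values.

```python
def sea_torque(motor_angle, joint_angle, spring_k):
    """Estimate joint torque (N*m) from spring deflection (rad):
    the spring between motor and joint deflects under load."""
    deflection = motor_angle - joint_angle
    return spring_k * deflection

def sea_force_control(desired_torque, motor_angle, joint_angle,
                      spring_k, gain=0.5):
    """Proportional force control: command a motor-angle correction
    that drives the measured torque toward the desired torque."""
    torque_error = desired_torque - sea_torque(motor_angle, joint_angle, spring_k)
    return motor_angle + gain * torque_error / spring_k
```

Because torque is measured rather than inferred from motor current, this kind of actuator supports the passive compliance and force control themes discussed below.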

Manipulation in Human Environments
Human environments are designed to match our cognitive and physical abilities.
- Work with everyday objects
- Collaborate with people
- Perform useful tasks

Applications
- Aging in place
- Cooperative manufacturing
- Household chores

Three Themes
- Let the body do the thinking
- Collaborative manipulation
- Task-relevant features

Let the Body Do the Thinking: Design
- Passive compliance
- Force control
- Human morphology

Let the Body Do the Thinking: Compensatory Behaviors
- Reduce uncertainty
- Modulate arm stiffness
- Aid perception (motion, visibility)
- Test assumptions (explore)
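Arm stiffness modulation can be illustrated with a simple impedance-style control law; this is a generic sketch, not Domo's controller, and the gains are made up. The joint behaves like a virtual spring-damper, and lowering the stiffness gain makes the arm compliant when contact is uncertain.

```python
def joint_torque(q, q_dot, q_des, stiffness, damping):
    """Impedance-style control law: a virtual spring-damper
    pulls the joint angle q toward q_des."""
    return stiffness * (q_des - q) - damping * q_dot

# Modulating stiffness: stiff for accurate free-space motion,
# soft near expected contact so errors produce small forces.
STIFF_FREE_SPACE = 10.0   # hypothetical gain
SOFT_NEAR_CONTACT = 1.0   # hypothetical gain
```

With the soft gain, the same position error yields one tenth the torque, so unexpected contact is absorbed rather than fought.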

Let the Body Do the Thinking

Collaborative Manipulation
- Complementary actions
- A person can simplify perception and action for the robot
- The robot can provide intuitive cues for the person
- Requires matching our social interface

Collaborative Manipulation
- Social amplification

Collaborative Manipulation
- A third arm: hold a flashlight, fixture a part
- Extend our physical abilities: carry groceries, open a jar
- Expand our workspace: place dishes in a cabinet, hand a tool, reach a shelf

Task-Relevant Features
- What is important? What is irrelevant?
- Distinct from object detection/recognition

Structure in Human Environments
Donald Norman, The Design of Everyday Things

Structure in Human Environments
Human environments are constrained to match our cognitive and physical abilities.
- Sensing from above
- Flat surfaces
- Objects sized for human hands
- Objects designed for use by humans

Why Are Tool Tips Common?
- A single, localized interface to the world
- Physical isolation helps avoid irrelevant contact
- Helps perception
- Helps control

Tool Tip Detection
A combined visual and motor detection method:
- Kinematic estimate
- Visual model
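One plausible way to combine the two cues is to discount each visual tip candidate by its pixel distance from the kinematically predicted tip. This is a hedged sketch of the idea, not the talk's actual weighting scheme; the Gaussian width `sigma` is a hypothetical parameter.

```python
import math

def fuse_tip_detections(kinematic_px, candidates, sigma=25.0):
    """Pick the visual tip candidate that best agrees with the
    kinematic prediction.  candidates: list of ((x, y), score).
    Each visual score is discounted by a Gaussian on the pixel
    distance to the kinematically predicted tip location."""
    def weighted(candidate):
        (x, y), score = candidate
        d2 = (x - kinematic_px[0])**2 + (y - kinematic_px[1])**2
        return score * math.exp(-d2 / (2 * sigma**2))
    return max(candidates, key=weighted)[0]
```

The kinematic estimate alone is biased by calibration error, and the visual detector alone produces spurious peaks; gating one with the other is a standard way to get robustness from two weak cues.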

Mean Pixel Error for Automatic and Hand-Labeled Tip Detection

Mean Pixel Error for the Hand-Labeled, Multi-Scale, and Point Detectors

Model-Free Insertion
- Active tip perception
- Arm stiffness modulation
- Human interaction

Other Examples
- Circular openings
- Handles
- Contact surfaces
- Gravity alignment

Future: Generalize What You've Learned
- Across objects: perceptually map tasks across objects; key features map to key features
- Across manipulators: motor equivalence; manipulator details may be irrelevant

RSS 2006 Workshop: Manipulation for Human Environments
Robotics: Science and Systems, University of Pennsylvania, August 19th, 2006
manipulation.csail.mit.edu/rss06

Summary
- Importance of task-relevant features
- Example: the tool tip, across a large set of hand tools
- Robust detection (visual + motor): kinematic estimate plus visual model

In Progress
Perform a variety of tasks:
- Insertion
- Pouring
- Brushing

Learning from Demonstration

The Detector Responds To
- Fast motion
- Convexity

Video from Eye Camera → Motion-Weighted Edge Map → Multi-Scale Histogram (Medial Axis, Hough Transform for Circles) → Local Maxima
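The Hough-transform-for-circles stage in this pipeline can be illustrated with a simplified fixed-radius voting scheme; this is a sketch of the general technique, not the talk's implementation, which also weights edges by motion and searches over scale.

```python
import math
from collections import Counter

def hough_circle_centers(edge_points, radius, n_angles=72):
    """Simplified fixed-radius Hough transform for circles: each
    edge point votes for every center lying `radius` away from it.
    Centers where many vote-rings intersect accumulate high counts."""
    votes = Counter()
    for (x, y) in edge_points:
        for i in range(n_angles):
            theta = 2 * math.pi * i / n_angles
            cx = round(x - radius * math.cos(theta))
            cy = round(y - radius * math.sin(theta))
            votes[(cx, cy)] += 1
    return votes
```

Local maxima of the returned vote map correspond to likely circle centers, matching the "local maxima" step at the end of the pipeline.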

Defining Characteristics
Geometric:
- Isolated
- Distal
- Localized
- Convex
Cultural/design:
- Far from the natural grasp location
- Long distance relative to hand size

Other Task Relevant Features?

Detecting the Tip

Include Scale and Convexity

A Distinct Perceptual Problem
- Not object recognition
- How should the object be used?
- Distinct methods and features

Use the Hand's Frame
- Combine weak evidence
- The tool is rigidly grasped
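Combining weak evidence in the hand's frame can be sketched as follows; this is an illustrative 2-D version with made-up names, while the real system works in 3-D. Because the tool is rigidly grasped, its tip is a fixed point in the hand frame, so noisy per-frame detections can simply be averaged there.

```python
import math

def to_hand_frame(point_cam, hand_pose_cam):
    """Express a 2-D camera-frame point in the hand's frame.
    hand_pose_cam = (x, y, theta): the hand's pose in the camera frame."""
    hx, hy, theta = hand_pose_cam
    dx, dy = point_cam[0] - hx, point_cam[1] - hy
    c, s = math.cos(-theta), math.sin(-theta)
    return (c * dx - s * dy, s * dx + c * dy)

def accumulate_tip(detections):
    """Average many weak per-frame tip detections in the hand frame.
    detections: list of (point_cam, hand_pose_cam) pairs.  The rigid
    grasp makes the tip a fixed hand-frame point, so noise averages out."""
    pts = [to_hand_frame(p, pose) for p, pose in detections]
    n = len(pts)
    return (sum(x for x, _ in pts) / n, sum(y for _, y in pts) / n)
```

Each individual detection may be poor, but because the hand's pose is known from kinematics at every frame, evidence gathered across arbitrary arm motions registers into one consistent frame.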

Acquire a Visual Model