Implementing Tactile Behaviors Using FingerVision


Implementing Tactile Behaviors Using FingerVision
Akihiko Yamaguchi and Christopher G. Atkeson
Presentation by Jacob Best
*Special thanks to Google Slides "Science Fair" Template

Outline
- What is FingerVision?
- How does it work?
- Grasping Strategies
- Experiments

What is FingerVision?

FingerVision
FingerVision is a vision-based tactile sensor consisting of a soft skin and cameras.

FingerVision
- The soft layer, made of silicone, contains markers that allow the cameras to detect skin deformation.
- The transparency of the layers allows the cameras to detect and track nearby objects in the real world.
- It is cheap ($50/sensor) and easy (all calculations are done with OpenCV).

How does it work? That is, how does it actually perform grasping tasks?

Multimodality
Two basic functions of FingerVision:
- Proximity Vision - detecting the object being grasped and its movement through images (object shape/texture, slip, movement)
- Force Distribution - tracking the markers on the soft layer to estimate the grip force in the x, y, and z dimensions

Proximity Vision
All images are represented as histograms of colors.
- Movement Detection: uses cv::BackgroundSubtractorMOG2
- Object Model Construction: initially builds a model of the background; before gripping a new object, uses movement detection to build a model of the object, then applies erosion and dilation to refine it
- Tracking: on each new frame, applies the back projection method (cv::calcBackProject) with the object's histogram, then uses background subtraction again
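The tracking step can be sketched without OpenCV: build a color histogram from the object region, then score each pixel of a new frame by the probability of its value under that histogram (the idea behind cv::calcBackProject). This is a minimal single-channel NumPy sketch; the bin count and the 0.5 mask threshold are illustrative choices, not values from the paper.

```python
import numpy as np

def color_histogram(roi, bins=8):
    """Normalized histogram of a single-channel region of interest."""
    hist, _ = np.histogram(roi, bins=bins, range=(0, 256))
    return hist / max(hist.sum(), 1)

def back_project(frame, hist, bins=8):
    """Score each pixel by the probability of its value under the object histogram."""
    idx = (frame.astype(np.int32) * bins) // 256   # quantize pixel values to bins
    return hist[idx]

# Toy example: a bright 2x2 "object" on a dark background.
frame = np.zeros((6, 6), dtype=np.uint8)
frame[2:4, 2:4] = 200
hist = color_histogram(frame[2:4, 2:4])   # model built from the object ROI
scores = back_project(frame, hist)        # analogous to cv::calcBackProject
mask = scores > 0.5                       # high score = likely object pixel
```

The real pipeline works on color histograms and refines the mask with erosion and dilation; the sketch keeps only the back-projection idea.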

Force Distribution
Two methods for marker tracking: cv::meanShift and cv::SimpleBlobDetector.
Two stages for marker tracking:
- Calibration - find the markers by imaging the skin against a white background
- Tracking - follow each blob by selecting an ROI around it
Calculation of force: when normal force is applied, the markers visibly displace; each marker's displacement from its calibrated position is mapped to a force estimate.
[Figure: tracked markers; left: SimpleBlobDetector, right: MeanShift]
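The displacement-to-force idea can be sketched in a few lines: compare tracked marker centers with their calibrated positions and scale the displacement by a gain. The marker positions and the GAIN value here are made-up illustrative numbers, and the linear mapping is a simplification of the actual FingerVision calibration.

```python
import numpy as np

# Hypothetical calibrated marker centers (pixels) and their currently tracked
# positions; in the real sensor these come from blob detection on the skin.
calib = np.array([[10.0, 10.0], [30.0, 10.0], [10.0, 30.0]])
tracked = np.array([[12.0, 10.0], [30.0, 13.0], [10.0, 30.0]])

GAIN = 0.5  # assumed gain mapping pixel displacement to force units

def marker_forces(calib, tracked, gain=GAIN):
    """Per-marker force estimate proportional to displacement from calibration."""
    return gain * (tracked - calib)

forces = marker_forces(calib, tracked)
# Sum of the per-marker force norms: the quantity the grasp strategies threshold on.
total_force = np.linalg.norm(forces, axis=1).sum()
```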

Grasping Strategies Types of grasps and how to utilize Vision and Force to accomplish these tasks

Types of Grasps
- Gentle Grasp: uses the array of forces and a threshold on the sum of the norms of the forces to know when to stop gripping
- Holding: increases grip on the object when there is slippage; slip is defined as the number of moving points on the object exceeding a threshold
- Handover: opens the gripper when there is a certain change in force or slip
- In-Hand Manipulation (controlled slip): slowly decrease grip until slip is detected, then increase grip back to a hold; repeat until the target angle/orientation of the object is achieved
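The gentle-grasp and holding rules can be sketched as simple control loops. `read_forces`, `close_step`, `count_moving_points`, and `tighten_step` are hypothetical stand-ins for the FingerVision force-distribution and proximity-vision outputs and the gripper commands; the threshold values are assumptions, not numbers from the paper.

```python
import numpy as np

# Assumed thresholds; the actual values are not given on this slide.
FORCE_SUM_THRESHOLD = 2.0   # stop closing once the summed force norms exceed this
SLIP_POINT_THRESHOLD = 5    # "slip" = more moving points on the object than this

def gentle_grasp(read_forces, close_step):
    """Close the gripper until the sum of per-marker force norms crosses the threshold."""
    while np.linalg.norm(read_forces(), axis=1).sum() < FORCE_SUM_THRESHOLD:
        close_step()  # assumes each step eventually raises the sensed force

def holding_update(count_moving_points, tighten_step):
    """Tighten the grip when slip is detected; returns True if it tightened."""
    if count_moving_points() > SLIP_POINT_THRESHOLD:
        tighten_step()
        return True
    return False
```

In-hand manipulation would invert the holding rule: loosen until `holding_update` fires, re-tighten, and repeat until the object reaches the target orientation.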

Experiments
Demonstrations of the grasping strategies on real objects

Pouring Water into A Grasped Container

Gentle Grasp: Coke Can + Business Card

Holding

Holding for Lightweight Objects: Origami Bird

Handover

In-Hand Manipulation

Conclusion
- The manipulation strategies were effective with FingerVision (though In-Hand Manipulation was not as effective)
- A very impressive and effective grasping technique in general
- Creative but computationally and conceptually simple
- Next steps? It would be interesting to see how it performs on heavier and more irregularly shaped objects
- Practicality? The gentle grasp has potential for complicated robot tasks

Video