RSS 2011 Workshop on RGB-D Cameras

Presentation transcript:

RSS 2011 Workshop on RGB-D Cameras. A Constraint-Based Method for 3-DOF Haptic Rendering of Arbitrary Point Cloud Data. Adam Leeper, Sonny Chan, Kenneth Salisbury.

Overview: Motivation; Part One: Haptic Algorithm; Part Two: Real-Time Strategies; Results.

Motivation: Force feedback for teleoperation is costly to measure. We can use a model to estimate interaction forces. Remote sensors produce 3D points.

Potential Fields and Penalty Methods: Haptic force is computed from the current HIP (haptic interaction point) position. Force is proportional to penetration depth: F = -k·x. [Figure: penalty forces for geometric shapes and an infinite wall.]
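To make the penalty idea concrete, here is a minimal sketch (not from the slides) of a penalty force against an infinite wall; the stiffness value is hypothetical.

```python
import numpy as np

def penalty_force(hip_pos, plane_point, plane_normal, k=500.0):
    """Illustrative penalty force against an infinite wall.

    If the haptic interaction point (HIP) is inside the wall, push it back
    out with a spring force proportional to penetration depth (F = -k*x).
    k is a made-up stiffness in N/m.
    """
    n = plane_normal / np.linalg.norm(plane_normal)
    penetration = np.dot(plane_point - hip_pos, n)   # > 0 when the HIP is inside the wall
    if penetration <= 0.0:
        return np.zeros(3)                           # no contact, no force
    return k * penetration * n                       # force directed out of the wall
```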

Potential Fields and Penalty Methods: Haptic force is computed from the current HIP position. Force is proportional to penetration depth, which leads to "pop-through" when objects are thin.

Constraint-Based Methods: A proxy / god-object is constrained to the surface. A virtual spring connects the proxy to the HIP.
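A hedged sketch of the virtual spring coupling: the rendered force simply pulls the HIP toward the surface-constrained proxy. The stiffness value is hypothetical.

```python
import numpy as np

def proxy_spring_force(proxy_pos, hip_pos, k=500.0):
    """Virtual coupling between the surface-constrained proxy and the HIP.

    The proxy stays on the surface; the rendered force pulls the device
    (HIP) toward the proxy. k is a hypothetical coupling stiffness.
    """
    return k * (np.asarray(proxy_pos) - np.asarray(hip_pos))
```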

Some Previous Methods: Cha, Eid, El Saddik (EuroHaptics 2008): depth-image tessellation; proxy mesh algorithm. El-Far, Georganas, El Saddik (2008): per-point AABB collision detection; proxy constrained to discrete point locations.

Constraint-Based Methods: The constraint method works well for implicit surfaces (Salisbury & Tarr, 1997), where the sign of the implicit function separates the two sides of the surface (f < 0 vs. f > 0). [Image: http://xrt.wikidot.com/]

Constraint-Based Methods: A constraint plane is defined by a surface point and its normal.
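As a rough illustration of how such a plane constrains the proxy, the sketch below projects the HIP goal position back onto the tangent plane when it would fall behind the surface. This is a sketch of the idea, not the authors' exact update rule.

```python
import numpy as np

def constrain_to_plane(hip_goal, surface_point, surface_normal):
    """Keep the proxy on the outside of the constraint plane defined by a
    surface point and its normal, projecting the goal back if needed."""
    n = surface_normal / np.linalg.norm(surface_normal)
    signed_dist = np.dot(hip_goal - surface_point, n)
    if signed_dist >= 0.0:
        return hip_goal                  # goal is outside the surface: unconstrained
    return hip_goal - signed_dist * n    # project back onto the constraint plane
```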

From Points to an Implicit Surface: Great. So how do we make an implicit surface from points? First, we give each point a compact weighting function. Then we have two options: metaballs (constructive geometry) or surfels (surface estimation).

From Points to an Implicit Surface (Metaballs): Each point produces a 3D scalar field f(x, y, z). The net scalar field is simply the sum over all points. A threshold value, T, is chosen to define an isosurface on this field (the slides illustrate T = 0.2 and T = 0.6). Point normals are not needed!
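The slides do not specify the per-point kernel, so the sketch below assumes a common compactly supported polynomial falloff; the surface is the level set where the summed field equals T.

```python
import numpy as np

def metaball_field(x, points, R):
    """Sum of compactly supported per-point kernels (a Wyvill-style
    polynomial falloff is assumed here). Returns the field value at x."""
    d2 = np.sum((points - x) ** 2, axis=1) / (R * R)   # normalized squared distances
    d2 = np.clip(d2, 0.0, 1.0)                         # zero contribution beyond radius R
    w = (1.0 - d2) ** 3                                 # smooth falloff: w(0) = 1, w(R) = 0
    return float(np.sum(w))

def inside_isosurface(x, points, R, T=0.6):
    """The implicit surface is the level set field(x) = T; field > T counts as inside."""
    return metaball_field(x, points, R) > T
```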

From Points to an Implicit Surface (Surfels): Local surface estimation (Adamson and Alexa 2003) from a weighted-average point position and a weighted-average point normal.
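A minimal sketch of the Adamson and Alexa (2003)-style estimate: the implicit value at a query point is its signed distance to a local plane built from weighted-average position and normal. The Gaussian weight width used below is an assumption; the slides do not give one.

```python
import numpy as np

def surfel_implicit(x, points, normals, R):
    """Signed distance from x to a locally estimated plane (surfel)."""
    d2 = np.sum((points - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * (R / 3.0) ** 2))       # weights of nearby points (assumed width)
    w_sum = np.sum(w)
    if w_sum < 1e-9:
        return None                                # no points within support
    a = (w[:, None] * points).sum(axis=0) / w_sum  # weighted-average position
    n = (w[:, None] * normals).sum(axis=0)
    n /= np.linalg.norm(n)                         # weighted-average normal
    return float(np.dot(x - a, n))                 # signed distance to the local plane
```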

Selecting Parameters: Auto-generated parameters adapt easily to any input cloud! For each point, compute the average distance d to the nearest N = 3 neighbors and set R to some multiple m of d, generally m ≈ 2. For metaball rendering, T = 0.5 to 0.8 works for most data.
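A small sketch of that parameter rule, using SciPy's kd-tree for the neighbor search (the library choice is mine, not the slides'):

```python
import numpy as np
from scipy.spatial import cKDTree

def auto_radius(points, n_neighbors=3, m=2.0):
    """Per-point support radius R = m * (average distance to the nearest
    n_neighbors), following the slide's rule (N = 3, m ~= 2)."""
    tree = cKDTree(points)
    # k = n_neighbors + 1 because the closest hit is the point itself
    dists, _ = tree.query(points, k=n_neighbors + 1)
    d = dists[:, 1:].mean(axis=1)   # average distance to the nearest neighbors
    return m * d                    # one radius per point
```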

Part Two: Real-Time Strategies: Fast Collision Detection; Spatial Issues; Temporal Issues.

Fast Collision Detection: The haptic servo loop typically runs at 1 kHz, so we can't use all points! Points have only compact support of radius R. A kd-tree or octree provides fast radius and kNN searches.
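Since only points within the compact support radius R of the HIP can contribute to the surface, a radius query is enough inside the servo loop. The sketch below uses SciPy's cKDTree for illustration; the cloud and query values are hypothetical.

```python
import numpy as np
from scipy.spatial import cKDTree

def nearby_points(tree, cloud, hip_pos, R):
    """Return only the points within radius R of the HIP."""
    idx = tree.query_ball_point(hip_pos, r=R)
    return cloud[idx]

# usage sketch: build the tree once per incoming cloud, query it at ~1 kHz
# cloud = np.random.rand(100000, 3)
# tree = cKDTree(cloud)
# active = nearby_points(tree, cloud, np.array([0.5, 0.5, 0.5]), R=0.02)
```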

Spatial Issues: A 640×480 depth image is roughly 300,000 points. Some are outside the workspace of the haptic device. Sensor quantization noise should be filtered: our kinesthetic sense just isn't that good, and (most) haptic devices just aren't that good. Use a voxel-grid filter to down-sample the cloud.
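A minimal sketch of voxel-grid down-sampling: all points falling into the same voxel are replaced by their centroid. Libraries such as PCL provide the same filter; the voxel size below is a placeholder.

```python
import numpy as np

def voxel_grid_filter(points, voxel_size=0.005):
    """Replace every occupied voxel with the centroid of its points."""
    keys = np.floor(points / voxel_size).astype(np.int64)     # voxel index per point
    _, inverse = np.unique(keys, axis=0, return_inverse=True)
    inverse = inverse.reshape(-1)
    counts = np.bincount(inverse).astype(float)
    out = np.zeros((counts.size, 3))
    for dim in range(3):                                       # per-axis coordinate sums
        out[:, dim] = np.bincount(inverse, weights=points[:, dim]) / counts
    return out                                                 # one centroid per occupied voxel
```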

Temporal Issues: Cloud pre-processing must not interfere with the servo loop. Sensor noise feels like vibration, especially at edges. [Diagram: a cloud update thread processes each new cloud into a buffer of clouds 0 through N read by the servo thread; older clouds are discarded.] We used N = 4.
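One way to realize that hand-off is a small thread-safe rolling buffer: the update thread pushes pre-processed clouds, the servo thread takes a quick snapshot. This is a sketch of the idea only; the actual system keeps the last N = 4 clouds and discards older ones.

```python
import collections
import threading

class CloudBuffer:
    """Rolling buffer of the last N pre-processed clouds shared between
    the cloud update thread and the 1 kHz servo thread."""
    def __init__(self, n_clouds=4):
        self._lock = threading.Lock()
        self._clouds = collections.deque(maxlen=n_clouds)

    def push(self, processed_cloud):
        # update thread: append the newest cloud, oldest is dropped automatically
        with self._lock:
            self._clouds.append(processed_cloud)

    def snapshot(self):
        # servo thread: grab references quickly, never block for long
        with self._lock:
            return list(self._clouds)
```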

Bonus: Multiple Point Sources: This algorithm inherently handles multiple sensor clouds. The union of nearby points in each cloud is used for rendering. [Diagram: the same update-thread / servo-thread buffer, now fed by new clouds from multiple sensors.]
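A hypothetical helper showing the union step: query each sensor's cloud around the HIP and stack the results, so the rendering code sees a single point set.

```python
import numpy as np

def active_points_from_all_sensors(trees_and_clouds, hip_pos, R):
    """Union of points near the HIP across several sensor clouds.

    trees_and_clouds: list of (cKDTree, (n, 3) array) pairs, one per sensor.
    """
    parts = []
    for tree, cloud in trees_and_clouds:
        idx = tree.query_ball_point(hip_pos, r=R)
        if idx:
            parts.append(cloud[idx])
    return np.vstack(parts) if parts else np.empty((0, 3))
```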

Results: Metaballs are the only option for sparse or non-planar regions, but feel more wavy/knobbly in high-noise regions. Surfels give better spatial noise reduction for planar regions, but require normal estimation. Real-time performance: the Kinect updates at about 10 Hz; haptic loop time is under 200 µs.

Conclusions: Point clouds can be used to generate an implicit surface suitable for stable haptic rendering with no pop-through. Remote environments can be explored haptically with real-time updates from a 3D sensor. This strategy could be used to generate feedback constraint forces for robot teleoperation in a remote environment.

Acknowledgments: Thanks to colleagues Reuben Brewer and Gunter Niemeyer, the National Science Foundation, and NSERC of Canada. Come try the demo this afternoon!