
1 Visual Perception and Robotic Manipulation
Springer Tracts in Advanced Robotics, Chapter 3: Shape Recovery Using Robust Light Striping
Geoffrey Taylor and Lindsay Kleeman
Intelligent Robotics Research Centre (IRRC), Department of Electrical and Computer Systems Engineering, Monash University, Australia

2 Contents
Motivation for stereoscopic stripe ranging.
Benefits of our scanner.
Validation/reconstruction framework.
Image-based calibration technique.
Experimental results.
Conclusions and future work.

3 Motivation
Allow the robot to model and locate objects in the environment as a first step in manipulation.
Capture registered colour and depth data to aid intelligent decision making.

4 Conventional Scanner
Triangulate from two independent measurements: image-plane data and the laser stripe position.
A depth image is constructed by sweeping the stripe across the scene.
[Figure: stripe generator sweeping a light stripe over the scanned object, viewed by a camera; baseline B and depth D indicated.]

5 Difficulties
The light stripe is assumed to be the brightest feature in the image:
– Objects must be specially prepared (matte white paint)
– Scans are performed in low ambient light
– A high-contrast camera is used (cannot capture colour)
Noise sources invalidate the brightness assumption:
– Specular reflections
– Cross-talk between robots
– Stripe-like textures in the environment
For service robots, we need a robust scanner that does not rely on the brightness assumption!

6 Related Work
Robust scanners must validate stripe measurements.
Robust single-camera scanners:
– Validation from motion: Nygårds et al., 1994
– Validation from modulation: Haverinen et al., 1998
– Two intersecting stripes: Nakano et al., 1988
Robust stereo-camera scanners:
– Independent sensors: Trucco et al., 1994
– Known scene structure: Magee et al., 1994
Existing methods suffer from assumed scene structure, acquisition delay, or lack of error recovery.

7 Stereo Scanner
Stereoscopic light striping approach:
– Validation through three redundant measurements:
  measured stripe location on the left and right image planes,
  known angle of the light plane.
– Validation/reconstruction constraint: there must be some point on the known light plane that projects to the stereo measurements (within a threshold error) for those measurements to be valid.
– The reconstructed point is optimal with respect to measurement noise (uniform image-plane error).
– System parameters can be calibrated from a scan of an arbitrary non-planar target.

8 Validation/Reconstruction
[Figure: the laser plane, at a known position and angle with parameters Π, intersects the scanned surface at an unknown 3D point X (constrained to lie on the light plane). X projects through the left and right camera projection matrices ^L P and ^R P to the measurements ^L x and ^R x on the left and right image planes.]

9 Validation/Reconstruction
The unknown reconstruction projects to the measurements:
  ^L x = ^L P X,   ^R x = ^R P X
Find the reconstruction X that minimizes the image error:
  E = d^2(^L x, ^L P X) + d^2(^R x, ^R P X)
– subject to the constraint Π^T X = 0, i.e. X lies on the laser plane
– d^2(a, b) denotes the squared Euclidean distance between points a and b
If E < E_th then (^L x, ^R x) are valid measurements and X is the optimal reconstruction.
The above constrained optimization has an analytical solution in the case of rectilinear stereo.
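A minimal numerical sketch of this validation/reconstruction step (the slide notes an analytical solution for rectilinear stereo; this illustration instead uses a generic numerical minimizer, and the camera matrices, plane parameters, and threshold are assumed example inputs, not values from the paper):

import numpy as np
from scipy.optimize import minimize

def project(P, X):
    # Project a homogeneous 3D point X (4-vector) with a 3x4 camera matrix P to pixels.
    x = P @ X
    return x[:2] / x[2]

def validate_reconstruct(xl, xr, Pl, Pr, plane, E_th):
    # Find the point X on the light plane (a, b, c, d), a*x + b*y + c*z + d = 0,
    # minimizing E = d^2(xl, Pl X) + d^2(xr, Pr X); accept the pair if E < E_th.
    n_raw = np.asarray(plane[:3], float)
    X0 = -float(plane[3]) * n_raw / np.dot(n_raw, n_raw)  # a point on the plane
    n = n_raw / np.linalg.norm(n_raw)
    e1 = np.cross(n, [1.0, 0.0, 0.0])                     # two directions spanning the plane
    if np.linalg.norm(e1) < 1e-9:
        e1 = np.cross(n, [0.0, 1.0, 0.0])
    e1 /= np.linalg.norm(e1)
    e2 = np.cross(n, e1)

    def err(uv):
        X = np.append(X0 + uv[0] * e1 + uv[1] * e2, 1.0)  # homogeneous point on the plane
        return (np.sum((project(Pl, X) - xl) ** 2) +
                np.sum((project(Pr, X) - xr) ** 2))

    res = minimize(err, x0=[0.0, 0.0], method="Nelder-Mead")
    X = np.append(X0 + res.x[0] * e1 + res.x[1] * e2, 1.0)
    return X, res.fun, res.fun < E_th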

10 Calibration
System parameters: p = (k_1, k_2, k_3, θ_x, θ_z, B_0, m, c)
– k_1, k_2, k_3 relate to the laser position and camera baseline
– θ_x, θ_z, B_0 relate to the plane orientation
– m, c relate laser encoder counts e to the plane angle: θ_x = m·e + c
Take a scan of an arbitrary non-planar scene:
– Initially assume the laser is given by the brightest feature
Form the total reconstruction error E_tot over all points:
  E_tot = Σ_{j ∈ frames} Σ_{i ∈ scanlines} E(^L x_ij, ^R x_ij, e_j, p)
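As a rough illustration only, the total error is just the per-point error of the previous slide accumulated over every scanline of every frame; point_error below is a hypothetical stand-in for E(^L x_ij, ^R x_ij, e_j, p), and the data layout is assumed:

def total_error(p, frames, point_error):
    # E_tot = sum over frames j and scanlines i of E(xl_ij, xr_ij, e_j, p).
    # frames: list of (encoder count e_j, list of (xl_ij, xr_ij) stripe measurements).
    # point_error: hypothetical callable implementing the per-point error E.
    E_tot = 0.0
    for e_j, pairs in frames:
        for xl, xr in pairs:
            E_tot += point_error(xl, xr, e_j, p)
    return E_tot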

11 Calibration
Find p that minimizes the total error (assuming ^L x_ij, ^R x_ij, e_j are fixed):
  p* = argmin_p E_tot(p)
– Use Levenberg-Marquardt numerical minimization.
Refinement steps:
– The above solution will be inaccurate due to incorrect correspondences caused by the brightness assumption.
– Use the initial p* to validate (^L x_ij, ^R x_ij), reject invalid measurements, then recalculate p*.
– The above solution also assumes no error in the encoder counts e_j; this error is removed by iterative refinement of e_j and p*.
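A hedged sketch of how the fit and refinement might look with SciPy's Levenberg-Marquardt solver; the residual formulation, threshold, and point_error helper are illustrative rather than the authors' implementation, and the iterative refinement of the encoder counts e_j is omitted:

import numpy as np
from scipy.optimize import least_squares

def calibrate(p0, points, point_error, E_th, n_refine=3):
    # Estimate p by Levenberg-Marquardt, then reject correspondences that the
    # fitted parameters cannot validate (wrong matches picked under the
    # brightness assumption) and refit on the surviving measurements.
    # points: list of (xl, xr, e) gathered from the calibration scan.
    active = list(points)
    p = np.asarray(p0, float)
    for _ in range(n_refine):
        residuals = lambda q: np.array(
            [np.sqrt(point_error(xl, xr, e, q)) for xl, xr, e in active])
        p = least_squares(residuals, p, method="lm").x   # minimizes the sum of E
        active = [(xl, xr, e) for xl, xr, e in active
                  if point_error(xl, xr, e, p) < E_th]
    return p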

12 Implementation
[Figure: scanner hardware, showing the left and right cameras, the optical encoder, and the laser stripe generator.]

13 Image Processing
[Figure: processing pipeline from the raw stereo image through left/right fields, image differencing, and maxima extraction to left and right stripe candidates.]
When multiple candidates appear on a scan line, calculate the reconstruction error E for every candidate pair and choose the pair with the smallest E < E_th.
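Where a scan line yields several candidates in each image, the pairing rule above can be sketched as follows; point_error again stands in for the per-pair reconstruction error E of slide 9, and all names are illustrative:

def match_candidates(left_cands, right_cands, e, p, point_error, E_th):
    # Try every (left, right) candidate pair on one scan line and keep the pair
    # with the smallest reconstruction error E, provided E < E_th.
    # Returns (xl, xr, E) for the best valid pair, or None if no pair validates.
    best = None
    for xl in left_cands:
        for xr in right_cands:
            E = point_error(xl, xr, e, p)
            if E < E_th and (best is None or E < best[2]):
                best = (xl, xr, E)
    return best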

14 Scanning Process

15 Results
Reflection/cross-talk mirror experiment: the mirror generates bright false stripe measurements.

16 Results
Laser extracted as the brightest feature per line, versus all bright candidates extracted and matched using the validation condition.

17 More Results
Office phone and mirror scan results: brightest feature without validation, versus with validation.

18 Specular Objects
Tin can scan with specular reflections.

19 Specular Objects
Tin can scan results: brightest feature without validation, versus with validation.

20 Depth Discontinuities
Depth discontinuities cause association ambiguity.

21 Depth Discontinuities
Depth discontinuity scan results: without validation, versus with validation.

22 Conclusions
We have developed a mechanism for eliminating:
– sensor noise
– cross-talk
– ‘ghost’ stripes (reflections, striped textures, etc.)
Developed an image-based calibration technique requiring only an arbitrary non-planar target.
Operation in ambient light allows registered range and colour to be captured with a single sensor.
Experimental results validate the above techniques.

23 Future Directions
Use multiple simultaneous stripes to increase the acquisition rate:
– Multiple stripes can be disambiguated using the same framework that provides validation.
Perform object segmentation, modeling, and recognition on the scanned data to support grasp planning and manipulation on a service robot.

