Chayatat Ratanasawanya Min He April 6, 2010.  Recall previous presentation  The goal  Progress report ◦ Image processing ◦ depth estimation ◦ Camera.

Slides:



Advertisements
Similar presentations
Simulator of a SCARA Robot with LABVIEW
Advertisements

Solar Sail Attitude Control using a Combination of a Feedforward and a Feedback Controller D. Romagnoli, T. Oehlschlägel.
Chayatat Ratanasawanya Min He May 13, Background information The goal Tasks involved in implementation Depth estimation Pitch & yaw correction angle.
Visual Servo Control Tutorial Part 1: Basic Approaches Chayatat Ratanasawanya December 2, 2009 Ref: Article by Francois Chaumette & Seth Hutchinson.
Image Processing and Computer Vision Chapter 10: Pose estimation by the iterative method (restart at week 10) Pose estimation V4h31.
Pose Estimation Using Four Corresponding Points M.L. Liu and K.H. Wong, "Pose Estimation using Four Corresponding Points", Pattern Recognition Letters,
Qball-X4 Simulator Seang Cau February 16, 2011.
Intellectual Property Rights are governed by PEGASE Contract Annex II Part C and PEGASE consortium agreements. Before using, reproducing, modifying or.
Xbxb dbdb dtdt γ nvnv θ xtxt npnp hphp ngng α H f ground plane image plane (inverse) gravity ground plane orientation ground plane height object vertical.
GETTING THE RIGHT AMOUNT OF LIGHT TO MAKE THE PICTURE.
Scaled Helicopter Mathematical Model and Hovering Controller Brajtman Michal & Sharabani Yaki Supervisor : Dr. Rotstein Hector.
Hybrid Position-Based Visual Servoing
UAV pose estimation using POSIT algorithm
Image Correspondence and Depth Recovery Gene Wang 4/26/2011.
Active Calibration of Cameras: Theory and Implementation Anup Basu Sung Huh CPSC 643 Individual Presentation II March 4 th,
CH24 in Robotics Handbook Presented by Wen Li Ph.D. student Texas A&M University.
Physics 52 - Heat and Optics Dr. Joseph F. Becker Physics Department San Jose State University © 2003 J. F. Becker San Jose State University Physics 52.
Model Independent Visual Servoing CMPUT 610 Literature Reading Presentation Zhen Deng.
Vision-Based Motion Control of Robots
Stereoscopic Light Stripe Scanning: Interference Rejection, Error Minimization and Calibration By: Geoffrey Taylor Lindsay Kleeman Presented by: Ali Agha.
MEAM 620 Project Report Nima Moshtagh.
COMP322/S2000/L221 Relationship between part, camera, and robot (cont’d) the inverse perspective transformation which is dependent on the focal length.
Lecture 11: Structure from motion CS6670: Computer Vision Noah Snavely.
Passive Object Tracking from Stereo Vision Michael H. Rosenthal May 1, 2000.
Image-based Control Convergence issues CMPUT 610 Winter 2001 Martin Jagersand.
Image-Based Rendering using Hardware Accelerated Dynamic Textures Keith Yerex Dana Cobzas Martin Jagersand.
COMP322/S2000/L23/L24/L251 Camera Calibration The most general case is that we have no knowledge of the camera parameters, i.e., its orientation, position,
Properties of Lenses Dr. Kenneth Hoffman. Focal Length The distance from the optical center of a lens to the film plane (imaging chip) when the lens is.
Lecture 12: Structure from motion CS6670: Computer Vision Noah Snavely.
Journey Through a Camera
Optimal Placement and Selection of Camera Network Nodes for Target Localization A. O. Ercan, D. B. Yang, A. El Gamal and L. J. Guibas Stanford University.
1 DARPA TMR Program Collaborative Mobile Robots for High-Risk Urban Missions Second Quarterly IPR Meeting January 13, 1999 P. I.s: Leonidas J. Guibas and.
WP3 - 3D reprojection Goal: reproject 2D ball positions from both cameras into 3D space Inputs: – 2D ball positions estimated by WP2 – 2D table positions.
오 세 영, 이 진 수 전자전기공학과, 뇌연구센터 포항공과대학교
Cameron Elliott Garabed Tashian Jeff Crispo.  In North America, legal blindness is defined as a visual acuity of 20/200  39 million are blind worldwide.
Course 12 Calibration. 1.Introduction In theoretic discussions, we have assumed: Camera is located at the origin of coordinate system of scene.
Newton's Method for Functions of Several Variables Joe Castle & Megan Grywalski.
Columbia GraspIt!: A Versatile Simulator for Robotic Grasping Andrew T. Miller Columbia University.
Introduction to Engineering Camera Lab #3 - 1 Agenda Do parts I and II of the lab Record data Answer questions.
MARS Design Review PP-1 Requirements Definition Performance: –Tip/Tilt error < 0.06″ rms mirror coordinates 0.12″ rms image coodinates 0.28″ fwhm image.
CS348B Lecture 7Pat Hanrahan, 2005 Camera Simulation EffectCause Field of viewFilm size, stops and pupils Depth of field Aperture, focal length Motion.
Ch. 3: Geometric Camera Calibration
CS-498 Computer Vision Week 7, Day 2 Camera Parameters Intrinsic Calibration  Linear  Radial Distortion (Extrinsic Calibration?) 1.
0 Assignment 1 (Due: 3/9) The projections of two parallel lines, l 1 and l 2, which lie on the ground plane G, onto the image plane I converge at a point.
Dynamic Models of the Draganflyer
Error Introduced by: Image Analysis Software Camera & Part Holder “..any sufficiently advanced technology is indistinguishable from magic.” Arthur C. Clark.
June 10, 2013 Presented by Joshua Vallecillos Supervised By Christine Wittich, Ph.D. Student, Structural Engineering.
1 Imaging Techniques for Flow and Motion Measurement Lecture 20 Lichuan Gui University of Mississippi 2011 Stereo High-speed Motion Tracking.
Existing Draganflyer Projects and Flight Model Simulator Demo Chayatat Ratanasawanya 5 February 2009.
V ISION -B ASED T RACKING OF A M OVING O BJECT BY A 2 DOF H ELICOPTER M ODEL : T HE S IMULATION Chayatat Ratanasawanya October 30, 2009.
Chayatat Ratanasawanya May 18, Overview Recalls Progress & Achievement Results 2.
Chapter 2: The Lens. Focal Length is the distance between the center of a lens and the film plane when focused at infinity.
1 WRF-EnKF Lightning Assimilation Real-Observation Experiments Overview Cliff Mass, Greg Hakim, Phil Regulski Department of Atmospheric Sciences University.
Distance Estimation Ohad Eliyahoo And Ori Zakin. Introduction Current range estimation techniques require use of an active device such as a laser or radar.
Arizona’s First University. Command and Control Wind Tunnel Simulated Camera Design Jacob Gulotta.
Using Sensor Data Effectively
Automated Spotsize Measurements
Zaid H. Rashid Supervisor Dr. Hassan M. Alwan
Image Processing for Physical Data
Overview Pin-hole model From 3D to 2D Camera projection
Aperture & Depth of Field
Lab 10: Lenses Focal Length Magnification Lens Equation Depth of Field
Physics-based simulation for visual computing applications
Visual Tracking on an Autonomous Self-contained Humanoid Robot
The focal length of a lens
Depth Of Field (DOF).
Distributed Ray Tracing
The Image The pixels in the image The mask The resulting image 255 X
Dynamic Modeling PDR Dynamic Modeling Preliminary Design Review for Vehicle and Avionics October 17, 2000 Presented By: Christopher Peters …and that’s.
Presentation transcript:

Chayatat Ratanasawanya Min He April 6, 2010

 Recall previous presentation
 The goal
 Progress report
  ◦ Image processing
  ◦ Depth estimation
  ◦ Camera placement
 Obstacles
  ◦ Combine image processing and control Simulink models
 Idea for the next step
 Questions/Comments

 4 visual-servoing structures:
  ◦ Dynamic position-based look-and-move
  ◦ Dynamic image-based look-and-move
  ◦ Position-based visual servoing (PBVS)
  ◦ Image-based visual servoing (IBVS)
 Implemented a simulation of the dynamic position-based look-and-move system.
 Implemented a Simulink model to locate the centroid of a ping-pong ball in an image.
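The centroid-locating step can be sketched as follows (a hypothetical Python illustration; the project itself implemented this as a Simulink model, and the mask-based segmentation shown here is assumed):

```python
# Locate the centroid of a segmented ball in a binary image mask.
# The mask is a 2D list of 0/1 values; 1 marks ball pixels.

def ball_centroid(mask):
    """Return the (row, col) centroid of nonzero pixels, or None if empty."""
    total = r_sum = c_sum = 0
    for r, row in enumerate(mask):
        for c, v in enumerate(row):
            if v:
                total += 1
                r_sum += r
                c_sum += c
    if total == 0:
        return None
    return (r_sum / total, c_sum / total)

mask = [
    [0, 0, 0, 0],
    [0, 1, 1, 0],
    [0, 1, 1, 0],
    [0, 0, 0, 0],
]
print(ball_centroid(mask))  # centroid of the 2x2 blob -> (1.5, 1.5)
```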

 Implement the system using PBVS, IBVS, or both techniques.
 Tasks to tackle:
  ◦ Image processing
  ◦ Depth estimation
  ◦ Camera placement on the helicopter model
  ◦ Combining image processing and control Simulink models
  ◦ Jacobian matrix derivation
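For the Jacobian derivation task, the classical starting point in IBVS is the interaction matrix of a point feature (as in the Chaumette & Hutchinson tutorial recalled earlier). A minimal sketch, assuming normalized image coordinates and a known feature depth Z; the presentation's own derivation may differ:

```python
# Image Jacobian (interaction matrix) of a single point feature in the
# classical IBVS formulation: [x_dot, y_dot]^T = L @ camera_velocity,
# where camera_velocity = (vx, vy, vz, wx, wy, wz).

def interaction_matrix(x, y, Z):
    """2x6 interaction matrix for a point at normalized coords (x, y), depth Z."""
    return [
        [-1.0 / Z, 0.0, x / Z, x * y, -(1.0 + x * x), y],
        [0.0, -1.0 / Z, y / Z, 1.0 + y * y, -x * y, -x],
    ]

L = interaction_matrix(0.1, -0.2, 2.0)
print(L)
```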

 Use the diameter of the ball in the image to estimate the depth.
(Diagram: depth D from the center of projection to the ball; focal length F = 538 pixels; actual ball diameter d_b = 40 mm; ball diameter on the image d_img. By similar triangles, D = F·d_b / d_img.)
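The similar-triangles relation can be checked numerically. F = 538 pixels and d_b = 40 mm are the values given on the slide; the image diameter used below is a hypothetical measurement:

```python
# Depth from apparent ball size: by similar triangles,
#   d_img / F = d_b / D   =>   D = F * d_b / d_img

F_PIXELS = 538.0          # focal length in pixels (from the slide)
BALL_DIAMETER_MM = 40.0   # real ping-pong ball diameter (from the slide)

def estimate_depth_mm(d_img_pixels):
    """Depth to the ball in mm, given its diameter in the image (pixels)."""
    return F_PIXELS * BALL_DIAMETER_MM / d_img_pixels

# e.g. a ball measuring 20 px across would be about 1.076 m away
print(estimate_depth_mm(20.0))  # -> 1076.0
```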

(Table: ball area and measured image diameter vs. real distance and calculated distance.)

 Combining our image processing model and the control model has been a challenging task.
(Diagram: the two Simulink models linked through a global variable.)
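The coupling pattern labeled on the slide can be illustrated in miniature: a vision step writes the latest pixel error into a shared global, and the control step reads it. This is a hypothetical Python sketch of the pattern only; the project combines Simulink models, and the image-center and gain values below are assumptions:

```python
# Couple an image-processing loop and a controller through a shared
# global, mirroring the "global variable" coupling on the slide.

latest_error = (0.0, 0.0)  # (e_x, e_y) written by vision, read by control

def vision_step(centroid, image_center=(160.0, 120.0)):
    """Update the shared pixel error from a newly detected ball centroid."""
    global latest_error
    latest_error = (centroid[0] - image_center[0],
                    centroid[1] - image_center[1])

def control_step(gain=0.01):
    """Read the shared error and produce proportional angle increments."""
    e_x, e_y = latest_error
    return (gain * e_x, gain * e_y)

vision_step((180.0, 110.0))
print(control_step())  # -> (0.2, -0.1)
```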

(Diagram: image errors e_x and e_y are converted into increments in the pitch and yaw angles, which feed the LQR controller.)
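One common way to turn the pixel errors above into pitch and yaw correction increments is through the focal length of the pinhole model. This is a hedged sketch of that conversion only (the slide's LQR controller would then act on the increments), reusing the F = 538 px focal length quoted earlier:

```python
import math

# Convert image-plane errors (pixels) into small yaw/pitch correction
# angles via the pinhole model: angle = atan(error / F).

F_PIXELS = 538.0  # focal length in pixels (from the depth-estimation slide)

def correction_angles(e_x, e_y):
    """Return (yaw, pitch) increments in radians for pixel errors (e_x, e_y)."""
    yaw = math.atan2(e_x, F_PIXELS)    # horizontal error -> yaw increment
    pitch = math.atan2(e_y, F_PIXELS)  # vertical error -> pitch increment
    return yaw, pitch

yaw, pitch = correction_angles(50.0, -20.0)
print(math.degrees(yaw), math.degrees(pitch))
```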

 Questions and comments are welcome.