PhaseSpace Optical Navigation Development & Multi-UAV Closed-Loop Control Demonstration
Texas A&M University Telecon with Boeing
December 7, 2009


PhaseSpace Optical Navigation Adaptation (PSONA) Team
Faculty Mentors: Ms. Magda Lagoudas, Dr. John L. Junkins, Dr. Raktim Bhattacharya

Overview
– PSONA semester goals
– PhaseSpace calibration (Albert Soto & Zach Sunberg)
– Outfitting quad-rotors with PhaseSpace/Vicon (Erin Mastenbrook)
– Quad-rotor custom electronics board design (Winnie Lung & Kate Stuckman)
– A look at the next semester

Semester Goals: Characterizing and Calibrating PhaseSpace
– Characterize the PhaseSpace system
– Develop a method of calibration
– Evaluate the accuracy and reliability of PhaseSpace by comparing to Indoor GPS
– Compare all results to the capabilities of Vicon

Semester Goals: UAV Demonstration
– Demonstrate the capability of using PhaseSpace as a vision-based localization solution for multi-UAV coordination
– Equip two DraganFlyer quad-rotors and the LASR Lab to enable multi-vehicle autonomous flight
– Next semester: develop user interfaces and control algorithms for autonomous flight

PhaseSpace Calibration
Error in the PhaseSpace system arises primarily from optical noise and from misalignments in the camera's internal geometry (biases). By determining the true geometry of the camera, bias error can be compensated for, yielding a more accurate "best guess" of the beacon location. Noise characterization then describes how reliable that "best guess" measurement is.
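As a minimal mathematical sketch (our notation; not shown on the slides), this error model can be written as

\xi = h(\mathbf{u};\, \boldsymbol{\theta}) + \nu

where \xi is the pixel value returned by the camera, h is the camera model, \mathbf{u} is the beacon location, \boldsymbol{\theta} collects the intrinsic parameters, and \nu is zero-mean optical noise. Calibration estimates \boldsymbol{\theta}, removing the bias that an incorrect internal geometry would otherwise introduce through h; noise characterization then estimates the variance of \nu as a function of beacon position.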

Approach
– Develop a model that describes a camera's output as a function of its intrinsic parameters and the beacon position.
– Construct an algorithm to determine more accurate (less biased) camera parameters.
– Using these improved parameters, run several tests to gather data on how the noise varies with beacon position in each dimension.

Mathematical Model
Extrinsic parameters:
– u – beacon location
– u0 – camera mounting point
– camera's Euler angles
Intrinsic parameters:
– pz – pinhole depth
– sz – sensor center depth
– α – angle of the sensor's axis from horizontal within the ideal plane
– β – angle between the sensor and the ideal plane
– px' – distance from mount to pinhole along the sensor's axis
– sx' – distance from mount to sensor center along the sensor's axis

Mathematical Model
A set of four equations describes an output pixel in terms of the basic intrinsic and extrinsic parameters. b0, f, and δ are intermediate values, R is the rotation matrix built from the camera's Euler angles, and ξ is the pixel value returned by the camera.
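The four equations appeared as an image on the original slide and are not preserved here; the following is a hedged reconstruction of one plausible form for a single linear sensor, derived by similar triangles through the pinhole using the parameters defined above (sign conventions are ours, so read this as a sketch rather than the team's exact formulation):

\mathbf{b}_0 = R\,(\mathbf{u} - \mathbf{u}_0), \qquad
f = p_z - s_z, \qquad
\delta = \frac{f\,\big(b_{0,x'} - p_{x'}\big)}{p_z - b_{0,z}}, \qquad
\xi = \big(p_{x'} + \delta\big) - s_{x'}

Here \mathbf{b}_0 is the beacon position in the camera frame, f is the pinhole-to-sensor separation, \delta is the displacement of the projected image from the point directly under the pinhole, and \xi is the image position along the sensor measured from the sensor center. The angles α and β enter through the rotation from the camera frame into the sensor's (x', z) coordinates.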

Calibration Simulation
The ideal intrinsic parameter values shown below were passed to the GLSDC algorithm as initial guesses. Biased "true" values (not preserved in this transcript) were used to generate output for eight beacon locations.

Ideal Camera   Sensor 1   Sensor 2
α (deg.)       -45        45
β (deg.)       0          0
sx' (in.)      1.77       1.77
sz (in.)       0.100      0.100
px' (in.)      1.77       1.77
pz (in.)       1.81       1.81

Calibration Simulation
Shown below are the % errors of the ideal (initial-guess) and GLSDC-derived parameters relative to the true camera. For this trial, noiseless measurements from the hypothetical true camera were used, and the parameters were recovered accurately to several decimal places.

Error (ideal)   Sensor 1   Sensor 2
α               5.263%     7.143%
β               100.0%     100.0%
sx'             2.210%     1.724%
sz              4.762%     5.263%
px'             0.568%     1.667%
pz              0.556%     1.630%

Error (GLSDC)   Average   Max
α               0.000%    0.000%
β               0.000%    0.000%
sx'             0.000%    0.000%
sz              0.000%    0.000%
px'             0.000%    0.000%
pz              0.000%    0.000%

Calibration Simulation
Next, the simulated noise was set to a value slightly greater than the maximum observed in testing. The average and maximum % errors over a set of 5 trials are shown below. In all cases the GLSDC code produced more accurate parameters, often by a large margin; sensor depth (sz) was estimated less accurately than the other parameters.

Error (ideal)   Sensor 1   Sensor 2
α               5.263%     7.143%
β               100.0%     100.0%
sx'             2.210%     1.724%
sz              4.762%     5.263%
px'             0.568%     1.667%
pz              0.556%     1.630%

Error (GLSDC)   Average   Max
α               0.003%    0.010%
β               0.343%    0.443%
sx'             0.043%    0.101%
sz              2.896%    4.740%
px'             0.045%    0.100%
pz              0.160%    0.273%

Calibration Testing
– Two cameras were tested using three different arrangements of 6 beacons in known locations
– GLSDC was run on all three data sets simultaneously
– Results were similar for both cameras

Ideal Camera   Sensor 1   Sensor 2
α (deg.)       -45        45
β (deg.)       0          0
sx' (in.)      1.77       1.77
sz (in.)       0.100      0.100
px' (in.)      1.77       1.77
pz (in.)       1.81       1.81

(The biased "true" parameter values for this test were not preserved in the transcript.)

Distortion Testing
– Characterized the lens distortion by taking measurements across the FOV of each imager.
– For each imager, the camera was tilted +45° or -45° to isolate one sensor (setting one imager vertical and the other horizontal).
– The test camera was leveled in the plane of the test by ensuring that the vertical sensor's outputs for a beacon at the two FOV endpoints are equal (the horizontal imager is then parallel to the optical table, assuming the imagers are orthogonal).
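A hedged Matlab sketch of how such a sweep might be reduced to a distortion correction (the numbers, the linear ideal response, and the cubic residual model are illustrative assumptions, not the team's procedure):

% Reduce an FOV sweep to a polynomial distortion correction.
% Column 1: true beacon angle across the FOV (deg); column 2: raw sensor output.
data  = [-30 -1042; -15 -517; 0 3; 15 521; 30 1048];   % illustrative sweep
theta = data(:,1);
xi    = data(:,2);

% Assume the ideal (distortion-free) response is linear in tan(theta).
xi_ideal = xi(end) * tand(theta) / tand(theta(end));

% Fit the residual distortion as a cubic polynomial in the raw output...
p = polyfit(xi, xi - xi_ideal, 3);

% ...and subtract the fitted distortion from any later measurement.
xi_corrected = xi - polyval(p, xi);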

Distortion Testing
[Figure: distortion test results; plot not preserved in the transcript.]

Calibration Software
The calibration software will take in data previously gathered by the cameras. The cameras gather this data by viewing the calibration rig (a circle of beacons) at many positions and orientations. The software will calibrate the positions and orientations of all of the cameras as well as their intrinsic parameters. (It will also estimate the position of the rig in each frame, though that byproduct is not needed.)

Calibration Software Initialization
y – measurements; x – calibration parameters

Main function:
– Read in measurements
– Load typical intrinsic parameters
– Assume initial rig state is zero
– Estimate camera states (from the initial rig state)
– Estimate rig states (from the camera states)
– While difference in y > tolerance: run a GLSDC iteration on (measurements, calibration parameters) [next slide]

GLSDC Iteration
y – measurements; x – calibration parameters

While difference in y > tolerance, GLSDC iteration(y, x):
– new y = SimulateSystem(x)
– Δy = y − new y
– H = GenerateH(x, constraints)
– W = GenerateWeightMatrix()
– Δx = (Hᵀ W H)⁻¹ Hᵀ W Δy
– x = x + Δx
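A minimal Matlab sketch of the loop above, with a toy two-parameter model standing in for the camera model (the model h, numJacobian, and all numbers are illustrative; SimulateSystem and GenerateH in the real code are more involved):

% GLSDC (Gauss least squares differential correction) on a toy model.
t      = linspace(0, 1, 20)';               % stand-in for beacon locations
h      = @(x) x(1) * exp(x(2) * t);         % toy nonlinear measurement model
x_true = [1.5; -0.7];
y      = h(x_true) + 1e-4 * randn(20, 1);   % noisy "measurements"
W      = eye(20);                           % weight matrix (identity here)

x = [1; 0];                                 % initial ("ideal") parameter guess
for iter = 1:50
    y_new = h(x);                           % simulate system at current x
    dy    = y - y_new;                      % difference in y (residual)
    H     = numJacobian(h, x);              % sensitivity matrix dh/dx
    dx    = (H' * W * H) \ (H' * W * dy);   % normal-equations correction
    x     = x + dx;                         % apply difference in x
    if norm(dx) < 1e-10, break; end         % converged
end
disp(x)                                     % close to x_true

function H = numJacobian(h, x)
    % Forward-difference Jacobian of h at x.
    y0   = h(x);
    H    = zeros(numel(y0), numel(x));
    step = 1e-7;
    for k = 1:numel(x)
        xk = x;  xk(k) = xk(k) + step;
        H(:, k) = (h(xk) - y0) / step;
    end
end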

Coding Progress
– Created Matlab classes for the measurements and the calibration parameters. These handle access to all of the parameters and allow them to be used as vectors.
– Example: if x is an object of class CalParams, then x.Vec acts as a vector for use in GLSDC, and x.IntParams(2,1).F accesses the intrinsic parameter F of sensor 1 in camera 2.
– I had never worked with classes in Matlab, so development has taken longer than initially expected, but we hope that a strong framework will pay off in the future.
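A hedged sketch of what a CalParams-style class could look like (only Vec and IntParams(camera, sensor).F are named on the slide; every other field and name below is an illustrative assumption):

classdef CalParams
    properties
        IntParams   % nCameras-by-nSensors struct array of intrinsics
    end
    properties (Dependent)
        Vec         % all parameters flattened into one column vector
    end
    methods
        function obj = CalParams(nCameras, nSensors)
            % Illustrative intrinsic fields; the real class presumably also
            % holds alpha, beta, sx', sz, px', pz per sensor.
            s = struct('F', 1.0, 'Alpha', 0, 'Beta', 0);
            obj.IntParams = repmat(s, nCameras, nSensors);
        end
        function v = get.Vec(obj)
            % Flatten every field of every sensor into one vector for GLSDC.
            c = struct2cell(obj.IntParams);
            v = cell2mat(c(:));
        end
    end
end

Usage: x = CalParams(2, 2); then x.IntParams(2,1).F reads camera 2, sensor 1, and x.Vec stacks all parameters into the vector GLSDC updates.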

Previous Semester Accomplishments
– Became familiar with Matlab and GLSDC
– Researched and purchased computer hardware for quad-rotor off-board computing
– Wrote software for finding an initial estimate of a camera's position and orientation

Tasks for Next Semester
– Finish writing the calibration software
– Help write software for determining positions using the data we gather from calibration
– Help write control software for the quad-rotors

Quad-Rotor Structure

[Figure: quad-rotor layout in the X-Y plane, showing the rotors, PhaseSpace beacon, Vicon reflector, basic stabilization kit rods, and upper layer rods. Dimensions: R = 131.6 mm, L_P = 8.3 mm, L_V = 14.3 mm; the angle Φ locates the mount, using the top sign if the beacon is to the clockwise side.]

Semester Summary
– Designed a T joint, corner piece, center piece, and mounts for the PhaseSpace beacons and Vicon reflectors in SolidWorks, and had the parts created on a rapid prototyping machine
– Outfitted 2 quad-rotors with the parts, beacons, reflectors, and electrical wiring
– Created files for Vicon and PhaseSpace that allow them to recognize the individual quad-rotors
– Will outfit 3 more quad-rotors next semester

Quad-Rotor Video

Schematic

Current Board Layout

Semester Summary
– Completed schematic
  – Learned circuit design basics
  – Learned EAGLE
  – Learned how to interpret datasheets
– Finalized parts list
– Began board layout

Future Plans
– Before end of semester
  – Finish board layout
  – Order board and parts
– Early next semester
  – Assemble board
  – Test board
  – Make necessary changes

Next Semester
– Assemble calibration rig and perform comparative testing between Vicon and PhaseSpace, using iGPS as truth
– Complete the PhaseSpace calibration code
– Assemble & debug the custom electronics board
– Interface the quad-rotor board with off-board computing & Vicon/PhaseSpace feedback
– Outfit three additional quad-rotors (for a total of 5)
– Develop models & control laws for autonomous flight

Questions? Comments?