1 PhaseSpace Optical Navigation Development & Multi-UAV Closed-Loop Control Demonstration
Texas A&M University Telecon with Boeing, December 7, 2009
PhaseSpace Optical Navigation Adaptation (PSONA) Team
Faculty Mentors: Ms. Magda Lagoudas, Dr. John L. Junkins, Dr. Raktim Bhattacharya

2 Overview
PSONA Semester Goals
PhaseSpace calibration (Albert Soto & Zach Sunberg)
Outfitting quad-rotors with PhaseSpace/Vicon (Erin Mastenbrook)
Quad-rotor custom electronics board design (Winnie Lung & Kate Stuckman)
A look at the next semester

3 Semester Goals: Characterizing and Calibrating PhaseSpace
Characterize the PhaseSpace system
Develop a method of calibration
Evaluate the accuracy and reliability of PhaseSpace by comparing to Indoor GPS
Compare all results to the capabilities of Vicon

4 Semester Goals: UAV Demonstration
Demonstrate the capability of utilizing PhaseSpace as a vision-based localization solution for multi-UAV coordination
Equip two DraganFlyer quad-rotors and the LASR Lab to enable multi-vehicle autonomous flight
Next semester: develop user interfaces and control algorithms for autonomous flight

5 PhaseSpace Calibration
Error in the PhaseSpace system arises primarily from optical noise and from misalignments in the camera's internal geometry (biases). By determining the true geometry of the camera, bias error may be compensated for, resulting in a more accurate "best guess" of the beacon location. Noise characterization then describes how reliable that "best guess" measurement is.

6 Approach
Develop a model that describes a camera's output as a function of its intrinsic parameters and the beacon position.
Construct an algorithm to determine more accurate (less biased) camera parameters.
Using these improved parameters, run several tests to gather data reflecting the correlation of noise with beacon position in each dimension.

7 Mathematical Model
Extrinsic parameters:
    u – beacon location
    u0 – mounting point
    the camera's Euler angles
Intrinsic parameters:
    pz – pinhole depth
    sz – sensor center depth
    α – angle of the sensor's axis from horizontal within the ideal plane
    β – angle between the sensor and the ideal plane
    px' – distance from mount to pinhole along the sensor's axis
    sx' – distance from mount to sensor center along the sensor's axis

8 Mathematical Model
This slide's set of four equations (rendered as images on the slide and not reproduced in this transcript) describes an output pixel in terms of the basic intrinsic and extrinsic parameters. b0, f, and δ are intermediate values, R is the rotation matrix containing the camera's Euler angles, and ξ is the pixel value returned by the camera.
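Since the slide's four equations are images and not recoverable from this transcript, the following is only a rough illustration of the same structure for a single linear sensor: the pixel value ξ as a function of the extrinsic parameters (beacon location u, mount point u0, rotation R) and the slide's intrinsic parameters px', pz, sx', sz. The exact projection geometry here is an assumption, not the team's model.

```python
# Illustrative sketch only: a generic single-sensor pinhole projection using
# the slide's parameter names. NOT the team's actual four-equation model.
import numpy as np

def pixel_output(u, u0, R, p_xp, p_z, s_xp, s_z):
    # Express the beacon in the camera frame (extrinsic part).
    c = R @ (np.asarray(u, float) - np.asarray(u0, float))
    # Ray from the beacon through the pinhole at (p_xp, p_z), intersected
    # with the sensor plane at depth s_z (intrinsic part).
    t = (s_z - p_z) / (c[2] - p_z)
    x_image = p_xp + t * (c[0] - p_xp)
    # Pixel value is measured from the sensor center at s_xp.
    return x_image - s_xp

# Beacon at x = 1, z = 2; pinhole on the mount axis at depth 1; sensor at depth 0.
xi = pixel_output([1.0, 0.0, 2.0], [0, 0, 0], np.eye(3), 0.0, 1.0, 0.0, 0.0)
# The image is inverted through the pinhole: xi = -1.
```

Calibration then amounts to recovering the intrinsic arguments of such a function from pixel measurements of beacons at known locations.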

9 Calibration Simulation
The ideal intrinsic parameter values shown below were passed into the GLSDC algorithm as initial guess values. Biased "true" values were used to generate output for eight beacon locations.

Ideal Camera    Sensor 1   Sensor 2
α (deg.)        -45        45
β (deg.)        0          0
sx' (in.)       1.77       1.77
sz (in.)        0.100      0.100
px' (in.)       1.77       1.77
pz (in.)        1.81       1.81

True Camera     Sensor 1   Sensor 2
α (deg.)        -47.5      42
β (deg.)        2.0        -1.5
sx' (in.)       1.81       1.74
sz (in.)        0.105      0.095
px' (in.)       1.76       1.80
pz (in.)        1.80       1.84

10 Calibration Simulation
Shown below is the % error of the ideal camera's parameters vs. the true camera. For this trial, noiseless measurements from the hypothetical true camera were used, and the GLSDC-derived parameters matched the true values to several decimal places (0.000% error for all parameters).

Error (ideal)   Sensor 1   Sensor 2
α               5.263%     7.143%
β               100.0%
sx'             2.210%     1.724%
sz              4.762%     5.263%
px'             0.568%     1.667%
pz              0.556%     1.630%
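The metric behind these tables is the percent error of each recovered parameter relative to the "true" camera. As a minimal sketch, feeding it the slide-9 values for α (ideal -45° vs. true -47.5° on sensor 1, and 45° vs. 42° on sensor 2) reproduces the 5.263% and 7.143% entries:

```python
# Percent error of a parameter estimate relative to the true camera value.
def percent_error(estimate, true_value):
    return 100.0 * abs((estimate - true_value) / true_value)

e1 = percent_error(-45.0, -47.5)  # 5.263...% for sensor 1's alpha
e2 = percent_error(45.0, 42.0)    # 7.142...% for sensor 2's alpha
```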

11 Calibration Simulation
Next, the noise was set to a value slightly greater than the maximum observed in testing. The average and maximum % errors from a set of 5 trials are shown. In all cases, the GLSDC code produced more accurate parameters, often by a large margin. Sensor depth was estimated less accurately than the other parameters.

Error (ideal)   Sensor 1   Sensor 2
α               5.263%     7.143%
β               100.0%
sx'             2.210%     1.724%
sz              4.762%     5.263%
px'             0.568%     1.667%
pz              0.556%     1.630%

Error (GLSDC)   Average    Max
α               0.003%     0.010%
β               0.343%     0.443%
sx'             0.043%     0.101%
sz              2.896%     4.740%
px'             0.045%     0.100%
pz              0.160%     0.273%

12 Calibration Testing
Two cameras were tested using three different arrangements of 6 beacons in known locations.
GLSDC was run on all three data sets simultaneously.
Results were similar for both cameras.

Ideal Camera    Sensor 1   Sensor 2
α (deg.)        -45        45
β (deg.)        0          0
sx' (in.)       0.707      -0.707
sz (in.)        0.100      0.100
px' (in.)       0.707      -0.707
pz (in.)        1.81       1.81

True Camera     Sensor 1   Sensor 2
α (deg.)        -44.08     45.49
β (deg.)        2.13       -0.75
sx' (in.)       0.462      -0.566
sz (in.)        -2.370     -2.424
px' (in.)       0.461      -0.566
pz (in.)        -0.512     -0.560

13 Distortion Testing
Characterized the lens distortion by taking measurements across the FOV of each imager.
For each imager, the camera was tilted +45° or -45° to isolate one sensor (setting one imager vertical and one horizontal).
The test camera was leveled in the plane of the test by ensuring that the vertical sensor outputs for a beacon at the FOV endpoints have equivalent values (the horizontal imager is then parallel to the optical table, assuming the imagers are orthogonal).

14 Distortion Testing (results figure; not reproduced in this transcript)

15 Calibration Software
The calibration software will take in data previously gathered by the cameras. The cameras will gather data by viewing the calibration rig (a circle of beacons) at many positions and orientations. The software will calibrate the positions and orientations of all of the cameras along with their intrinsic parameters. (It will also calibrate the position of the rig in each frame, but this is of no use to us.)

16 Calibration Software Initialization
y – measurements; x – calibration parameters

Main Function
    Read in Measurements
    Load Typical Intrinsic Parameters
    Assume Initial Rig State is Zero
    Estimate Camera States (Initial Rig State)
    Estimate Rig States (Camera States)
    While Difference in Y > Tolerance
        GLSDC Iteration (Measurements, Calibration Parameters)   [next slide]

17 GLSDC Iteration
y – measurements; x – calibration parameters

GLSDC Iteration (y, x)   [called while Difference in Y > Tolerance]
    New Y = Simulate System (x)
    Difference in Y = y – New Y
    H = Generate H (x, Constraints)
    W = Generate Weight Matrix
    Difference in X = (H^T W H)^-1 H^T W * Difference in Y
    x = x + Difference in X
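The iteration above can be sketched as a short runnable loop (in Python/NumPy rather than the team's Matlab). The camera equations are replaced by a toy measurement model here; simulate_system, generate_H, and the identity weight matrix are illustrative assumptions, not the team's code. The update uses the standard weighted least-squares correction dx = (HᵀWH)⁻¹HᵀW dy.

```python
# Minimal Gauss Least Squares Differential Correction (GLSDC) sketch.
import numpy as np

def simulate_system(x):
    # Toy nonlinear measurement model standing in for the camera equations.
    return np.array([x[0] + x[1], x[0] * x[1], x[0] ** 2])

def generate_H(x):
    # Jacobian of simulate_system with respect to the parameters x.
    return np.array([[1.0, 1.0],
                     [x[1], x[0]],
                     [2.0 * x[0], 0.0]])

def glsdc(y, x, tol=1e-10, max_iter=50):
    W = np.eye(len(y))  # weight matrix (identity in this sketch)
    for _ in range(max_iter):
        dy = y - simulate_system(x)         # Difference in Y
        if np.linalg.norm(dy) < tol:
            break
        H = generate_H(x)
        # Weighted least-squares update: dx = (H^T W H)^-1 H^T W dy
        dx = np.linalg.solve(H.T @ W @ H, H.T @ W @ dy)
        x = x + dx                          # x = x + Difference in X
    return x

# Noiseless measurements of a "true" parameter set are recovered from a
# biased initial guess, as in the slide-10 simulation.
x_true = np.array([2.0, 3.0])
x_est = glsdc(simulate_system(x_true), np.array([1.5, 3.5]))
```

With zero-residual measurements and a reasonable initial guess, this Gauss-Newton style loop converges to the true parameters in a handful of iterations.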

18 Coding Progress
Created Matlab classes for the measurements and the calibration parameters. These handle access to all of the parameters and allow them to be used as vectors.
Example: x is an object of class CalParams.
    x.Vec acts as a vector for use in GLSDC.
    x.IntParams(2,1).F accesses the intrinsic parameter F of sensor 1 in camera 2.
I had not worked with classes in Matlab before, so development has taken longer than initially expected, but we hope that a strong framework will pay off in the future.
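The idea behind the CalParams class can be sketched as follows (in Python rather than the team's Matlab): named per-sensor, per-camera parameters plus a flat vector view that GLSDC can update as the state x. The field F and the (camera, sensor) indexing follow the slide; the internal layout is an assumption.

```python
# Sketch of a calibration-parameter container with a vector view for GLSDC.
import numpy as np

class IntParams:
    def __init__(self, F=0.0):
        self.F = F  # one intrinsic parameter of a single sensor

class CalParams:
    def __init__(self, n_cameras, n_sensors):
        self.int_params = [[IntParams() for _ in range(n_sensors)]
                           for _ in range(n_cameras)]

    @property
    def vec(self):
        # Flattened view of every parameter, for use in GLSDC updates.
        return np.array([s.F for cam in self.int_params for s in cam])

    @vec.setter
    def vec(self, values):
        # Write an updated state vector back into the named fields.
        flat = [s for cam in self.int_params for s in cam]
        for s, v in zip(flat, values):
            s.F = float(v)

x = CalParams(2, 2)
x.int_params[1][0].F = 1.5  # intrinsic F of sensor 1 in camera 2 (0-indexed here)
```

The two views stay in sync, so GLSDC can operate on x.vec while the rest of the code reads parameters by name.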

19 Previous Semester Accomplishments
Became familiar with Matlab and GLSDC.
Researched and purchased computer hardware for quad-rotor off-board computing.
Wrote software for finding an initial estimate of a camera's position and orientation.

20 Tasks for Next Semester
Finish writing the calibration software.
Help write software for determining positions using the data gathered during calibration.
Help write control software for the quad-rotors.

21 Quad-Rotor Structure (figure only)

22 [Quad-rotor geometry diagram]
Dimensions: R = 131.6 mm, L_P = 8.3 mm, L_V = 14.3 mm.
Labels: X and Y axes; angle Φ; rotors; PhaseSpace beacon; Vicon reflector; basic stabilization kit rods; upper layer rods.
Note: use the top sign if the beacon is to the clockwise side.

23 Semester Summary
Designed a T joint, corner piece, center piece, and mounts for PhaseSpace beacons and Vicon reflectors in SolidWorks, and had the parts created using a rapid prototyping machine.
Outfitted 2 quad-rotors with parts, beacons, reflectors, and electrical wiring.
Created files for Vicon and PhaseSpace to allow them to recognize the individual quad-rotors.
Will outfit 3 more quad-rotors next semester.

24 Quad-Rotor Video (not reproduced in this transcript)

25 Schematic (figure only)

26 Current Board Layout (figure only)

27 Semester Summary
Completed schematic
    Learned circuit design basics
    Learned EAGLE
    Learned how to interpret datasheets
Finalized parts list
Began board layout

28 Future Plans
Before end of semester
    Finish board layout
    Order board and parts
Early next semester
    Assemble board
    Test board
    Make necessary changes

29 Next Semester
Assemble the calibration rig and perform comparative testing between Vicon and PhaseSpace using iGPS as truth.
Complete the PhaseSpace calibration code.
Assemble and debug the custom electronics board.
Interface the quad-rotor board with off-board computing and Vicon/PhaseSpace feedback.
Outfit three additional quad-rotors (for a total of 5).
Develop models and control laws for autonomous flight.

30 Questions? Comments?

