1
Sensor Fusion on TerraMax
Dr. Zhiyu Xiang, Andy Chien, Prof. Umit Ozguner
Feb. 17th, 2004, Tuesday
Lecture on EE753.02
2
System Overview — block diagram of the vehicle architecture. Low-level sensing: cameras feeding a mono-vision computer (Linux) and a stereo-vision computer (Linux); LADARs, radars, and short-distance sensors feeding a sensor and sensor-fusion computer (Linux); GPS, compass, INS, and internal sensors. High-level control: a map and high-level path-planning computer, with alarm monitoring and heartbeat, external switches, and E-Stop. Low-level control (QNX): brake actuators, throttle control, steering motor, and shifting.
3
High Level Sensor System Overview — block diagram: DGPS, INS, and compass feed a position-fusion module; two laser radars (LADARs), radar, mono vision, stereo vision, and sonar, together with the position-fusion output, feed the sensor-fusion module.
4
Why Sensor Fusion?
- Different sensors perceive the environment differently;
- Different sensors have different fields of view;
- Even with sensors of the same type, we can:
  - enlarge the overall field of view by using more sensors;
  - accumulate information acquired at different times to achieve better perception.
5
GPS GPS Receiver and Antenna
6
The information available from GPS:
1. Position in geodetic coordinates (latitude, longitude, altitude);
2. Rate information (horizontal speed, orientation to true north);
3. The GPS precise time.
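As a concrete illustration of items 1 and 3, the minimal sketch below extracts position and time from a standard NMEA GGA sentence. The sample sentence is the usual textbook example, not TerraMax data, and the actual receiver may also emit proprietary logs with a different layout.

```python
# Minimal sketch: extracting position, altitude, and time from a NMEA GGA
# sentence. No checksum validation; for illustration only.

def parse_gga(sentence: str):
    fields = sentence.split(",")
    utc = fields[1]                                   # hhmmss.ss, GPS time of fix
    lat = float(fields[2][:2]) + float(fields[2][2:]) / 60.0
    if fields[3] == "S":
        lat = -lat
    lon = float(fields[4][:3]) + float(fields[4][3:]) / 60.0
    if fields[5] == "W":
        lon = -lon
    alt = float(fields[9])                            # altitude above MSL, metres
    return utc, lat, lon, alt

if __name__ == "__main__":
    gga = "$GPGGA,123519,4807.038,N,01131.000,E,1,08,0.9,545.4,M,46.9,M,,*47"
    print(parse_gga(gga))   # ('123519', 48.1173, 11.5167, 545.4)
```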
7
GPS - Advantages:
- Satellite-based radio navigation system;
- Provides information to users of GPS receivers worldwide, in all weather conditions, free of charge;
- Reduces overall system cost by eliminating the need for a separate base station to obtain decimeter-level accuracy;
- Protected against shock, water, and dust, extending the life of the receiver;
- Virtually eliminates the effects of multipath using NovAtel's patented Pulse Aperture Correlator (PAC) tracking technology.
8
GPS - Features:
- Accepts OmniSTAR L-band differential corrections (subscription required);
- Shock, water, and dust resistant;
- Three RS-232 serial ports capable of rates up to 230,400 bps;
- Power and communication status LED indicators;
- Field-upgradeable firmware.
9
INS System — outputs:
- Dynamic roll and pitch (body-to-Earth frame angles);
- 3-axis vehicle body rates;
- 3-axis vehicle body or Earth-frame accelerations.
10
INS - Features:
- Fiber-optic gyro stability < 20°/hr;
- Fully compensated angular rate and linear acceleration outputs;
- SAE (Earth coordinate) navigation frame;
- Automotive-compatible 10-30 VDC input supply;
- Analog & digital outputs.
11
One Example Application of INS: Inertial systems are frequently used in actively stabilized platforms. "Actively stabilized" means that a series of motors and gimbals, in conjunction with an inertial sensor, work actively to hold the platform stationary. Active stabilization systems are typically used to point cameras or antennae on a moving plane, helicopter, ship, train, or even RV. There are also cases where cameras permanently attached to the ground are stabilized against wind and vibrational forces.
12
Compass: The HMR3000 Digital Compass Module is a three-axis compass featuring 0.5 degree accuracy, a fluidic tilt sensor for +/- 45 degree tilt compensation, and a digital serial bus interface (RS-485 or RS-232 options).
13
Why Fuse GPS/INS/Compass? (I) Accuracy. Because the INS integrates its measurements, even the smallest measurement error leads to unbounded growth of the navigation error. This gives rise to the need for external aiding sources that periodically correct the errors. GPS can do that, since its measurement error is bounded.
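The sketch below illustrates the unbounded-growth argument numerically: a small constant accelerometer bias, integrated twice, produces a position error that grows quadratically with time. The bias value is an assumption chosen only for illustration.

```python
# Illustrative sketch (not TerraMax code): why a pure INS solution drifts
# without external corrections.

dt = 0.01          # 100 Hz INS update interval
bias = 0.01        # assumed constant accelerometer bias, m/s^2
v = 0.0            # velocity error, m/s
x = 0.0            # position error, m

for _ in range(int(60.0 / dt)):      # one minute of dead reckoning
    v += bias * dt                   # velocity error grows linearly
    x += v * dt                      # position error grows quadratically

print(f"After 60 s: velocity error = {v:.2f} m/s, position error = {x:.1f} m")
# ~0.60 m/s and ~18 m -- an error that bounded GPS fixes can correct.
```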
14
Why Fuse GPS/INS/Compass? (II) Data Output Rate. The data output rate of GPS is 10 Hz at most, which is insufficient for positioning a vehicle under autonomous control. In contrast, the output rate of the INS is much higher: more than 100 Hz on the digital output and effectively unlimited on the analog output. Integrating the two therefore satisfies the data output rate requirement.
15
Why Fuse GPS/INS/Compass? (III) Data Availability. GPS is a line-of-sight radio navigation system, so its measurements are subject to signal outages, interference, and jamming. The INS, by contrast, is a self-contained, non-jammable system that is completely independent of the surrounding environment and hence virtually immune to external disturbances. The INS can therefore continue to provide navigation information when GPS experiences short-term signal loss.
16
Why Fuse GPS/INS/Compass? (IV) The compass provides yaw, pitch, and roll information continuously and independently of the other sensors. Although its data output rate is below 20 Hz, it can periodically correct the yaw obtained by integrating the INS yaw rate (a simple sketch of such a correction follows below).
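One simple way to realize this correction (not necessarily the one used on TerraMax) is a complementary filter: integrate the high-rate INS yaw rate and nudge the result toward the compass heading whenever a compass sample arrives. The gain and rates below are assumptions.

```python
# Hedged sketch: blending INS yaw-rate integration (high rate) with the
# compass heading (<= 20 Hz). Gain and timing values are illustrative.

import math

def fuse_yaw(yaw, yaw_rate, dt, compass_yaw=None, gain=0.02):
    """Propagate yaw with the INS rate; nudge toward the compass when available."""
    yaw += yaw_rate * dt                       # high-rate INS integration
    if compass_yaw is not None:                # low-rate absolute correction
        err = math.atan2(math.sin(compass_yaw - yaw),
                         math.cos(compass_yaw - yaw))
        yaw += gain * err                      # bleed off accumulated drift
    return yaw
```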
17
Fusion Algorithm of GPS/INS/Compass — block diagram: the DGPS antenna and receiver (RS-232), the INS (X and Z accelerometers; yaw, pitch, and roll rates), and the compass (yaw, pitch, roll) all feed, through a hardware interface, an Extended Kalman Filter algorithm running on the PC. The GPS output contributes position (X, Y, Z), speed, and yaw. The filter estimates the vehicle status: position (X, Y, Z); speed; acceleration; yaw, pitch, and roll; and the rates of yaw, pitch, and roll.
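The Extended Kalman Filter block is the core of this fusion. Below is a heavily simplified, linear sketch of the idea only (planar position and velocity, INS acceleration as the control input, GPS position as the measurement), not the TerraMax filter itself; all noise values are assumptions.

```python
import numpy as np

dt = 0.01                                          # 100 Hz INS prediction step

F = np.array([[1, 0, dt, 0],
              [0, 1, 0, dt],
              [0, 0, 1, 0],
              [0, 0, 0, 1]], dtype=float)          # constant-velocity model
B = np.array([[0.5 * dt**2, 0],
              [0, 0.5 * dt**2],
              [dt, 0],
              [0, dt]])                            # INS acceleration as input
H = np.array([[1., 0, 0, 0],
              [0, 1., 0, 0]])                      # GPS measures position only
Q = np.eye(4) * 1e-4                               # process noise (assumed)
R = np.eye(2) * 0.25                               # GPS noise, ~0.5 m (assumed)

def predict(x, P, accel_xy):
    """INS-driven prediction, run at the high INS rate."""
    x = F @ x + B @ accel_xy
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, gps_xy):
    """GPS correction, run whenever a (10 Hz) fix arrives."""
    y = gps_xy - H @ x                             # innovation
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)                 # Kalman gain
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P

x, P = np.zeros(4), np.eye(4)                      # state: [x, y, vx, vy]
x, P = predict(x, P, np.array([0.2, 0.0]))         # illustrative INS sample
x, P = update(x, P, np.array([0.05, -0.02]))       # illustrative GPS fix
```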
18
Vision System: Road and Free Space Finding by Mono-Vision
19
LADAR System SICK LMS30206 Outdoor Version
20
Performance of LADAR:
1. Angular resolution: 1° / 0.5° / 0.25°
2. Response time: 13 / 26 / 53 ms
3. Resolution: 10 mm
4. Systematic error (mm mode): 35 mm
5. Statistical error (1 sigma): 10 mm
6. Max. distance: 80 m
7. Transfer rate: 9.6 / 19.2 / 38.4 / 500 kBaud
21
Obstacle Detection by LADAR: the LADAR reports the distance between each detected obstacle and the center of the LADAR (see the sketch below).
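As an illustration of how such obstacle distances can be used, the sketch below converts one scan of range returns into Cartesian points in the sensor frame and keeps returns closer than a threshold. The 180° field of view, 0.5° step, and 20 m threshold are assumptions; the 80 m cutoff is the maximum range from the performance list above.

```python
import math

def scan_to_obstacles(ranges_m, start_deg=-90.0, step_deg=0.5, max_dist=20.0):
    """Return (x, y, range) for returns within max_dist of the LADAR centre."""
    obstacles = []
    for i, r in enumerate(ranges_m):
        if r <= 0.0 or r > 80.0:            # discard invalid or out-of-range hits
            continue
        angle = math.radians(start_deg + i * step_deg)
        x, y = r * math.cos(angle), r * math.sin(angle)
        if r < max_dist:
            obstacles.append((x, y, r))     # distance from the LADAR centre
    return obstacles
```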
22
Some Scenarios for Vertical Scanning Laser (I) — diagram of a vertically scanning beam (labels: L, P, h, W). The minimum ditch width and depth that must be detected are determined by the wheel radius and the speed of the vehicle. The higher the LADAR is mounted, the farther ahead a ditch can be detected (see the geometric sketch below).
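A hedged geometric sketch of the idea: a beam with a known depression angle from a LADAR mounted at height h should hit flat ground at range h / sin(angle); a measured range noticeably beyond that suggests the beam passed into a ditch. The mounting height and tolerance are assumptions, not TerraMax parameters.

```python
import math

def ditch_suspected(measured_range_m, depression_deg, h=2.0, tol=0.5):
    """Flag a possible negative obstacle when a beam overshoots flat ground."""
    expected = h / math.sin(math.radians(depression_deg))   # flat-ground range
    return measured_range_m > expected + tol

# A larger mounting height h gives a longer flat-ground range for the same
# depression angle, which is why a higher LADAR can detect ditches farther ahead.
```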
23
Some Scenarios for Vertical Scanning Laser (II) — diagram only (mounting height h and look-ahead distance L).
24
Radar System: provides information about objects in the lane up to 350 feet ahead. Advanced forward-looking Doppler radar (24.725 GHz) providing distance and relative speed. Operates effectively night or day, in rain, fog, dust, or snow.
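For reference, the sketch below shows the Doppler relationship such a radar relies on to obtain relative speed from the frequency shift at the 24.725 GHz carrier. The shift value in the example is made up for illustration; the actual sensor's processing is not described on the slide.

```python
C = 299_792_458.0        # speed of light, m/s
F0 = 24.725e9            # radar carrier frequency, Hz

def relative_speed(doppler_shift_hz):
    """Relative speed (m/s, positive = closing) from the Doppler shift."""
    return doppler_shift_hz * C / (2.0 * F0)

print(relative_speed(1650.0))   # ~10 m/s closing speed
```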
25
Ultrasonic Sensors: for short-range obstacle detection (less than 5 meters). Accuracy is affected by temperature, moisture, etc. (see the sketch below).
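A small sketch of the temperature sensitivity mentioned above: sonar range comes from time of flight, and the speed of sound varies with air temperature (roughly c = 331.3 + 0.606·T m/s). The time-of-flight and temperature values are illustrative.

```python
def sonar_range(time_of_flight_s, temp_c=20.0):
    """Range from an ultrasonic echo, correcting the speed of sound for temperature."""
    c = 331.3 + 0.606 * temp_c          # speed of sound, m/s
    return c * time_of_flight_s / 2.0   # halve for the round trip

print(sonar_range(0.0232, temp_c=0.0))   # ~3.84 m on a cold day
print(sonar_range(0.0232, temp_c=35.0))  # ~4.09 m on a hot day, ~6% difference
```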
26
What kinds of conflicts may happen between sensors?
- In the data layer: for the same object in the environment, the positions reported by different sensors may differ (e.g., one sensor reports a range of 30 meters while another reports 28 meters);
- In the decision layer: the decisions derived from the observations may contradict each other (e.g., "no obstacle" vs. "obstacle").
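One common way to reconcile the data-layer conflict in the example above (30 m vs. 28 m) is to weight each measurement by the inverse of its variance; the variances below are assumptions for illustration, not characteristics of the actual sensors.

```python
def fuse_ranges(r1, var1, r2, var2):
    """Inverse-variance weighted fusion of two range measurements."""
    w1, w2 = 1.0 / var1, 1.0 / var2
    fused = (w1 * r1 + w2 * r2) / (w1 + w2)
    fused_var = 1.0 / (w1 + w2)
    return fused, fused_var

print(fuse_ranges(30.0, 0.25, 28.0, 1.0))   # -> (29.6 m, 0.2 m^2)
```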
27
How to resolve conflicts between sensors? (I) How do conflicts arise?
- Different perceptive characteristics of the sensors (range, accuracy, field of view, imaging sensor vs. range sensor, etc.);
- Changes in the surrounding environment;
- False data input;
- Thresholds in the processing algorithms;
- Different algorithms used.
28
How to resolve conflicts between sensors? (II) Measures to deal with the conflicts:
- For the data layer: use target-tracking techniques (Extended Kalman Filter);
- For the decision layer: assign a confidence to each decision made by the preprocessing of each sensor, then deduce the final decision with a deliberately designed deducing table (evidence-theory-based deduction, sketched generically below).
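The deducing table itself follows on a later slide; as a generic illustration only, here is Dempster's rule of combination for two sensors assigning belief mass over a tiny frame {obstacle, no obstacle}, with 'UNK' standing for total ignorance. The mass values are made up and are not TerraMax confidences.

```python
def combine(m1, m2):
    """Dempster's rule over the frame {'OB', 'NO'}, with 'UNK' = full ignorance."""
    def overlap(a, b):
        if a == 'UNK':
            return b
        if b == 'UNK':
            return a
        return a if a == b else None          # None marks a conflicting pair
    combined = {'OB': 0.0, 'NO': 0.0, 'UNK': 0.0}
    conflict = 0.0
    for a, wa in m1.items():
        for b, wb in m2.items():
            inter = overlap(a, b)
            if inter is None:
                conflict += wa * wb
            else:
                combined[inter] += wa * wb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

laser  = {'OB': 0.7, 'NO': 0.1, 'UNK': 0.2}   # illustrative masses
vision = {'OB': 0.5, 'NO': 0.3, 'UNK': 0.2}
print(combine(laser, vision))                 # obstacle belief strengthened
```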
29
Map for High Level Sensor Fusion — local map diagram: a grid covering -50 m to 50 m in East and North around the vehicle, built from a vision map and a laser map, with each cell carrying an occupancy confidence (high confidence of occupancy, low confidence of occupancy, high confidence of empty, unknown area). Types of the cells: ROD, COV, POB, NOB, MOB, UKN.
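A minimal sketch of such a vehicle-centred fusion map: a 100 m x 100 m grid where each cell stores a type and an occupancy confidence. The 0.5 m cell size and the single-scalar confidence are assumptions, and the expansions of the type abbreviations (other than UKN) are our reading of the slides, not definitions given in them.

```python
from enum import IntEnum
import numpy as np

class CellType(IntEnum):
    ROD = 0   # road
    COV = 1   # covered/traversable area (expansion assumed)
    POB = 2   # positive obstacle (expansion assumed)
    NOB = 3   # negative obstacle, e.g. a ditch (expansion assumed)
    MOB = 4   # moving obstacle (expansion assumed)
    UKN = 5   # unknown

CELL = 0.5                                   # metres per cell (assumed)
N = int(100 / CELL)                          # grid spans -50 m .. +50 m

cell_type = np.full((N, N), CellType.UKN, dtype=np.uint8)
confidence = np.zeros((N, N))                # simplified: 0 = empty .. 1 = occupied

def to_cell(east_m, north_m):
    """Convert vehicle-relative East/North coordinates to (row, col) indices."""
    return int((north_m + 50.0) / CELL), int((east_m + 50.0) / CELL)
```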
30
Algorithm for Fusion Map Updating:
1. Initialize the map;
2. Get the mono-vision information at the current time step;
3. Move the fusion map according to the GPS displacement between the previous and current time steps;
4. Discard cells shifted outside the map and give initial values to newly shifted-in cells (a sketch of steps 3-4 follows below);
5. Broadcast each confidence value to neighboring cells according to the model of position errors (Gaussian noise);
6. Get new observations from the stereo-vision, LADAR, radar, and sonar modules;
7. Transform the coordinates of the different sensor modules into sensor-map coordinates using the multi-sensor calibration parameters;
8. Fuse the sensor map into the fusion map using Dempster-Shafer evidence theory.
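A hedged sketch of steps 3 and 4 (map movement, discarding shifted-out cells, and initializing shifted-in cells as unknown), assuming the vehicle-centred grid layout sketched on the previous slide:

```python
import numpy as np

UKN = 5                                          # unknown cell value, as above

def shift_map(grid, d_east_m, d_north_m, cell_size=0.5):
    """Shift a vehicle-centred grid opposite to the vehicle's displacement."""
    d_col = int(round(d_east_m / cell_size))
    d_row = int(round(d_north_m / cell_size))
    rows, cols = grid.shape
    shifted = np.full_like(grid, UKN)            # new cells start as unknown
    if abs(d_row) >= rows or abs(d_col) >= cols:
        return shifted                           # moved more than a map width
    src = grid[max(d_row, 0):rows + min(d_row, 0),
               max(d_col, 0):cols + min(d_col, 0)]
    shifted[max(-d_row, 0):rows + min(-d_row, 0),
            max(-d_col, 0):cols + min(-d_col, 0)] = src
    return shifted
```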
31
Deducing Table (I) — how the cell type predicted in the fusion map (rows) is combined with the cell type reported in the sensor map (columns), with conditions (a)-(h) and result codes (1)-(5):

Fusion map ROD:
- Sensor ROD → ROD (1)
- Sensor COV → COV if (a), ROD if (b) (1)
- Sensor POB → POB if (g), results (3); ROD if (h), results (5)
- Sensor NOB → NOB if (g), results (3); ROD if (h), results (5)
- Sensor MOB → MOB (3)
- Sensor UKN → ROD (2)

Fusion map COV:
- Sensor ROD → ROD if (a), COV if (b) (1)
- Sensor COV → COV (1)
- Sensor POB → POB if (g), results (3); ROD if (h), results (5)
- Sensor NOB → NOB if (g), results (3); COV if (h), results (5)
- Sensor MOB → MOB (3)
- Sensor UKN → COV (2)

Fusion map POB:
- Sensor ROD → ROD if (e), results (3); POB if (f), results (4)
- Sensor COV → COV if (e), results (3); POB if (f), results (4)
- Sensor POB → POB (1)
- Sensor NOB → NOB if (c), POB if (d) (1)
- Sensor MOB → MOB (3)
- Sensor UKN → POB (2)

Fusion map NOB:
- Sensor ROD → ROD if (e), results (3); NOB if (f), results (4)
- Sensor COV → COV if (e), results (3); NOB if (f), results (4)
- Sensor POB → POB if (c), NOB if (d) (1)
- Sensor NOB → NOB (1)
- Sensor MOB → MOB (3)
- Sensor UKN → NOB (2)

Fusion map MOB: no prediction exists in the fusion map; replaced by UKN.

Fusion map UKN:
- Sensor ROD → ROD (3); COV → COV (3); POB → POB (3); NOB → NOB (3); MOB → MOB (3); UKN → UKN (3)
32
Deducing Table (II)
33
Examples of Sensor Fusion Results (I)
34
Examples of Sensor Fusion Results (II)
35
Examples of Sensor Fusion Results (III)
36
Summary: Through sensor fusion, the complementary information from different sensors is fully combined; information acquired at different times is accumulated; and the sensors are integrated so that the best decision can be made from the combined information.