1
Odometry Error Modeling
Three methods for modeling random odometry error
2
Two types of odometry errors
Systematic error
– Can be avoided by calibrating the wheel encoders
– Can be accounted for in software
Random error
– Difficult to calibrate or capture in a closed-form formula
– Must be modeled mathematically
– A significant source of confidence-level degradation
3
Systematic error: the UMBmark test
– Have the robot follow a predefined path several times
– Compare the robot's deviations in the CW and CCW directions
– Calculate the weighted Cartesian offsets
– Correct the error in software/hardware
4
Example: UMBmark with a square of side D
The robot covers a square of side D, n times in both the CW and CCW sense. At the end of each lap, the errors in the x and y directions are recorded.
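The per-direction averaging step can be sketched in a few lines; the function name and the (x, y) error-list format are illustrative assumptions, but the centroid-and-radius computation is the standard UMBmark bookkeeping:

```python
import math

def umbmark_offsets(cw_errors, ccw_errors):
    """Average the recorded end-of-lap (x, y) errors of the n CW and
    n CCW square runs into two 'center of gravity' offsets; the larger
    offset radius serves as the systematic-error measure."""
    def centroid(errors):
        xs, ys = zip(*errors)
        return sum(xs) / len(errors), sum(ys) / len(errors)

    cg_cw = centroid(cw_errors)    # mean end-pose offset over the CW laps
    cg_ccw = centroid(ccw_errors)  # mean end-pose offset over the CCW laps
    e_max = max(math.hypot(*cg_cw), math.hypot(*cg_ccw))
    return cg_cw, cg_ccw, e_max
```

Comparing the CW and CCW centroids matters because unequal wheel diameters and wheelbase uncertainty push the two directions' offsets apart in characteristic ways.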
5
Random error modeling 1
This model expresses the covariance matrix in terms of four parameters: E_T, E_R, K_theta, K_rho. The relations are:
– V_theta = K_theta * (translational speed)
– V_rho = K_rho * (translational speed)
– Note: V stands for variance
The other two parameters (E_T, E_R) describe systematic errors and are therefore of no interest here.
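A minimal sketch of how the two random-error parameters drive variance growth along a path; the function name, time step, and speed trace are assumptions, and only the relations V_theta = K_theta * v and V_rho = K_rho * v come from the model:

```python
def accumulate_variances(k_theta, k_rho, speeds, dt):
    """Integrate heading (theta) and range (rho) variances along a
    path sampled every dt seconds, using V_theta = K_theta * v and
    V_rho = K_rho * v, where v is the translational speed."""
    v_theta = v_rho = 0.0
    for v in speeds:
        v_theta += k_theta * abs(v) * dt  # heading variance grows with distance covered
        v_rho += k_rho * abs(v) * dt      # range variance grows with distance covered
    return v_theta, v_rho
```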
6
How to determine the parameters
Perform simple robot movements, such as driving back and forth a certain number of times. Measure several "observables", quantities obtained from the robot's movement, and calculate the four parameters from them. Certain observables give a better estimate of the parameters; the paper lists the best observable for estimating each model parameter.
7
Model 2
Any robot path can be decomposed into three primitive movements: circular motion, turning on the spot, and straight motion. This model finds a closed-form formula for error propagation during circular motion. Using it, the robot updates its odometry covariance matrix only when it changes the type of motion, giving faster computation that is independent of the sequence of actions.
8
Model 3
This model takes into account variation in the wheel diameters and the wheel separation. A model that incorporates the random wheel variation into the covariance matrix was derived. The result shows that the final error propagation is simply the sum of the covariance propagation with no wheel variation and a new error term. The new term updates the wheel-variation covariance matrix (Q).
9
Result of model 3
10
Odometry Error Correction
Nonsystematic Error Correction
– OmniMate Design
– Sensor Fusion
Systematic Odometry Error Correction
– Systematic Error Compensation
11
Systematic and Nonsystematic Errors
Systematic errors: errors in the design
– Unequal wheel diameters
– Misalignment of the wheels
– Limited sampling resolution
– Uncertainty about the effective wheelbase
Nonsystematic errors: errors due to the environment
– Travel over an uneven floor
– Unexpected objects on the floor
– Wheel slippage
12
Nonsystematic Error Correction: OmniMate Design
– Freedom mobile platform with omnidirectional capabilities
– Two differential-drive TRC LabMate platforms
– The front and rear platforms can rotate around points A and B
13
OmniMate Design (continued)
14
OmniMate (continued)
IPEC: Internal Position Error Correction (the OmniMate control system)
Automatically detects and corrects nonsystematic odometry errors
15
Nonsystematic Error Correction: Sensor Fusion
Fuse data from the gyroscope and odometry systems over long-distance travel
– If |dataG - dataE| > a predefined threshold, dataG (gyro) is used to compute the orientation angle; otherwise dataE is used
– The threshold is adapted to different environments
– Kalman filters reduce noise and measure angular position
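The threshold rule above can be sketched directly; the function and parameter names are illustrative stand-ins for the slide's dataG (gyro) and dataE (encoder) orientation estimates:

```python
def fused_heading(data_g, data_e, threshold):
    """If the gyro and encoder headings disagree by more than the
    preset threshold, wheel slip is suspected and the gyro value is
    used; otherwise the encoder value is kept."""
    if abs(data_g - data_e) > threshold:
        return data_g  # large disagreement: trust the gyro
    return data_e      # small disagreement: trust the encoder
```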
16
Systematic Error Correction
Wheel-diameter error, which affects straight-line motion: Ed = DR/DL
Wheelbase error, which appears when turning: Eb = b_actual/b_nominal, expressed as a function of the nominal value
17
Systematic Odometry Error Compensation
R = (L/2)/sin(beta/2)
Ed = (R + b/2)/(R - b/2) = DR/DL
Da = (DR + DL)/2
DL = 2Da/(Ed + 1)
DR = 2Da/((1/Ed) + 1)
Correction factors:
– Cl = 2/(Ed + 1)
– Cr = 2/((1/Ed) + 1)
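The correction factors transcribe directly into code (the function name is an assumption; the formulas are the slide's):

```python
def correction_factors(e_d):
    """Per-wheel correction factors from the wheel-diameter ratio
    Ed = DR/DL: Cl scales the left wheel, Cr the right, so the
    corrected effective diameters agree with the average Da."""
    c_l = 2.0 / (e_d + 1.0)
    c_r = 2.0 / ((1.0 / e_d) + 1.0)
    return c_l, c_r
```

With perfectly matched wheels (Ed = 1) both factors are exactly 1, i.e. no correction is applied.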
18
Simultaneous Location and Mapping (SLAM)
19
Purpose
Replace the need to manually create and update maps
– A map becomes invalid if an object begins moving or the area is rearranged
– Allows the robot to be truly autonomous
20
Primary Goal
Starting from an arbitrary point, the robot should be able to:
– autonomously explore the local environment using its on-board sensors
– analyze the sensor data and map the area
– determine its actual location on the self-constructed map
21
Popular Implementations
– Visual SLAM based on the extended Kalman filter
– Graph-based SLAM
– Particle-filter SLAM
22
Visual SLAM based on the extended Kalman filter
Combines the extended Kalman filter state with the robot pose and feature locations. The state is updated on each iteration, which leads to high computation costs when there are many features. Implementations have progressively allowed maps to be broken down into more manageable datasets.
23
Graph-based SLAM
Constraints between the robot and features are treated as "soft" constraints. Creates a graph of robot locations and features, and solves the SLAM problem by finding the configuration of least required energy on the graph.
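A toy 1-D version of the energy-minimization idea, using plain gradient descent over soft constraints; the function, constraint format, and step size are all assumptions (production systems use sparse least-squares solvers such as g2o or GTSAM), but it shows "least energy = best-fitting poses":

```python
def solve_pose_graph(constraints, n_poses, iters=2000, lr=0.05):
    """Tiny 1-D pose graph: each constraint (i, j, d) says pose j
    should sit d ahead of pose i.  The 'energy' is the sum of squared
    residuals, minimized by gradient descent; pose 0 is anchored."""
    x = [0.0] * n_poses
    for _ in range(iters):
        grad = [0.0] * n_poses
        for i, j, d in constraints:
            r = (x[j] - x[i]) - d   # how badly this soft constraint is violated
            grad[j] += 2 * r
            grad[i] -= 2 * r
        for k in range(1, n_poses):  # keep x[0] fixed to anchor the map
            x[k] -= lr * grad[k]
    return x
```

With an odometry chain x0->x1->x2 of step 1.0 and a conflicting loop-closure constraint x0->x2 of 2.1, the solver spreads the 0.1 discrepancy across both edges instead of blaming one.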
24
Particle-filter SLAM
Represents the certainty of an object's location with samples instead of a Gaussian probability. Good for nonlinear transformations and any type of non-Gaussian distribution.
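The sample-based representation can be illustrated with a resampling step (a sketch of the general particle-filter mechanism, not a full FastSLAM implementation; all names are assumptions):

```python
import random

def resample(particles, weights):
    """Draw a new particle set in proportion to the weights: the
    filter's belief lives in these samples rather than in a Gaussian,
    so arbitrary non-Gaussian shapes survive the update."""
    total = sum(weights)
    probs = [w / total for w in weights]
    return random.choices(particles, weights=probs, k=len(particles))
```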
25
Handling Errors
Any error in one iteration continues to grow as more iterations are performed. A common approach to correcting this error is to return to a feature whose location and characteristics are well known. This minimizes uncertainty across the rest of the map as well.
26
Room for Improvement
Dynamic objects pose a large problem for SLAM
– The robot quickly finds itself working within an invalid map
– One attempted solution is to treat a dynamic object differently from a static feature on the map (as an outlier)
– Any successful method for detecting moving objects and their destinations would drastically improve the SLAM process
27
Room for Improvement
SLAM is very sensitive to errors. Error is especially detrimental when the robot attempts to minimize uncertainty by returning to a known object. Commonly used laser rangefinders are prone to missing some of a feature's details
– An on-board camera is useful for capturing features
– e.g., SIFT features
28
Q&A