ROBOT LOCALISATION & MAPPING: MAPPING & LIDAR
By James Mead
Contents
- Project Overview
- Project Breakdown
- Simultaneous Localisation & Mapping (SLAM)
- Hardware
- Mapping
- Conclusion
- Questions
Project Overview
- Environmental positioning: humans have it easy!
- The robot relies on sensors:
  - GPS
  - Wheel encoders
  - Accelerometer
  - Gyroscope
  - Infrared ranging sensor
  - Ultrasonic sensor
- Fuse the data in software to create usable information about the robot's environment (a fusion sketch follows below)
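To illustrate what "fusing the data" can look like, here is a minimal sketch of a complementary filter blending gyroscope and accelerometer readings into one pitch estimate. This is one common fusion technique, not necessarily the one used in the project; the sample values and the alpha blend factor are illustrative assumptions.

```python
import math

def complementary_filter(pitch_prev, gyro_rate, accel_pitch, dt, alpha=0.98):
    """Fuse gyroscope and accelerometer readings into one pitch estimate.

    The gyro integrates smoothly but drifts over time; the accelerometer
    is noisy but drift-free. Blending the two gives a usable estimate.
    """
    gyro_pitch = pitch_prev + gyro_rate * dt        # integrate angular rate
    return alpha * gyro_pitch + (1 - alpha) * accel_pitch

# Example: one 100 Hz update with hypothetical IMU samples
pitch = 0.0
dt = 0.01
gyro_rate = 0.05                      # rad/s reported by the gyroscope
accel_pitch = math.atan2(0.1, 9.81)   # pitch implied by the accelerometer
pitch = complementary_filter(pitch, gyro_rate, accel_pitch, dt)
```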
Project Breakdown
- Divided into 3 major areas:
  - Mapping & LIDAR – myself
  - Kinect & data analysis – Scott Penley
  - Navigation & path planning – Ken Birbeck
- Work together to combine all 3 aspects into one overall project.
- Goal: create a robot capable of Simultaneous Localisation and Mapping (SLAM).
What is SLAM?
- Two aspects: the robot's current position and the positions of objects in the environment.
- A chicken-or-egg problem: an accurate map needs a known position, and an accurate position needs a known map. Which comes first?
- My task: overcome this through the use of sensors and mathematical calculations.
The Hardware
- Lynx robot
- IMU (inertial measurement unit): used for determining pitch, roll, yaw and acceleration
- Xbox Kinect
- Fit-PC
- Wheel encoders (an odometry sketch follows below)
- LIDAR (Light Detection and Ranging; a laser rangefinder): returns highly accurate distance data about the environment
- SLAM needs highly accurate sensors, so no GPS or digital compass is used
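The wheel encoders feed the pose estimate (x, y, θ) that the mapping stage depends on. Below is a minimal dead-reckoning sketch, assuming a differential-drive base; the wheel radius, wheel base, and tick resolution are illustrative values, not the Lynx robot's actual dimensions.

```python
import math

def odometry_update(x, y, theta, ticks_left, ticks_right,
                    ticks_per_rev=360, wheel_radius=0.05, wheel_base=0.30):
    """Dead-reckon a differential-drive pose from encoder tick deltas.

    Geometry values are illustrative assumptions. Distances in metres,
    angles in radians.
    """
    metres_per_tick = 2 * math.pi * wheel_radius / ticks_per_rev
    d_left = ticks_left * metres_per_tick
    d_right = ticks_right * metres_per_tick
    d_centre = (d_left + d_right) / 2           # distance moved by midpoint
    d_theta = (d_right - d_left) / wheel_base   # change in heading

    # Advance the pose along the average heading over the interval
    x += d_centre * math.cos(theta + d_theta / 2)
    y += d_centre * math.sin(theta + d_theta / 2)
    theta = (theta + d_theta) % (2 * math.pi)
    return x, y, theta
```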
A Suitable LIDAR
Selection criteria:
- Range
- Speed
- Accuracy (error tolerance)
- Sweep angle
- Weight
- Dimensions
- Power consumption & laser source
- Price!
Chosen sensor: Hokuyo URG-04LX
Mapping
- Two approaches: metric mapping & topological mapping
- Occupancy grid mapping:
  - Map divided up into cells
  - Robot's position needs to be known accurately (x, y, θ)
  - Object locations converted from polar to Cartesian coordinates
  - Apply filters to reduce noise & correct errors
- A grid-update sketch follows below
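A minimal sketch of the polar-to-Cartesian step and a single occupancy grid update, assuming the robot pose (x, y, θ) is already known. The cell size, grid dimensions, and the simple occupied/unknown values are illustrative assumptions; a full implementation would typically accumulate log-odds evidence per cell instead of overwriting.

```python
import math

CELL_SIZE = 0.05       # metres per grid cell (illustrative)
GRID_SIZE = 200        # 200 x 200 cells = a 10 m x 10 m map
grid = [[0.5] * GRID_SIZE for _ in range(GRID_SIZE)]  # 0.5 = unknown

def mark_scan_point(x, y, theta, beam_angle, beam_range):
    """Convert one LIDAR return (polar, robot frame) to a world-frame
    Cartesian point and mark its grid cell as occupied."""
    # Polar -> Cartesian in the world frame, using the known robot pose
    wx = x + beam_range * math.cos(theta + beam_angle)
    wy = y + beam_range * math.sin(theta + beam_angle)

    # World coordinates -> grid indices (grid origin at world origin)
    col = int(wx / CELL_SIZE)
    row = int(wy / CELL_SIZE)
    if 0 <= row < GRID_SIZE and 0 <= col < GRID_SIZE:
        grid[row][col] = 1.0   # mark occupied

# Example: one beam at +30 degrees returning 1.2 m, robot at the map centre
mark_scan_point(5.0, 5.0, 0.0, math.radians(30), 1.2)
```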
Conclusion
- Familiarise with the hardware
- Fully understand the LIDAR
- Develop a method for mapping:
  - Metric or topological mapping
  - Incorporate the use of filters
  - Plot the data in software
- Work as a team! This project will not work without Scott or Ken.

Any Questions?