Autonomous Mobile Robots, Chapter 4: Perception
R. Siegwart and I. Nourbakhsh, with Skubic augmentations
Slide 1: Perception
Topics: sensors, uncertainty, features.
[Diagram: the perception-action loop. Perception of the real world environment feeds an environment model and local map to localization; localization estimates "position" against a global map; cognition plans a path; motion control acts on the real world.]
Slide 2: Example: HelpMate, Transition Research Corp. (4.1)
Slide 3: Example: B21, Real World Interface (4.1)
Slide 4: Example: Robart II, H.R. Everett (4.1)
Slide 5: Savannah River Site Nuclear Surveillance Robot (4.1)
Slide 6: BibaBot, BlueBotics SA, Switzerland (4.1)
Sensors: pan-tilt camera, omnidirectional camera, IMU (inertial measurement unit), sonar sensors, laser range scanner, bumper, emergency stop button, wheel encoders.
Slide 7: Our new robot: Killian (under development)
- Gripper with sensors: IR rangefinders, strain gauge
- Top sonar ring and bottom sonar ring
- Laser range-finder
- Stereo vision
- Laptop brain
Slide 8: General Classification (Table 4.1) (4.1.1)
Slide 9: General Classification (Table 4.1, cont.) (4.1.1)
Slide 10: Sensor Terminology
- Sensitivity
- Dynamic range
- Resolution
- Bandwidth
- Linearity
- Error, accuracy, precision
- Systematic errors vs. random errors
Slide 11: Active Ranging Sensors: Ultrasonic Sensor (4.1.6)
- Transmit a packet of (ultrasonic) pressure waves.
- The distance d of the echoing object is computed from the propagation speed of sound c and the round-trip time of flight t: d = c·t / 2.
- The speed of sound in air (about 340 m/s) is given by c = √(γ·R·T), where γ is the ratio of specific heats, R is the gas constant, and T is the temperature in kelvin.
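A minimal numeric sketch of the two formulas above, in C. It assumes the specific gas constant for air (R ≈ 287 J/(kg·K)) rather than the universal gas constant divided by molar mass; the function names are illustrative.

```c
#include <math.h>

/* Speed of sound from c = sqrt(gamma * R * T). Assumes air:
   gamma = 1.4 (ratio of specific heats) and the *specific* gas
   constant R = 287 J/(kg*K); T is in kelvin. */
double speed_of_sound(double temp_kelvin)
{
    const double gamma_air = 1.4;
    const double r_air = 287.0;
    return sqrt(gamma_air * r_air * temp_kelvin);
}

/* Range from a round-trip time of flight t (seconds): the pulse
   travels out and back, so d = c * t / 2. */
double sonar_range(double tof_seconds, double temp_kelvin)
{
    return speed_of_sound(temp_kelvin) * tof_seconds / 2.0;
}
```

At T = 293 K this gives c ≈ 343 m/s, consistent with the 340 m/s quoted above.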
Slide 12: Ultrasonic Sensor (time of flight, sound) (4.1.6)
[Signal chain: transmitted wave packet, analog echo signal, threshold, digital echo signal, integrator, integrated time as the output signal. The time of flight (sensor output) is the time from transmission until the echo first exceeds the threshold.]
Effective range: typically 12 cm to 5 m.
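A loose digital analogue of that chain, assuming a sampled, rectified echo signal; the names and fixed threshold are illustrative, not a specific driver API.

```c
#include <math.h>

/* Return the time (seconds after transmission) at which the echo
   first crosses the detection threshold, or -1.0 if no echo is
   seen within the listening window. */
double echo_time_of_flight(const double *samples, int n,
                           double threshold, double sample_rate_hz)
{
    for (int i = 0; i < n; i++) {
        if (fabs(samples[i]) >= threshold)
            return (double)i / sample_rate_hz;
    }
    return -1.0;
}
```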
Slide 13: Ultrasonic Sensor (time of flight, sound) (4.1.6)
- Typical frequency: 40-180 kHz.
- Sound wave generated by a piezo transducer; transmitter and receiver may be separate or combined.
- The sound beam propagates as a cone with opening angles around 20 to 40 degrees, so regions of constant depth are segments of an arc (a sphere in 3D).
[Figure: typical intensity distribution of an ultrasonic sensor.]
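One common use of this constant-depth arc is painting each reading into a map as an arc of possible object positions. A sketch with illustrative names (n should be at least 2):

```c
#include <math.h>

/* Sample n points along the constant-depth arc for one sonar reading:
   sensor at (sx, sy), facing `heading` (radians), measured range r,
   cone half-angle `half_angle` (e.g., ~0.26 rad for a 30 deg cone). */
void sonar_arc_points(double sx, double sy, double heading,
                      double r, double half_angle,
                      int n, double xs[], double ys[])
{
    for (int i = 0; i < n; i++) {
        double a = heading - half_angle
                 + (2.0 * half_angle * i) / (n - 1);
        xs[i] = sx + r * cos(a);   /* point on the arc of radius r */
        ys[i] = sy + r * sin(a);
    }
}
```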
Slide 14: SRF10 Sensor
- Range: 3 cm to 6 m.
- See also www.acroname.com.
Slide 15: SRF10 Characteristics
Slide 16: SRF10 Characteristics (previous years)
Slide 17: Ultrasonic Sensor Problems (4.1.6)
- Soft surfaces that absorb most of the sound energy.
- Undesired reflections from non-perpendicular surfaces: specular reflection and foreshortening.
- Cross-talk between sensors.
- What if the robot is moving, or the sensor is moving (on a servo motor)?
- What if another robot with the same sensor is nearby?
Slide 18: Optical Triangulation (1D) (4.1.6)
Principle of 1D triangulation: a collimated laser beam hits the target; the reflected beam passes through a lens and strikes a position-sensitive device (PSD) or linear camera at position x. With baseline L between the beam and the lens, the range D is proportional to 1/x (D = f·L / x, with f the lens-to-detector distance).
Datasheet: http://www.acroname.com/robotics/parts/SharpGP2D12-15.pdf
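The formula as code; f (the lens-to-detector distance) is named here as an assumption, since the slide labels only D, L, and x.

```c
/* 1D optical triangulation: range D = f * L / x, so D is
   proportional to 1/x. Returns -1.0 if the spot position is
   invalid (target out of range or off the detector). */
double triangulation_range(double f, double L, double x)
{
    if (x <= 0.0)
        return -1.0;
    return f * L / x;
}
```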
Slide 19: Sharp Optical Rangefinder (aka ET sensor)
Slide 20: Sharp Optical Rangefinder (previous years)
Slide 21: IR Sensor (aka Top Hat sensor)
Used for: line following, barcode reading, encoders.
Slide 22: Ground-Based Active and Passive Beacons (4.1.5)
- An elegant way to solve the localization problem in mobile robotics.
- Beacons are signaling guiding devices with a precisely known position.
- Beacon-based navigation has been used since humans started to travel:
  - natural beacons (landmarks) like stars, mountains, or the sun;
  - artificial beacons like lighthouses.
- The Global Positioning System (GPS) revolutionized modern navigation technology:
  - already one of the key sensors for outdoor mobile robotics;
  - not applicable for indoor robots.
- Major drawback of beacons indoors: they require changes in the environment, which is costly and limits flexibility and adaptability to changing environments.
Slide 23: Global Positioning System (GPS) (4.1.5)
- Developed for military use; later became accessible for commercial applications.
- 24 satellites (including three spares) orbit the earth every 12 hours at a height of 20,190 km.
- Four satellites are located in each of six planes inclined 55 degrees with respect to the plane of the earth's equator.
- The location of any GPS receiver is determined through time-of-flight measurements.
- Technical challenges:
  - time synchronization between the individual satellites and the GPS receiver;
  - real-time update of the exact location of the satellites;
  - precise measurement of the time of flight;
  - interference with other signals.
Slide 24: Global Positioning System (GPS) (4.1.5)
- Satellites broadcast synchronized transmissions of their location and the current time.
- The GPS receiver is passive (receive-only).
- Four satellites are needed to solve for (x, y, z) plus the receiver's time correction.
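A sketch of why four satellites suffice: each pseudorange rho_i = (time of flight)·c gives one equation ||p - sat_i|| + b = rho_i, where b is the unknown receiver clock bias expressed in meters, so four satellites determine (x, y, z, b) by Newton iteration. All names are illustrative assumptions; real receivers add satellite ephemeris updates, atmospheric corrections, and least squares over more satellites.

```c
#include <math.h>

/* Naive 4x4 linear solve A*dx = r by Gaussian elimination
   (no pivoting; adequate for a well-conditioned sketch). */
static void solve4(double A[4][4], double r[4], double dx[4])
{
    for (int k = 0; k < 4; k++)
        for (int i = k + 1; i < 4; i++) {
            double m = A[i][k] / A[k][k];
            for (int j = k; j < 4; j++) A[i][j] -= m * A[k][j];
            r[i] -= m * r[k];
        }
    for (int i = 3; i >= 0; i--) {
        dx[i] = r[i];
        for (int j = i + 1; j < 4; j++) dx[i] -= A[i][j] * dx[j];
        dx[i] /= A[i][i];
    }
}

/* Newton iteration for the receiver fix from four satellites at
   known positions sat[i] and measured pseudoranges rho[i] (meters).
   p holds the initial position guess and is updated in place;
   *bias_m is the receiver clock bias in meters (c * clock error). */
void gps_fix(const double sat[4][3], const double rho[4],
             double p[3], double *bias_m)
{
    double x[4] = { p[0], p[1], p[2], *bias_m };
    for (int it = 0; it < 10; it++) {
        double J[4][4], res[4], dx[4];
        for (int i = 0; i < 4; i++) {
            double ex = x[0] - sat[i][0];
            double ey = x[1] - sat[i][1];
            double ez = x[2] - sat[i][2];
            double range = sqrt(ex*ex + ey*ey + ez*ez);
            J[i][0] = ex / range;   /* unit line-of-sight vector */
            J[i][1] = ey / range;
            J[i][2] = ez / range;
            J[i][3] = 1.0;          /* d(pseudorange)/d(bias) */
            res[i] = rho[i] - (range + x[3]);   /* residual */
        }
        solve4(J, res, dx);
        for (int j = 0; j < 4; j++) x[j] += dx[j];
    }
    p[0] = x[0]; p[1] = x[1]; p[2] = x[2]; *bias_m = x[3];
}
```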
Slide 25: Laser Range Sensor (time of flight, electromagnetic) (1) (4.1.6)
- Transmitted and received beams are coaxial.
- The transmitter illuminates a target with a collimated beam.
- The receiver detects the time needed for the round trip.
- A mechanical mechanism with a mirror sweeps the beam for 2D or 3D measurement.
Slide 26: Laser Range Sensor (time of flight, electromagnetic) (2) (4.1.6)
Time-of-flight measurement approaches:
- Pulsed laser: measure the elapsed time directly; requires resolving picoseconds.
- Beat frequency between a frequency-modulated continuous wave and its received reflection.
- Phase-shift measurement to produce a range estimate; technically easier than the two methods above.
Slide 27: Laser Range Sensor (time of flight, electromagnetic) (3) (4.1.6)
Phase-shift measurement: the beam is modulated at frequency f, giving wavelength λ = c/f, where c is the speed of light and f is the modulating frequency; D' is the total distance covered by the emitted light. For f = 5 MHz (as in the AT&T sensor), λ = 60 m.
Slide 28: Laser Range Sensor (time of flight, electromagnetic) (4) (4.1.6)
- Distance D between the beam splitter and the target: D = (λ / 4π)·θ (2.33), where θ is the phase difference between the transmitted and reflected light beams.
- Range estimates are theoretically ambiguous: θ wraps every 2π, so ranges repeat every λ/2. For example, with λ = 60 m, a target at a range of 5 m produces the same phase as a target at 35 m or 65 m.
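The same relations as a small C function; names are illustrative.

```c
/* Phase-shift ranging per (2.33): lambda = c / f and
   D = (lambda / (4 * pi)) * theta. The result is only defined
   modulo lambda / 2, because the measured phase wraps every 2*pi. */
double phase_shift_range(double theta_rad, double mod_freq_hz)
{
    const double c = 299792458.0;             /* speed of light, m/s */
    const double pi = 3.14159265358979323846;
    double lambda = c / mod_freq_hz;          /* ~60 m at f = 5 MHz */
    return lambda * theta_rad / (4.0 * pi);
}
```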
Slide 29: Laser Range Sensor (time of flight, electromagnetic) (5) (4.1.6)
- The quality of the range (phase) estimate depends on signal strength: the variance of the estimate is inversely proportional to the square of the received signal amplitude. Hence dark, distant objects will not produce range estimates as good as those of closer, brighter objects.
Slide 30: Laser Range Sensor (time of flight, electromagnetic) (4.1.6)
Typical range image of a 2D laser range sensor with a rotating mirror. The length of the lines through the measurement points indicates the uncertainties.
Slide 31: Vision-Based Sensors: Sensing (4.1.8)
- Visual range sensors: depth from focus, stereo vision.
- Motion and optical flow.
- Color tracking sensors.
Slide 32: Vision-Based Sensors: Hardware (4.1.8)
- CCD: an array of light-sensitive, discharging capacitors, 5 to 25 microns each.
- CMOS: Complementary Metal Oxide Semiconductor technology.
Slide 33: Color Tracking Sensors (4.1.8)
Motion estimation of the ball and robot for soccer playing, using color tracking.
Slide 34: Robot Formations Using Color Tracking
Slide 35: Image Representation
Pixels run from (1,1) to (640,480). RGB color examples:
- R = (255, 0, 0)
- G = (0, 255, 0)
- B = (0, 0, 255)
- Yellow = (255, 255, 0)
- Magenta = (255, 0, 255)
- Cyan = (0, 255, 255)
- White = (255, 255, 255)
Slide 36: Image Representation: YCrCb
Illumination data is stored in a separate channel (may be more resistant to illumination changes). The R-G-B channels map to Cr-Y-Cb, where:
- Y = 0.59·G + 0.31·R + 0.11·B (illumination)
- Cr = R - Y
- Cb = B - Y
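The conversion written out in C, with a hypothetical struct and function name; Cr and Cb come out signed here (hardware pipelines typically add an offset to keep them in 0..255).

```c
typedef struct { double y, cr, cb; } YCrCb;

/* RGB -> YCrCb with the coefficients from the slide. */
YCrCb rgb_to_ycrcb(double r, double g, double b)
{
    YCrCb out;
    out.y  = 0.59 * g + 0.31 * r + 0.11 * b;   /* illumination */
    out.cr = r - out.y;                        /* red chrominance */
    out.cb = b - out.y;                        /* blue chrominance */
    return out;
}
```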
Slide 37: CMUcam
- Ubicom SX28 microcontroller with 136 bytes of SRAM.
- 8-bit RGB or YCrCb.
- Max resolution: 352 x 288 pixels.
- Tracking resolution is limited to 80 horizontal x 143 vertical pixels because only every other line is processed.
[Diagram: full image grid (1,1) to (352,288); tracking grid (1,1) to (80,143).]
Slide 38: CMUcam Operation (see the usage sketch after slide 40)
- init_camera():
  - auto-gain adjusts the brightness level of the image;
  - white balance adjusts the gains of the color channels to accommodate non-pure-white ambient light.
- clamp_camera_yuv(): point the camera at a white surface under your typical lighting conditions and wait about 15 seconds.
- trackRaw(rmin, rmax, gmin, gmax, bmin, bmax)
- A GUI interface is available for capturing images and checking colors.
Slide 39: CMUcam Tracking
Global variables:
- track_size ... in pixels
- track_x, track_y
- track_area ... area of the bounding box
- track_confidence
(Tracking coordinates run from (1,1) to (80,143).)
Slide 40: CMUcam: Better Tracking
- Auto-gain adjusts the brightness level of the image.
- White balance adjusts the color gains on a frame-by-frame basis, aiming for an average color of gray; it works great until a solid color fills the image.
- One strategy: use CrYCb (applied in the sketch below):
  - aim at the desired target and look at a dumped frame (in the GUI);
  - set the Cr and Cb bounds from the frame dump;
  - set very relaxed Y (illumination) bounds.
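A hypothetical end-to-end loop combining the calls from slide 38 with this CrYCb strategy. The bound values and steering stubs are illustrative, and the extern declarations are assumptions about the course library's signatures.

```c
/* In YCrCb mode the (rmin..bmax) slots of trackRaw carry the
   Cr, Y, Cb bounds, per the R-G-B -> Cr-Y-Cb mapping on slide 36. */
extern void init_camera(void);
extern void clamp_camera_yuv(void);
extern void trackRaw(int rmin, int rmax, int gmin, int gmax,
                     int bmin, int bmax);
extern int track_x, track_y, track_confidence;

void follow_colored_target(void)
{
    init_camera();        /* enables auto-gain and white balance */
    clamp_camera_yuv();   /* calibrate on a white surface first */
    while (1) {
        /* tight Cr and Cb bounds from a frame dump, relaxed Y bounds */
        trackRaw(140, 220, 30, 240, 80, 130);
        if (track_confidence > 50) {       /* blob found reliably */
            if (track_x < 35)      { /* target left of center: steer left */ }
            else if (track_x > 45) { /* target right of center: steer right */ }
        }
    }
}
```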
Slide 41: Adaptive Human-Motion Tracking (4.1.8)