Autonomous Mobile Robots, Chapter 4 (R. Siegwart, I. Nourbakhsh, with Skubic augmentations)
Perception: Sensors, Uncertainty, Features


Perception
- Sensors
- Uncertainty
- Features
[Diagram: the see-think-act loop. Perception feeds an environment model (local map) into Localization, which produces a "position" against the global map; Cognition plans a path; Motion Control executes it in the real-world environment.]

Example: HelpMate, Transition Research Corp. (4.1)

Example: B21, Real World Interface (4.1)

Example: Robart II, H.R. Everett (4.1)

Savannah River Site Nuclear Surveillance Robot (4.1)

BibaBot, BlueBotics SA, Switzerland (4.1)
- Pan-tilt camera
- Omnidirectional camera
- IMU (inertial measurement unit)
- Sonar sensors
- Laser range scanner
- Bumper
- Emergency stop button
- Wheel encoders

Our new robot: Killian (under development)
- Gripper with sensors: IR rangefinders, strain gauge
- Top sonar ring
- Bottom sonar ring
- Laser range-finder
- Stereo vision
- Laptop brain

General Classification (Table 4.1) (4.1.1)

General Classification (Table 4.1, cont.) (4.1.1)

Sensor Terminology
- Sensitivity
- Dynamic range
- Resolution
- Bandwidth
- Linearity
- Error
- Accuracy
- Precision
- Systematic errors
- Random errors

Active Ranging Sensors: Ultrasonic Sensor (4.1.6)
- Transmit a packet of (ultrasonic) pressure waves.
- The distance d of the echoing object can be calculated from the propagation speed of sound c and the time of flight t: d = c * t / 2.
- The speed of sound c (about 340 m/s) in air is given by c = sqrt(gamma * R * T), where gamma is the ratio of specific heats, R is the gas constant, and T is the temperature in kelvin.
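A minimal Python sketch of the two formulas above (names are illustrative; it uses the specific gas constant for dry air, so no molar mass term appears):

import math

GAMMA_AIR = 1.4      # ratio of specific heats for air
R_DRY_AIR = 287.05   # specific gas constant for dry air [J/(kg*K)]

def speed_of_sound(temp_kelvin):
    """c = sqrt(gamma * R * T)."""
    return math.sqrt(GAMMA_AIR * R_DRY_AIR * temp_kelvin)

def range_from_time_of_flight(tof_s, temp_kelvin=293.15):
    """d = c * t / 2: the pulse travels to the object and back."""
    return speed_of_sound(temp_kelvin) * tof_s / 2.0

print(speed_of_sound(293.15))            # ~343 m/s at 20 C
print(range_from_time_of_flight(0.020))  # a 20 ms echo -> ~3.4 m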

Ultrasonic Sensor (time of flight, sound) (4.1.6)
[Diagram: transmitted wave packet; analog echo signal compared against a threshold; digital echo signal; integrator whose output encodes the time of flight (sensor output).]
Effective range: typically 12 cm to 5 m.
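The threshold step in the diagram can be sketched as follows (a hypothetical sampled-signal interface; a real sensor does this in analog hardware):

def time_of_flight(echo_samples, sample_rate_hz, threshold, blanking_s=0.001):
    """Return the time of the first threshold crossing of the echo signal.

    blanking_s skips the initial transmit interval so the outgoing
    wave packet itself is not mistaken for an echo.
    """
    start = int(blanking_s * sample_rate_hz)
    for i in range(start, len(echo_samples)):
        if abs(echo_samples[i]) >= threshold:
            return i / sample_rate_hz
    return None  # no echo above threshold: no valid measurement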

Ultrasonic Sensor (time of flight, sound) (4.1.6)
- Typical operating frequency: 40-180 kHz.
- Sound wave generated by a piezo transducer; transmitter and receiver may be separate or combined.
- The sound beam propagates in a cone-like manner:
  - opening angles around 20 to 40 degrees
  - regions of constant depth are segments of an arc (a sphere in 3D), as sketched below
[Figure: typical intensity distribution of an ultrasonic sensor]
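Because a single reading only constrains the object to lie somewhere within the beam cone at the measured range, the candidate positions form an arc. A small sketch (assumed 30-degree beam, sensor at the origin facing +x):

import math

def sonar_arc_points(range_m, beam_deg=30.0, n=15):
    """Candidate object positions for one sonar reading: points on a
    circular arc of radius range_m spanning the beam's opening angle."""
    half = math.radians(beam_deg) / 2.0
    angles = [-half + i * (2.0 * half) / (n - 1) for i in range(n)]
    return [(range_m * math.cos(a), range_m * math.sin(a)) for a in angles]

print(sonar_arc_points(1.0, beam_deg=30.0, n=5))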

SRF10 Sensor
- Range: 3 cm to 6 m

SRF10 Characteristics

SRF10 Characteristics (previous years)

Ultrasonic Sensor Problems
- Soft surfaces absorb most of the sound energy.
- Undesired reflections from non-perpendicular surfaces: specular reflection, foreshortening.
- Cross-talk between sensors.
- What if the robot is moving, or the sensor is moving (on a servo motor)?
- What if another robot with the same sensor is nearby?

Optical Triangulation (1D)
Principle of 1D triangulation: a laser emits a collimated beam; the beam reflected by the target P passes through a lens and lands at position x on a position-sensitive device (PSD) or linear camera. With baseline L between laser and lens and lens focal length f, the distance is D = f * L / x, so distance is proportional to 1/x.
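A sketch of the relation above (the numeric values are made up):

def triangulation_distance(x_m, focal_length_m, baseline_m):
    """1D optical triangulation: D = f * L / x.

    x is where the reflected beam lands on the PSD/linear camera;
    as the target recedes, x shrinks, hence D proportional to 1/x.
    """
    return focal_length_m * baseline_m / x_m

# Example: f = 2 cm, baseline L = 10 cm, spot at x = 1 mm -> D = 2 m.
print(triangulation_distance(0.001, 0.02, 0.10))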

Sharp Optical Rangefinder (aka ET sensor)

Sharp Optical Rangefinder (previous years)

IR Sensor (aka Top Hat sensor)
Used for:
- Line following
- Barcode reading
- Encoders

Ground-Based Active and Passive Beacons
- An elegant way to solve the localization problem in mobile robotics.
- Beacons are signaling devices with a precisely known position.
- Beacon-based navigation has been used since humans started to travel:
  - natural beacons (landmarks) like stars, mountains, or the sun
  - artificial beacons like lighthouses
- The Global Positioning System (GPS) revolutionized modern navigation technology:
  - already one of the key sensors for outdoor mobile robotics
  - not applicable for indoor robots
- Major drawbacks of beacons indoors:
  - beacons require changes in the environment, which is costly
  - they limit flexibility and adaptability to changing environments

Global Positioning System (GPS) (4.1.5)
- Developed for military use; later became accessible for commercial applications.
- 24 satellites (including three spares) orbit the earth every 12 hours at a height of 20,190 km.
- Four satellites are located in each of six planes inclined 55 degrees with respect to the plane of the earth's equator.
- The location of any GPS receiver is determined through time-of-flight measurements.
- Technical challenges:
  - time synchronization between the individual satellites and the GPS receiver
  - real-time update of the exact location of the satellites
  - precise measurement of the time of flight
  - interference with other signals

Global Positioning System (GPS)
- Satellites synchronize transmissions of location and current time.
- The GPS receiver is passive.
- Four satellites provide (x, y, z) and a time correction, as the sketch below illustrates.
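A conceptual sketch of why four satellites suffice: each pseudorange constrains the receiver, and the four unknowns are position (x, y, z) plus the receiver clock bias. This Gauss-Newton least-squares solver ignores atmospheric delays, satellite clock corrections, and Earth rotation, so it is an illustration, not a usable GPS solver:

import numpy as np

def solve_receiver_position(sat_positions, pseudoranges, iterations=10):
    """Gauss-Newton fix from >= 4 satellites.

    Unknowns: receiver position p = (x, y, z) and clock bias b (in meters).
    Measurement model: rho_i = ||sat_i - p|| + b.
    """
    sats = np.asarray(sat_positions, dtype=float)
    rho = np.asarray(pseudoranges, dtype=float)
    state = np.zeros(4)  # start at Earth's center with zero bias
    for _ in range(iterations):
        p, b = state[:3], state[3]
        diff = sats - p
        dist = np.linalg.norm(diff, axis=1)
        residual = rho - (dist + b)
        # Jacobian of the model with respect to (x, y, z, b)
        J = np.hstack([-diff / dist[:, None], np.ones((len(rho), 1))])
        state += np.linalg.lstsq(J, residual, rcond=None)[0]
    return state[:3], state[3]

# Synthetic check: receiver at the origin with a 100 m clock bias.
sats = [(20.2e6, 0, 0), (0, 20.2e6, 0), (0, 0, 20.2e6), (13e6, 13e6, 13e6)]
rho = [np.linalg.norm(np.array(s)) + 100.0 for s in sats]
pos, bias = solve_receiver_position(sats, rho)
print(pos, bias)  # ~(0, 0, 0) and ~100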

Laser Range Sensor (time of flight, electromagnetic) (1) (4.1.6)
- Transmitted and received beams are coaxial.
- Transmitter illuminates a target with a collimated beam.
- Receiver detects the time needed for the round trip.
- A mechanical mechanism with a rotating mirror sweeps the beam, giving 2D or 3D measurements.

Laser Range Sensor (time of flight, electromagnetic) (2)
Time-of-flight measurement:
- Pulsed laser: measure the elapsed time directly; requires resolving picoseconds.
- Beat frequency between a frequency-modulated continuous wave and its received reflection.
- Phase-shift measurement to produce a range estimate: technically easier than the two methods above.

Laser Range Sensor (time of flight, electromagnetic) (3) (4.1.6)
Phase-shift measurement: the modulation wavelength is lambda = c / f, where c is the speed of light, f is the modulating frequency, and D' is the total distance covered by the emitted light. For f = 5 MHz (as in the AT&T sensor), lambda = 60 meters.

Laser Range Sensor (time of flight, electromagnetic) (4) (4.1.6)
- Distance between the beam splitter and the target: D = (lambda / (4 * pi)) * theta, where theta is the phase difference between the transmitted and reflected light beams.
- Range estimates are theoretically ambiguous, since the phase repeats every lambda/2 of range: with lambda = 60 meters, a target at a range of 5 meters reads the same as a target at 35 meters. A worked sketch follows.
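A worked sketch of the wavelength and range formulas, including the lambda/2 ambiguity:

import math

C = 299_792_458.0  # speed of light [m/s]

def phase_shift_range(theta_rad, mod_freq_hz):
    """D = (lambda / (4 * pi)) * theta, with lambda = c / f.

    theta is the phase difference between transmitted and reflected
    beams; the factor 4*pi (not 2*pi) accounts for the round trip.
    """
    wavelength = C / mod_freq_hz
    return wavelength * theta_rad / (4.0 * math.pi)

f = 5e6                       # 5 MHz modulation
print(C / f)                  # lambda ~ 60 m
print(phase_shift_range(math.pi / 3.0, f))                  # ~5 m
print(phase_shift_range(math.pi / 3.0 + 2.0 * math.pi, f))  # ~35 m: same measured phase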

Laser Range Sensor (time of flight, electromagnetic) (5) (4.1.6)
- Confidence in the range (phase estimate) is inversely proportional to the square of the received signal amplitude.
- Hence dark, distant objects will not produce as good range estimates as closer, brighter objects.

Laser Range Sensor (time of flight, electromagnetic)
Typical range image of a 2D laser range sensor with a rotating mirror. The length of the lines through the measurement points indicates the uncertainty.

Vision-based Sensors: Sensing (4.1.8)
- Visual range sensors: depth from focus, stereo vision
- Motion and optical flow
- Color tracking sensors

Vision-based Sensors: Hardware (4.1.8)
- CCD (light-sensitive, discharging capacitors of 5 to 25 microns)
- CMOS (Complementary Metal Oxide Semiconductor technology)

Color Tracking Sensors (4.1.8)
Motion estimation of ball and robot for soccer playing, using color tracking.

Robot Formations Using Color Tracking

Image Representation
- Pixel coordinates run from (1,1) in one corner to (640,480) in the other.
- R = (255,0,0), G = (0,255,0), B = (0,0,255)
- Yellow = (255,255,0), Magenta = (255,0,255), Cyan = (0,255,255), White = (255,255,255)

Image Representation: YCrCb
Illumination data is stored in a separate channel (may be more resistant to illumination changes). The R-G-B channels map to Cr-Y-Cb, where
Y = 0.3R + 0.59G + 0.11B (illumination)
Cr = R - Y
Cb = B - Y
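The mapping as a sketch, using the coefficients given above:

def rgb_to_ycrcb(r, g, b):
    """Per the slide: Y = 0.3R + 0.59G + 0.11B, Cr = R - Y, Cb = B - Y."""
    y = 0.3 * r + 0.59 * g + 0.11 * b
    return y, r - y, b - y

print(rgb_to_ycrcb(255, 0, 0))  # pure red:  (76.5, 178.5, -76.5)
print(rgb_to_ycrcb(128, 0, 0))  # dim red:   (38.4,  89.6, -38.4)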

CMUcam
- Ubicom SX28 microcontroller with 136 bytes of SRAM
- 8-bit RGB or YCrCb
- Max resolution: 352 x 288 pixels
- Tracking resolution is limited to 80 horizontal x 143 vertical pixels because the camera processes only every other line.

CMUcam Operation
- init_camera()
  - auto-gain: adjusts the brightness level of the image
  - white balance: adjusts the gains of the color channels to compensate for non-pure-white ambient light
- clamp_camera_yuv(): point the camera at a white surface under your typical lighting conditions and wait about 15 seconds
- trackRaw(rmin, rmax, gmin, gmax, bmin, bmax)
- GUI interface for capturing images and checking colors

CMUcam Tracking: Global Variables
- track_size ... in pixels
- track_x, track_y ... in image coordinates, which run from (1,1) to (80,143)
- track_area ... area of the bounding box
- track_confidence
A sketch of using these values follows.
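One hypothetical use of these globals: a proportional controller that steers the tracked blob's centroid toward the image center (the gain, sign convention, and confidence cutoff are made up for illustration):

IMAGE_CENTER_X = 40  # half of the 80-pixel tracking width

def steer_toward_target(track_x, track_confidence, gain=0.02, min_conf=0.3):
    """Turn so the tracked blob's centroid moves to the image center;
    output 0 when the track is too weak to trust."""
    if track_confidence < min_conf:
        return 0.0  # lost the target: do not chase noise
    return gain * (IMAGE_CENTER_X - track_x)  # positive -> steer left (assumed convention)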

CMUcam: Better Tracking
- Auto-gain: adjusts the brightness level of the image.
- White balance: adjusts the color gains on a frame-by-frame basis, aiming for an average color of gray; works great until a solid color fills the image.
- One strategy: use CrYCb.
  - Aim at the desired target and look at a dumped frame (in the GUI).
  - Set the Cr and Cb bounds from the frame dump.
  - Set a very relaxed Y (illumination) range, as in the sketch below.
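A host-side sketch of what such color tracking computes; this is not the CMUcam firmware, and the image is assumed to be a list of rows of (Y, Cr, Cb) tuples:

def track_color(image, y_bounds, cr_bounds, cb_bounds):
    """Bounding box, centroid, and confidence of in-range pixels.

    Tight Cr/Cb bounds plus a very relaxed Y range make the match
    depend on color rather than on illumination.
    """
    hits = [(x, y)
            for y, row in enumerate(image)
            for x, (Y, Cr, Cb) in enumerate(row)
            if y_bounds[0] <= Y <= y_bounds[1]
            and cr_bounds[0] <= Cr <= cr_bounds[1]
            and cb_bounds[0] <= Cb <= cb_bounds[1]]
    if not hits:
        return None
    xs = [h[0] for h in hits]
    ys = [h[1] for h in hits]
    bbox = (min(xs), min(ys), max(xs), max(ys))
    area = (bbox[2] - bbox[0] + 1) * (bbox[3] - bbox[1] + 1)
    centroid = (sum(xs) / len(xs), sum(ys) / len(ys))
    confidence = len(hits) / area  # dense blob -> high confidence
    return bbox, centroid, confidence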

Adaptive Human-Motion Tracking (4.1.8)