Common Sensing Techniques for Reactive Robots ( ) Sungmin Lee ( 이성민 ) Division of Electronic Engineering, Chonbuk National University Intelligent Systems & Robotics Lab.

Chapter objectives Describe the difference between active and passive sensors. Describe the types of behavioral sensor fusion. Define each of the following terms in one or two sentences: proprioception, exteroception, exproprioception, proximity sensor, logical sensor, false positive, false negative, hue, saturation, computer vision. Describe the problems of specular reflection, cross-talk, and foreshortening, and, given a 2D line drawing of surfaces, illustrate where each of these problems would be likely to occur. Given a small interleaved RGB image and a range of color values for a region, be able to 1) threshold on color and 2) construct a color histogram.

Contents 1. Logical sensors 2. Behavioral sensor fusion 3. Attributes of a sensor 4. Sensor categories 5. Computer vision 6. Case study 7. Summary

Motivation Sensing is tightly coupled with acting in reactive systems, so we need to know about sensors. What sensors are out there? –Ultrasonics and cameras are traditional favorites –The SICK laser ranger is gaining fast in popularity How would you describe them (attributes)? How would you decide which one to pick and use for an application?

Logical Sensors A unit of sensing, or module, that supplies a particular percept. It consists of the signal processing and the software post-processing. Can be easily implemented as a perceptual schema. Different sensors/perceptual schemas can produce the same percept - the motor schema doesn't care! –A behavior can pick whatever is available. Example: a ring of IRs, a ring of sonars –If a sensor fails, another can be substituted without deliberation or explicit modeling –Conflicts in allocation can be solved by using logical sensors (deliberation is required to assign)
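
As a concrete illustration of the idea, here is a minimal Python sketch of a logical sensor; the class and method names are hypothetical, not from the source. Any driver that supplies the same percept is interchangeable, so a failed sonar ring can be swapped for an IR ring without the behavior noticing.

```python
# Minimal sketch of a logical sensor; LogicalRangeSensor, is_healthy(), and
# read_ranges() are illustrative names, not an established API.

class LogicalRangeSensor:
    """Supplies a ring-of-ranges percept from whatever hardware is available."""

    def __init__(self, drivers):
        # drivers: candidate perceptual schemas, in order of preference
        self.drivers = drivers

    def percept(self):
        for driver in self.drivers:
            if driver.is_healthy():          # substitute on failure,
                return driver.read_ranges()  # no deliberation required
        raise RuntimeError("no working range sensor available")
```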

Active vs. Passive (Examples) Active sensors emit some form of energy and then measure the impact as a way of understanding the environment (e.g., ultrasonics, laser). Passive sensors receive energy already in the environment (e.g., camera). Passive sensors consume less energy, but often have signal-to-noise problems; active sensors often have restricted environments. (Figure: stereo camera pair, thermal sensor, laser ranger, sonars, bump sensor.)

Behavioral Sensor Fusion Sensor fusion is a broad term used for any process that combines information from multiple sensors into a single percept. In some cases multiple sensors are used when a particular sensor is too imprecise or noisy to give reliable data; adding a second sensor can give another "vote" for the percept. When a sensor leads the robot to believe that a percept is present but it is not, the error is called a false positive: the robot has made a positive identification of the percept, but it was false. Likewise, an error where the robot misses a percept is known as a false negative.
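
A minimal sketch of the "voting" idea, assuming each sensor reports a boolean "percept present"; the function name and threshold are illustrative, not from the source.

```python
# Majority vote across redundant detectors: requiring 2 of 3 votes suppresses
# a single sensor's false positive, at the cost of an extra false negative
# when two sensors miss the percept.

def fused_percept(readings, votes_needed=2):
    """readings: one boolean per sensor ('is the percept present?')."""
    return sum(readings) >= votes_needed

print(fused_percept([True, False, True]))  # -> True
```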

Sensing Model (Figure: Sensor/Transducer → Behavior → Action.)

Sensing in the Reactive Paradigm Each behavior has its own dedicated sensing. One behavior literally does not know what another behavior is doing or perceiving.

Behavioral Sensor Fusion: sensor fission (Figure: separate perceptual schemas each driving their own motor schema.) The term sensor fission is in part a take-off on the connotations of the word "fusion" in nuclear physics: in nuclear fusion, energy is created by forcing atoms and particles together, while in fission, energy is created by splitting them apart.

Behavioral Sensor Fusion: action-oriented sensor fusion (Figure: several perceptual schemas feeding a single motor schema.) This type of sensor fusion is called action-oriented sensor fusion to emphasize that the sensor data is being transformed into a behavior-specific representation in order to support a particular action, not for constructing a world model.

Behavioral Sensor Fusion: sensor fashion (Figure: one perceptual schema at a time selected for the motor schema.) Sensor fashion is an alliterative name intended to imply that the robot changes sensors with changing circumstances, just as people change styles of clothes with the seasons.
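
The three styles can be contrasted in a short, purely illustrative Python sketch; all names and the structure are our own, under the assumption that percepts and actions are plain values.

```python
# Sensor fission: one sensor feeds one behavior; only the ACTIONS combine.
def sensor_fission(sensors, behaviors, combine_actions):
    return combine_actions([b(s.percept()) for s, b in zip(sensors, behaviors)])

# Action-oriented fusion: percepts merge into one behavior-specific
# representation first, then a single behavior acts on it.
def action_oriented_fusion(sensors, to_local_representation, behavior):
    return behavior(to_local_representation([s.percept() for s in sensors]))

# Sensor fashion: pick the sensor suited to the current circumstances.
def sensor_fashion(sensors, pick_for_circumstances, behavior):
    return behavior(pick_for_circumstances(sensors).percept())
```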

Designing a Sensor Suite - Attributes of a sensor
–Field of view, range: does it cover the "right" area?
–Accuracy & repeatability: how well does it work?
–Responsiveness in target domain: how well does it work for this domain?
–Power consumption: may suck the batteries dry too fast
–Reliability: can be a bit flakey, vulnerable
–Size: always a concern!
–Computational complexity: can you process it fast enough?
–Interpretation reliability: do you believe what it's saying?

Designing a Sensor Suite - Attributes of a sensor suite
Should be considered for the entire sensing suite:
–Simplicity
–Modularity
–Redundancy: physical redundancy (several instances of physically identical sensors on the robot); logical redundancy (another sensor using a different sensing modality can produce the same percept or releaser); fault tolerance

Sensor Categories
–Proprioceptive: Inertial Navigation System (INS), Global Positioning System (GPS)
–Exteroceptive: Proximity (range, contact), Computer Vision

Proprioceptive Sensors (1) - Inertial navigation system (INS) INS measures movements electronically through miniature accelerometers, and can provide accurate dead reckoning to 0.1 percent of the distance traveled (roughly 1 m of drift after 1 km). However, this technology is unsuitable for mobile robots for several reasons (cost, size, etc.). (Photo: MQ-9 Reaper.)

Proprioceptive Sensors (2) - Global Positioning System (GPS) GPS works by receiving signals from satellites orbiting the Earth. GPS is not a complete solution to the dead reckoning problem in mobile robots: GPS does not work indoors (an environmental limit).

Proximity Sensors (1) - Sonar or ultrasonic Sonar refers to any system that uses sound to measure range (the term "sonar" is used for underwater vehicles); ground vehicles commonly use sonar at an ultrasonic frequency. Ultrasonic sensors generate high-frequency sound waves and evaluate the echo received back by the sensor, calculating the time interval between sending the signal and receiving the echo to determine the distance to an object. Ultrasonics are possibly the most common sensor on commercial robots. (Photo: Polaroid ultrasonic transducer.)
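
Because the echo travels out and back, the one-way range is half the round-trip time multiplied by the speed of sound. A minimal sketch of the calculation; the function name and the room-temperature sound speed are our assumptions.

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def sonar_range(echo_time_s):
    # Divide by two: the measured interval covers the trip out AND back.
    return SPEED_OF_SOUND * echo_time_s / 2.0

print(sonar_range(0.01))  # a 10 ms round trip -> about 1.7 m
```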

Proximity Sensors (1) - Three problems with sonar range readings: foreshortening, cross-talk, and specular reflection. Chairs and tables are particular trouble: their legs and edges are too thin for the sonar's resolution.

Proximity Sensors (1) - Sonar maps (Figure: maps produced by a mobile robot using sonars in a) a lab and b) a hallway; the black line is the path of the robot.)
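
A hedged sketch of how repeated noisy sonar readings can be accumulated into such a map; this assumes a simple count-based evidence grid, not necessarily the method used to produce the figures.

```python
import numpy as np

grid = np.zeros((100, 100))  # evidence per cell; positive = likely occupied

def sonar_update(hit_cell, beam_cells):
    grid[hit_cell] += 1.0      # the return says something is here
    for cell in beam_cells:
        grid[cell] -= 0.5      # the beam passed through: evidence of free space

# After many readings, cells above a threshold are treated as obstacles,
# which is how individually unreliable sonar returns average into a map.
obstacles = grid > 3.0
```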

Proximity Sensors (1) - Attributes of ultrasonic
–Power consumption: high
–Reliability: lots of problems
–Size: size of a half dollar; the board is of similar size and can be creatively packaged
–Computational complexity: low; doesn't give much information
–Interpretation reliability: poor

Proximity Sensors (1) - Ultrasonic summary Physics: active sensor, works on time of flight. Advantages: range, inexpensive ($30 US), small. Disadvantages: specular reflection, cross-talk, foreshortening, high power consumption, low resolution.

Proximity Sensors (2) - Infrared (IR) IR proximity sensors emit near-infrared energy and measure whether any significant amount of the IR light is returned. They often fail in practice because the emitted light is "washed out" by bright ambient lighting or absorbed by dark materials (i.e., the environment has too much noise). (Photo: Sharp GP2Y0A21YK.)

Proximity Sensors (3) - Bump and feeler sensors A popular class of robotic sensing is tactile, or touch, done with bump and feeler sensors. The sensitivity of a bump sensor can be adjusted for different contact pressures. (Photo: Roomba 500 bumper.)

Computer Vision - Definition Computer vision refers to processing data from any modality that uses the electromagnetic spectrum to produce an image. (Example: face recognition.)

Computer Vision - Attributes
–Physics: light reflecting off surfaces; responds to wavelength
–Field of view, range: depends on the lens; lenses typically have different vertical and horizontal fields of view (VFOV, HFOV)
–Accuracy & repeatability: good
–Responsiveness in target domain: depends on the lighting source and the inherent contrast between objects of interest
–Power consumption: low
–Reliability: good
–Size: can be miniaturized
–Interpretation reliability: good

Computer Vision - CCD cameras A charge-coupled device (CCD) is a device for the movement of electrical charge, usually from within the device to an area where the charge can be manipulated, for example converted into a digital value. Compared with CMOS: CCD sensors typically produce less noise and are more light-sensitive, while CMOS sensors use far less power and cost less to produce.

Computer Vision - Color planes RGB (red, green, blue) is the NTSC output, but has poor color constancy in the "real world." HSI (hue, saturation, intensity) has theoretical color constancy, but not when converted from RGB. An alternative is the SCT (Spherical Coordinate Transform), a color space designed to transform RGB data into a space that more closely duplicates the response of the human eye. (Figure: an original image and its RGB, HSI, and SCT planes.)
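
For reference, the sketch below shows one common textbook formulation of the RGB-to-HSI conversion; the slide gives no formulas, so this particular formulation is an assumption.

```python
import math

def rgb_to_hsi(r, g, b):
    # Normalize 8-bit channels to [0, 1].
    r, g, b = r / 255.0, g / 255.0, b / 255.0
    i = (r + g + b) / 3.0                            # intensity: channel average
    s = 0.0 if i == 0 else 1.0 - min(r, g, b) / i    # saturation
    num = 0.5 * ((r - g) + (r - b))
    den = math.sqrt((r - g) ** 2 + (r - b) * (g - b)) + 1e-12
    h = math.degrees(math.acos(max(-1.0, min(1.0, num / den))))
    if b > g:                                        # hue lives on a 360-degree circle
        h = 360.0 - h
    return h, s, i

print(rgb_to_hsi(200, 60, 40))  # a reddish pixel -> small hue angle
```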

Computer Vision - Common vision algorithms For reactive applications (both sketched in code below): –Color segmentation: imprint on a color region, then follow it (or remember it) –Color histogramming: imprint on a region with a distribution of color, then follow it (or remember it)
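
A minimal sketch of both techniques on an interleaved RGB image stored as a (rows, cols, 3) NumPy array; the function names and the example color range are our own.

```python
import numpy as np

def threshold_on_color(img, lo, hi):
    """Binary mask of pixels whose (R, G, B) falls inside [lo, hi] per channel."""
    lo, hi = np.asarray(lo), np.asarray(hi)
    return np.all((img >= lo) & (img <= hi), axis=2)

def color_histogram(img, bins=8):
    """Per-channel histograms; comparing them lets a robot re-find a region."""
    return [np.histogram(img[..., c], bins=bins, range=(0, 255))[0]
            for c in range(3)]

img = np.random.randint(0, 256, (48, 64, 3), dtype=np.uint8)  # stand-in image
mask = threshold_on_color(img, (100, 0, 0), (255, 80, 80))    # a reddish region
histograms = color_histogram(img)
```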

Range from Vision - Stereo camera pairs Using two cameras to extract range data is often referred to as range from stereo, stereo disparity, binocular vision, or just plain "stereo." One way to extract depth is to try to superimpose a camera over each eye: each camera finds the same point in each image, turns itself to center that point in the image, then measures the relative angle. The cameras are known as the stereo pair. (Figure: ways of extracting depth from a pair of cameras.)
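
Once the same point is matched in both images, depth follows from similar triangles. A sketch of the usual pinhole relation Z = f·B/d; the focal length, baseline, and disparity values are made-up illustrations.

```python
def depth_from_disparity(focal_px, baseline_m, disparity_px):
    # Z = f * B / d: larger disparity means the point is closer.
    return focal_px * baseline_m / disparity_px

print(depth_from_disparity(focal_px=700.0, baseline_m=0.12, disparity_px=35.0))
# -> 2.4 (meters)
```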

Range from Vision - Light stripers Light striping, light stripers, or structured light detectors work by projecting a colored line (or stripe), grid, or pattern of dots onto the environment; a regular vision camera then observes how the pattern is distorted in the image.

Range from Vision - Laser ranging (SICK)
–Accuracy & repeatability: excellent results
–Responsiveness in target domain
–Power consumption: high; reduces battery run time by half
–Reliability: good
–Size: a bit large
–Computational complexity: not bad until you try to "stack up" scans
–Interpretation reliability: much better than any other ranger
(Figure: scans of a flat surface, an obstacle, and a negative obstacle. Photo: SICK PLS100.)

Range from Vision - Laser ranger summary Physics: 180° plane. Advantages: high accuracy, coverage. Disadvantages: 2D, resistant to miniaturization, cost ($13,000 US). (Photo: NASA/CMU Nomad robot, Carnegie Mellon University.)
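
A planar scan like the SICK's is naturally polar: one range per bearing. A minimal sketch of converting it into Cartesian obstacle points, assuming one reading per degree over the 180° plane.

```python
import math

def scan_to_points(ranges):
    """ranges[i] = distance in meters at bearing i degrees, i = 0..180."""
    return [(r * math.cos(math.radians(bearing)),
             r * math.sin(math.radians(bearing)))
            for bearing, r in enumerate(ranges)]

points = scan_to_points([2.0] * 181)  # test scan: 2 m returns at every bearing
```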

Case Study: Hors d'Oeuvres, Anyone? (Borg Shark and Puffer Fish)
–Camera pair (redundant): face color
–Laser range: count treat removal
–Sonars: avoid obstacles; if blocked, puff up
–Digital thermometer: "face" temperature check
–Sensor fusion: reduced false positives and false negatives from 27.5% to 0%

State diagram for the Borg Shark (figure). States, with their percepts and sensors: waypoint navigation (move to goal, avoid; sonar, shaft encoders, evidence-grid map), serving food (finding faces, counting treat removal; vision, thermal, laser range, sonar), awaiting refill (finding faces; vision, thermal). Transitions: "at waypoint OR food removed," "serving time limit exceeded," "food depleted," "full tray."
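
One plausible reading of the diagram as a table-driven finite state machine; the state and event names come from the figure, but the wiring of events to transitions is our reconstruction.

```python
TRANSITIONS = {
    ("waypoint_navigation", "at_waypoint_or_food_removed"): "serving_food",
    ("serving_food", "serving_time_limit_exceeded"): "waypoint_navigation",
    ("serving_food", "food_depleted"): "awaiting_refill",
    ("awaiting_refill", "full_tray"): "waypoint_navigation",
}

def step(state, event):
    # Events that do not apply in the current state are simply ignored.
    return TRANSITIONS.get((state, event), state)

state = "waypoint_navigation"
state = step(state, "at_waypoint_or_food_removed")  # -> "serving_food"
```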

Summary Design of a sensor suite requires careful consideration –Almost all robots will have proprioception, but exteroception needs to be closely matched to the task and the environment. The most common exteroceptive sensors on mobile robots are ultrasonics, computer vision, and laser range. Color vision can be hard, and almost all vision is computationally expensive unless it focuses on affordances –the Borg Shark and Puffer Fish used color plus heat –Polly used texture

Thank you