N.B. Please register for the course with the ITO Please attend a practical in the coming week: –Either 10:00, 13:05 Tuesday –Or 10:00, 13:05 Friday If you cannot attend at these times, please see me after the lecture.

Sensing the world
Key points:
–Why robots need sensing
–Factors that affect sensing capability
–Contact sensing
–Proximity and range sensing
–Sensing light

Why robots need sensing For a robot to act successfully in the real world it needs to be able to perceive the world, and itself in the world. Can consider sensing tasks in two broad classes: – Finding out what is out there: e.g. is there a goal; is this a team-mate; is there danger? = Recognition – Finding out where things are: e.g. where is the ball and how can I get to it; where is the cliff-edge and how can I avoid it? = Location But note that this need not be explicit knowledge

Sensing capability depends on a number of factors: 1 What signals are available? Light Pressure & Sound Chemicals

N.B. Many more signals in world than humans usually sense: e.g. Electric fish generate electric field and detect distortion

Sensing capability depends on a number of factors: 1 What signals are available? 2 What are the capabilities of the sensors?
–Distance senses: vision, hearing, smell
–Contact senses: taste, pressure, temperature
–Internal senses: balance, actuator position/movement, pain/damage

Note this differs across animals: e.g. bees see ultraviolet light. Need to choose what to build into a robot – options and costs. [Figure: the same scene in visible and in ultraviolet light – more like a target?]

Sensors perform transduction Transduction: transformation of energy from one form to another (typically, into electrical signals)

Sensors perform transduction
Sensor characteristics mean there is rarely an isomorphic mapping between the environment and the internal signal, e.g.:
–Most transducers have a limited range
–Most transducers have limited resolution, accuracy, and repeatability
–Most transducers have lags or sampling delays
–Many transducers have a non-linear response
–Biological transducers are often adaptive
–Good sensors are usually expensive in cost, power, size…
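These limitations can be sketched in code. The following toy sensor model is purely illustrative – the function name `transduce` and every parameter value are made up, not taken from any real datasheet:

```python
import random

def transduce(true_value, min_range=0.04, max_range=4.0,
              resolution=0.01, noise_sd=0.02):
    """Toy transducer: limited range, quantised resolution,
    and additive Gaussian noise (all values are invented)."""
    if true_value < min_range or true_value > max_range:
        return None                      # outside the sensor's range
    reading = true_value + random.gauss(0.0, noise_sd)
    reading = round(reading / resolution) * resolution   # quantise
    return min(max(reading, min_range), max_range)

print(transduce(1.234))   # roughly 1.23, plus noise
print(transduce(10.0))    # None: beyond the maximum range
```

A real model would also add lag (e.g. returning the reading from a previous time step) and a non-linear response curve.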

Sensing capability depends on a number of factors: 1 What signals are available? 2 What are the capabilities of the sensors? 3 What processing is taking place? E.g. extracting useful information from a sound signal is difficult:

Sound sources cause air vibration. A diaphragm (ear drum or microphone) has a complex pattern of vibration in response to sound. Usually analysed by separating frequencies and grouping through harmonic/temporal cues. [Figure: spectrogram – frequency against time]
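Frequency separation can be illustrated with a naive discrete Fourier transform – a toy stand-in for the FFT (or the cochlea); the tone frequencies, amplitudes, and sample rate below are arbitrary choices:

```python
import math

def dft_magnitudes(signal, sample_rate, freqs):
    """Naive DFT evaluated only at the chosen frequencies:
    correlate the signal with a sine and cosine at each one."""
    n = len(signal)
    mags = {}
    for f in freqs:
        re = sum(s * math.cos(2 * math.pi * f * i / sample_rate)
                 for i, s in enumerate(signal))
        im = sum(-s * math.sin(2 * math.pi * f * i / sample_rate)
                 for i, s in enumerate(signal))
        mags[f] = 2 * math.hypot(re, im) / n   # amplitude estimate
    return mags

# One second of a 440 Hz tone plus a quieter 880 Hz harmonic, at 8 kHz:
rate = 8000
sig = [math.sin(2 * math.pi * 440 * i / rate)
       + 0.3 * math.sin(2 * math.pi * 880 * i / rate)
       for i in range(rate)]
print(dft_magnitudes(sig, rate, [440, 880, 1200]))
```

The analysis recovers amplitude ≈ 1.0 at 440 Hz, ≈ 0.3 at 880 Hz, and ≈ 0 at the absent 1200 Hz – the "separating frequencies" step; grouping by harmonic/temporal cues is the harder part that follows.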

Sensing capability depends on a number of factors: 1 What signals are available? 2 What are the capabilities of our sensors? 3 What processing is taking place? 4 What is the task?

‘Classical’ view: the Task drives a single pipeline – Transduction → Processing → Internal model → Decision on Action → Plan of Action → Motor commands → Actuators.
Alternative view: each task has its own parallel pathway – task-specific transduction → task-specific processing → task-specific action (likewise for Task 2 and Task 3).

Affordances: “perceivable potentialities of the environment for an action”
Scan scene, build surface model, analyse surfaces, find flat one near feet vs use a special “flat surface near feet” detector
The first is traditional, the second affordance-based, i.e. sensors tuned for exactly what is needed for the task

Keep close count of how many times the white team pass their ball (Please remain silent till the end of the video clip)

Simons & Chabris (1999) – only about 50% of subjects see the unexpected gorilla walk through the scene. Using a “count white team passes” affordance rather than complete analysis.

Contact sensors Principal function is location –E.g. bump switch or pressure sensor: is the object contacting this part of the robot?

Contact sensors Principal function is location –E.g. bump switch or pressure sensor: is the object contacting this part of the robot? –Antennae: extend the range with flexible element

Contact sensors Can also use for recognition e.g. –Is it moving or are you? –Human touch can distinguish shape, force, slip, surface texture… –Rat whiskers used to distinguish textures

‘Contact’ sensors Note these kinds of sensors can also be used to detect flow e.g. wind sensors

Proximity and Range Sensors Again main function is position: distance to object at specific angle to robot Typically works by emitting signal and detecting reflection Short-range = proximity sensor, e.g. IR

Proximity and Range Sensors Over longer distance = range sensors e.g. Sonar: emit sound and detect reflection

a. Sonar reflection time gives range
b. Can only resolve objects of beam width
c. Apparent range shorter than axial range
d. Angle too large, so wall invisible
e. Invisible corner
f. False reflection makes apparent range greater
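Point (a) is just time-of-flight arithmetic: the pulse travels out and back, so range = speed of sound × echo time / 2. A minimal sketch, assuming ~343 m/s for sound in air at room temperature:

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 C

def sonar_range(echo_time_s):
    """Range from time-of-flight: the pulse makes a round trip,
    so the distance to the object is half the path travelled."""
    return SPEED_OF_SOUND * echo_time_s / 2.0

print(sonar_range(0.01))  # a 10 ms round trip is about 1.715 m
```

Note the speed of sound itself varies with temperature and pressure, so a careful implementation would calibrate this constant.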

Using sonar to construct an occupancy grid Robot wants to know about free space Map space as grid Each element has a value which is the probability it contains an obstacle Update probability estimates from sonar readings

Learning the map
Assuming the robot knows where it is in the grid, sensory input provides noisy information about obstacles, e.g. for sonar:
–Probability p(s|O) of measurement s given that grid element z = (r, θ) in region I is occupied (O)
–Using a Bayesian approach, p(O|s) = p(s|O)p(O) / (p(s|O)p(O) + p(s|¬O)p(¬O)), where the prior p(O) depends on previous measurements
[Figure: sonar beam of half-angle β and maximum range R, divided into regions I–III around the reading s]
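The Bayesian update for a single grid cell can be sketched as follows; the sensor-model numbers p(hit|O) = 0.7 and p(hit|¬O) = 0.3 are invented for illustration, not from any real sonar:

```python
def bayes_update(prior_occ, p_reading_given_occ, p_reading_given_free):
    """One Bayesian update of a cell's occupancy probability:
    p(O|s) = p(s|O) p(O) / (p(s|O) p(O) + p(s|~O) p(~O))."""
    num = p_reading_given_occ * prior_occ
    den = num + p_reading_given_free * (1.0 - prior_occ)
    return num / den

# Start from an uninformative prior and fuse three noisy "hit" readings:
p = 0.5
for _ in range(3):
    p = bayes_update(p, 0.7, 0.3)
print(round(p, 3))
```

Each consistent reading pushes the cell's probability further towards "occupied" (here to about 0.927), which is exactly the fusion of multiple noisy observations shown in the sample grid.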

Sample occupancy grid Noisy fusion of multiple sonar observations

Proximity and Range Sensors
More accurate information (same principle) from a laser range finder:
–Either planar or scanning
–1,000,000 pixels per second
–Range of 30 metres
–Accuracy of a few mm
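A planar scan of evenly spaced (bearing, range) readings is usually converted to Cartesian points in the robot frame before mapping; a minimal sketch (the function name `scan_to_points` is hypothetical):

```python
import math

def scan_to_points(ranges, start_angle, angle_step):
    """Convert a planar laser scan (ranges at evenly spaced bearings,
    robot frame, angles in radians) to (x, y) points."""
    points = []
    for i, r in enumerate(ranges):
        a = start_angle + i * angle_step
        points.append((r * math.cos(a), r * math.sin(a)))
    return points

# Three beams at -90, 0, +90 degrees, each seeing an object 2 m away:
pts = scan_to_points([2.0, 2.0, 2.0], -math.pi / 2, math.pi / 2)
print([(round(x, 2), round(y, 2)) for x, y in pts])
```

Real scan data would also need out-of-range returns filtered out before conversion.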

Sample Laser Scan

Light sensors
Why is it so useful to detect light?
–Light travels in straight lines, so the rays reflected from objects can be used to form an image, giving you ‘where’.
–Very short wavelengths give detailed structural information (including reflectance properties of surfaces, seen as colour) to determine ‘what’.
–Very fast, so it is especially useful over large distances.
–But it requires half our brain to do vision…

Conclusions
Robots need sensing: location, objects, obstacles
Commonly used sensors: laser range, sonar, contact, proprioceptive, GPS (outdoors), markers
General scene scanning vs affordances