
CNN Headline: “Police: 8-year-old shoots, kills elderly caregiver after playing video game.” First sentence: “An 8-year-old Louisiana boy intentionally shot and killed his elderly caregiver after playing a violent video game, authorities say.” First, how did the kid get a gun and load it without the caregiver noticing? Second, why was an 8-year-old playing GTA?

Reminders
Lab 1 due at 11:59pm TODAY
No class on Thursday – but Dana 3 will be open
Piazza (sorry, not pizza)
Lab 2: ARDrone
– Charge the batteries
– Don’t leave them charging unattended
– When flying, have someone “spot” for you

Last Time Reactive vs. Deliberative Architecture Temporal vs. Control Decomposition Serial vs. Parallel Decision making Subsumption Architecture Quick ROS overview How would I find the red ball?

Color Tracking Sensors Motion estimation of ball and robot for soccer playing using color tracking 4.1.8

How many black spots?

Perception
Sensors, Uncertainty, Features (Chapter 4)
[Figure: the mobile-robot control loop – Real World Environment -> Perception -> Localization (Environment Model, Local Map -> "Position", Global Map) -> Cognition (Path) -> Motion Control -> back to the environment]

Vision-based Sensors: Hardware
CCD (light-sensitive, discharging capacitors of 5 to 25 microns)
– Charge-coupled device, invented in 1969 at AT&T Bell Labs
CMOS (Complementary Metal Oxide Semiconductor) sensor
– Active pixel sensor
– Cheaper, lower power, traditionally lower quality 4.1.8

Blob (color) detection – Lab 2
Edge detection
SIFT (Scale-Invariant Feature Transform) features – invariant-feature-transform/
Deep learning

Depth from Focus (1) 4.1.8

Stereo Vision
Idealized camera geometry for stereo vision
– Disparity between the two images -> computation of depth
– From the figure it can be seen that depth z = f·b/d, where f is the focal length, b the baseline between the lenses, and d the disparity 4.1.8

Stereo Vision
1. Distance is inversely proportional to disparity
– closer objects can be measured more accurately
2. Disparity is proportional to b, the horizontal distance between the lenses
– For a given disparity error, the accuracy of the depth estimate increases with increasing baseline b.
– However, as b is increased, some objects may appear in one camera but not in the other.
3. A point visible from both cameras produces a conjugate pair.
– Conjugate pairs lie on epipolar lines (parallel to the x-axis for the arrangement in the figure above) 4.1.8
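A minimal numeric sketch of the relations above, assuming the standard pinhole formula z = f·b/d (focal length f in pixels, baseline b in meters, disparity d in pixels; all values are illustrative, not from the lecture):

```python
def depth_from_disparity(f_px, baseline_m, disparity_px):
    """Depth from stereo disparity: z = f * b / d (idealized parallel cameras)."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return f_px * baseline_m / disparity_px

def depth_error(f_px, baseline_m, z_m, disparity_err_px=0.5):
    """Depth uncertainty for a given disparity error: dz ~ z**2 / (f*b) * dd,
    so accuracy degrades quadratically with distance."""
    return z_m ** 2 / (f_px * baseline_m) * disparity_err_px

# f = 700 px, b = 0.12 m: a 14 px disparity corresponds to 6 m depth
print(depth_from_disparity(700.0, 0.12, 14.0))
# halving the distance quarters the depth error for the same disparity error
print(depth_error(700.0, 0.12, 6.0) / depth_error(700.0, 0.12, 3.0))
```

This makes point 1 concrete: the same half-pixel matching error costs far more depth accuracy on distant objects than on close ones.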

Stereo Vision Example
Extracting depth information from a stereo image
– a1 and a2: left and right image
– b1 and b2: vertical-edge-filtered left and right image; filter = [ ]
– c: confidence image: bright = high confidence (good texture)
– d: depth image: bright = close; dark = far
Artificial example: a bunch of fenceposts? 4.1.8

Adaptive Human-Motion Tracking 4.1.8

Intermission

Classification of Sensors
Proprioceptive sensors
– measure values internal to the system (robot)
– e.g. motor speed, wheel load, heading of the robot, battery status
Exteroceptive sensors
– acquire information from the robot’s environment
– e.g. distances to objects, intensity of the ambient light, unique features
Passive sensors
– measure energy coming from the environment
Active sensors
– emit their own energy and measure the reaction
– better performance, but some influence on the environment 4.1.1

General Classification (1) 4.1.1

General Classification (2) 4.1.1

Characterizing Sensor Performance (1)
Measurement in real-world environments is error prone
Basic sensor response ratings
– Dynamic range: ratio between the lower and upper limits, usually in decibels (dB, power)
e.g. power measurement from 1 milliwatt to 20 watts
e.g. voltage measurement from 1 millivolt to 20 volts
the factor is 20 instead of 10 because power is proportional to the square of the voltage!
– Range: upper limit 4.1.2

Characterizing Sensor Performance (2)
Basic sensor response ratings (cont.)
– Resolution: minimum difference between two values the sensor can distinguish
usually: lower limit of dynamic range = resolution
for digital sensors it is usually set by the analog-to-digital conversion, e.g. 5 V / 255 (8 bit)
– Linearity: variation of the output signal as a function of the input signal
linearity is less important when the signal is post-processed by a computer
– Bandwidth or Frequency: the speed with which a sensor can provide a stream of readings
usually there is an upper limit depending on the sensor and the sampling rate
a lower limit is also possible, e.g. an acceleration sensor 4.1.2
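The ratings above can be made concrete in a few lines, using the numeric examples from these slides (1 mW to 20 W, 1 mV to 20 V, and an 8-bit ADC over 5 V):

```python
import math

def dynamic_range_db(lower, upper, is_power=True):
    """Dynamic range in decibels: 10*log10 for power quantities,
    20*log10 for field quantities such as voltage (power ~ voltage**2)."""
    factor = 10 if is_power else 20
    return factor * math.log10(upper / lower)

def adc_resolution(full_scale, bits):
    """Smallest distinguishable step of an ideal ADC, e.g. 5 V / 255 for 8 bits."""
    return full_scale / (2 ** bits - 1)

print(dynamic_range_db(0.001, 20))                  # 1 mW to 20 W: ~43 dB
print(dynamic_range_db(0.001, 20, is_power=False))  # 1 mV to 20 V: ~86 dB
print(adc_resolution(5.0, 8))                       # ~0.0196 V per step
```

Note how the identical numeric span (a factor of 20,000) gives 43 dB as a power ratio but 86 dB as a voltage ratio.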

In Situ Sensor Performance (1)
Characteristics that are especially relevant for real-world environments
Sensitivity
– ratio of output change to input change
– however, in real-world environments the sensor very often has high sensitivity to other environmental changes, e.g. illumination
Cross-sensitivity
– sensitivity to environmental parameters that are orthogonal to the target parameters (e.g., a compass responding to building materials)
Error / Accuracy
– difference between the sensor’s output and the true value
– error = m − v, where m is the measured value and v is the true value 4.1.2

In Situ Sensor Performance (2)
Characteristics that are especially relevant for real-world environments
Systematic errors -> deterministic errors
– caused by factors that can (in theory) be modeled -> prediction
Random errors -> non-deterministic
– no prediction possible
– however, they can be described probabilistically
Precision
– reproducibility of sensor results 4.1.2
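A short sketch of these definitions: error = m − v as on the previous slide, accuracy as 1 − |m − v|/v, and precision as output range divided by the standard deviation of repeated readings (the definitions commonly used in the Siegwart & Nourbakhsh text this lecture follows; the numeric values are illustrative):

```python
import statistics

def error(m, v):
    """Signed error: measured value m minus true value v."""
    return m - v

def accuracy(m, v):
    """Accuracy = 1 - |m - v| / v; 1.0 means a perfect reading."""
    return 1 - abs(m - v) / v

def precision(readings, output_range):
    """Precision = sensor output range / std. dev. of repeated readings
    of the same quantity; higher means more reproducible."""
    return output_range / statistics.stdev(readings)

print(error(9.5, 10.0))                    # -0.5
print(accuracy(9.5, 10.0))                 # 0.95
print(precision([9.9, 10.0, 10.1], 20.0))  # ~200
```

Note that a sensor can be precise (tight spread) yet inaccurate (systematically offset), which is exactly the systematic-vs-random distinction above.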

Characterizing Error: The Challenges in Mobile Robotics
A mobile robot has to perceive, analyze and interpret the state of its surroundings
Measurements in real-world environments are dynamically changing and error prone. Examples:
– changing illumination
– specular reflections
– light- or sound-absorbing surfaces
– cross-sensitivity of robot sensors to robot pose and robot-environment dynamics
These are rarely possible to model -> they appear as random errors
Systematic errors and random errors might be well defined in a controlled environment. This is not the case for mobile robots! 4.1.2

Multi-Modal Error Distributions: The Challenges in …
The behavior of sensors is modeled by a probability distribution (random errors)
– usually very little knowledge about the causes of random errors
– often the probability distribution is assumed to be symmetric or even Gaussian
– however, it is important to realize how wrong this can be!
– Examples:
A sonar (ultrasonic) sensor might overestimate the distance in a real environment; its error distribution is therefore not symmetric.
Thus the sonar sensor might be best modeled by two modes: 1. the case that the signal returns directly, 2. the case that the signal returns after multi-path reflections.
A stereo vision system might correlate the two images incorrectly, thus causing results that make no sense at all 4.1.2
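A minimal sketch of such a two-mode sonar model. The mode probability, noise level, and uniform multi-path tail are all illustrative assumptions, not values from the lecture; the point is only that the resulting distribution is asymmetric and non-Gaussian:

```python
import random

def sample_sonar(true_dist, p_direct=0.8, sigma=0.02, max_range=5.0):
    """Two-mode sonar reading: with probability p_direct the echo returns
    directly (Gaussian around the true distance); otherwise a multi-path
    echo overestimates the distance, modeled here as uniform up to max_range."""
    if random.random() < p_direct:
        return random.gauss(true_dist, sigma)
    return random.uniform(true_dist, max_range)

# The mixture's mean sits noticeably above the true distance,
# so a single symmetric Gaussian would be a poor model.
random.seed(42)
readings = [sample_sonar(1.0) for _ in range(2000)]
print(sum(readings) / len(readings))  # clearly greater than 1.0
```

Probabilistic sensor models like this are exactly what beam-based localization methods feed on.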

Wheel / Motor Encoders (1)
Measure the position or speed of the wheels or steering
Wheel movements can be integrated to get an estimate of the robot’s position -> odometry
Optical encoders are proprioceptive sensors
– position estimation in relation to a fixed reference frame is therefore only valuable for short movements
Typical resolution: 2000 increments per revolution
– for high resolution: interpolation 4.1.3
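Converting encoder counts into wheel travel, and two wheels into an odometry increment, can be sketched as follows. The tick count matches the slide's 2000 increments per revolution; the wheel radius and axle length are illustrative assumptions:

```python
import math

def wheel_distance(ticks, ticks_per_rev=2000, wheel_radius=0.05):
    """Distance rolled by one wheel: fraction of a revolution times circumference."""
    return 2 * math.pi * wheel_radius * ticks / ticks_per_rev

def diff_drive_delta(left_ticks, right_ticks, axle_length=0.30):
    """Odometry increment of a differential-drive robot: forward distance
    of the axle midpoint and change in heading (radians)."""
    dl = wheel_distance(left_ticks)
    dr = wheel_distance(right_ticks)
    return (dl + dr) / 2.0, (dr - dl) / axle_length

print(wheel_distance(2000))          # one full revolution: 2*pi*0.05 m
print(diff_drive_delta(1000, 1000))  # straight motion: heading change 0.0
```

Integrating such increments step by step is odometry; since each step's error accumulates, the estimate is only trustworthy over short movements, as noted above.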

Wheel / Motor Encoders (2)
[Figure: optical encoder components – scanning reticle, fields, scale, slits]
Notice what happens when the direction changes:

Heading Sensors
Heading sensors can be proprioceptive (gyroscope, inclinometer) or exteroceptive (compass).
Used to determine the robot’s orientation and inclination.
Together with appropriate velocity information, they allow the movement to be integrated into a position estimate.
– This procedure is called dead reckoning (ship navigation) 4.1.4

Compass
In use since before 2000 B.C., when the Chinese suspended a piece of natural magnetite from a silk thread and used it to guide a chariot over land.
The Earth’s magnetic field provides an absolute measure of orientation.
Large variety of solutions for measuring the Earth’s magnetic field
– mechanical magnetic compass
– direct measurement of the magnetic field (Hall-effect, magnetoresistive sensors)
Major drawbacks
– weakness of the Earth’s field
– easily disturbed by magnetic objects or other sources
– not feasible in indoor environments 4.1.4

Gyroscope
Heading sensors that keep their orientation to a fixed frame
– absolute measure for the heading of a mobile system.
Two categories: mechanical and optical gyroscopes
– Mechanical gyroscopes
Standard gyro
Rate gyro
– Optical gyroscopes
Rate gyro 4.1.4

Mechanical Gyroscopes
Concept: inertial properties of a fast-spinning rotor
– gyroscopic precession
The angular momentum associated with a spinning wheel keeps the axis of the gyroscope inertially stable.
The reactive torque τ (tracking stability) is proportional to the spinning speed ω, the precession speed Ω and the wheel’s inertia I.
No torque can be transmitted from the outer pivot to the wheel axis
– the spinning axis will therefore be space-stable
Quality: 0.1° in 6 hours
If the spinning axis is aligned with the north-south meridian, the Earth’s rotation has no effect on the gyro’s horizontal axis
If it points east-west, the horizontal axis reads the Earth’s rotation 4.1.4

Rate Gyros
Same basic arrangement as regular mechanical gyros
But: the gimbal(s) are restrained by a torsional spring
– this makes it possible to measure angular speeds instead of the orientation.
Other, simpler gyroscopes use Coriolis forces to measure changes in heading

Optical Gyroscopes
First commercial use started only in the early 1980s, when they were first installed in airplanes.
Optical gyroscopes
– angular speed (heading) sensors using two monochromatic light (or laser) beams from the same source.
One beam travels clockwise through a fiber around a cylinder, the other counterclockwise
The laser beam traveling in the direction of rotation
– has a slightly shorter path -> shows a higher frequency
– the difference in frequency Δf of the two beams is proportional to the angular velocity Ω of the cylinder
New solid-state optical gyroscopes based on the same principle are built using microfabrication technology.
MUCH more accurate than mechanical gyros 4.1.4
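The Δf ∝ Ω relation is the Sagnac effect; for a ring-laser gyro it is commonly written Δf = 4AΩ/(λP), with enclosed area A, perimeter P, and wavelength λ. A small sketch with illustrative dimensions (a 10 cm × 10 cm square ring and a HeNe laser wavelength; not values from the lecture):

```python
def sagnac_delta_f(area_m2, perimeter_m, wavelength_m, omega_rad_s):
    """Ring-laser gyro beat frequency: delta_f = 4*A*Omega / (lambda*P)."""
    return 4.0 * area_m2 * omega_rad_s / (wavelength_m * perimeter_m)

EARTH_RATE = 7.292e-5  # rad/s, Earth's rotation rate

# 10 cm x 10 cm square ring: A = 0.01 m^2, P = 0.4 m; HeNe laser at 632.8 nm
f = sagnac_delta_f(0.01, 0.4, 632.8e-9, EARTH_RATE)
print(f)  # roughly 11.5 Hz: even Earth's rotation gives a measurable beat
```

The linearity of Δf in Ω is what makes these devices good rate gyros: doubling the rotation rate doubles the beat frequency.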

Ground-Based Active and Passive Beacons
An elegant way to solve the localization problem in mobile robotics
Beacons are signaling devices with a precisely known position
Beacon-based navigation has been used since humans started to travel
– Natural beacons (landmarks) like stars, mountains or the sun
– Artificial beacons like lighthouses
The Global Positioning System (GPS) revolutionized modern navigation technology
– Already one of the key sensors for outdoor mobile robotics
– For indoor robots, GPS is not applicable
Major drawbacks of beacons indoors:
– Beacons require changes in the environment -> costly
– They limit flexibility and adaptability to changing environments
Key design choice in Robocup –