Thrust IIA: Environmental State Estimation and Mapping
Dieter Fox (Lead), Nicholas Roy
MURI 8 Kickoff Meeting, 2007


Task Objective: Human-Centered Maps
 Observation: automatic map-building (SLAM) is solved sufficiently well
 Goal: describe environments by higher-level concepts:
   Places (room, hallway, street, walkway, parking lot, …)
   Objects (tree, person, building, car, wall, …)
 Key challenges:
   Estimating concept types is mostly a discrete problem
   Complex features and relationships

Existing Technology
 Human-centered mapping requires:
   Integration of high-dimensional, continuous features from multi-modal sensor data
   Reasoning about spatial and temporal relationships
 Conditional Random Fields provide an extremely flexible probabilistic framework for learning and inference

Conditional Random Fields
 Discriminative, undirected graphical model
 Introduced for labeling sequence data, to overcome weaknesses of Hidden Markov Models [Lafferty-McCallum-Pereira: ICML-01]
 Applied successfully to:
   Natural language processing [McCallum-Li: CoNLL-03], [Roth-Yih: ICML-05]
   Computer vision [Kumar-Hebert: NIPS-04], [Quattoni-Collins-Darrell: NIPS-05]
   Robotics [Limketkai-Liao-Fox: IJCAI-05], [Douillard-Fox-Ramos: IROS-07]

Conditional Random Fields
 Directly model the conditional probability p(x|z), instead of modeling p(z|x) and p(x) and using Bayes rule to infer p(x|z)
 No independence assumption on the observations is needed
 (Graph: hidden states x, connected to observations z)
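The conditional model above can be made concrete with a toy linear-chain CRF. Everything below is illustrative: the potentials, weights, and observation sequence are invented, and p(x|z) is computed by brute-force enumeration over all label sequences rather than the belief propagation one would use at scale.

```python
import itertools
import math

# Toy linear-chain CRF over hidden binary labels x_i with real-valued
# observations z_i. All weights here are made-up illustrative numbers.

def obs_potential(x_i, z_i):
    # Observation feature: reward agreement between label and sign of z_i.
    return math.exp(2.0 * z_i if x_i == 1 else -2.0 * z_i)

def pair_potential(x_i, x_j):
    # Pairwise feature: neighboring labels prefer to agree.
    return math.exp(1.0 if x_i == x_j else -1.0)

def crf_conditional(z):
    """Compute p(x | z) by enumerating all 2^n label sequences."""
    n = len(z)
    scores = {}
    for x in itertools.product([0, 1], repeat=n):
        s = 1.0
        for i in range(n):
            s *= obs_potential(x[i], z[i])
        for i in range(n - 1):
            s *= pair_potential(x[i], x[i + 1])
        scores[x] = s
    partition = sum(scores.values())  # normalizer, a function of z only
    return {x: s / partition for x, s in scores.items()}

p = crf_conditional([0.8, 0.5, -0.3])
best = max(p, key=p.get)  # MAP labeling
```

Note how the model never represents p(z): the observations enter only through the potentials, which is why no independence assumption on z is required. Here the pairwise smoothing term pulls the third label to 1 even though its observation weakly favors 0.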

Online Object Recognition [Douillard-Fox-Ramos: IROS-07, ISRR-07]

From Laser Scans to CRFs
(Figure: a CRF with one hidden node per laser beam, 1 through n, each encoding the object type of that beam; the nodes are connected to each other and to shape and appearance features.)

Visual Features
 Steerable pyramid
 3-d RGB histogram
 3-d HSV histogram
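As a concrete sketch of one of these features, the snippet below computes a 3-d RGB histogram in pure Python. The bin count and the tiny synthetic "image" are illustrative assumptions, not details of the original system; pixels are assumed to be 8-bit (r, g, b) tuples.

```python
# Minimal 3-d RGB histogram feature: joint binning over the three
# color channels, flattened into a bins^3 vector and normalized.

def rgb_histogram(pixels, bins=4):
    """Return a normalized, flattened bins^3 color histogram."""
    hist = [0] * (bins ** 3)
    step = 256 // bins  # width of each channel bin (8-bit channels)
    for r, g, b in pixels:
        idx = (r // step) * bins * bins + (g // step) * bins + (b // step)
        hist[idx] += 1
    total = len(pixels)
    return [h / total for h in hist]

# Tiny synthetic image: two red-ish and two blue-ish pixels.
pixels = [(250, 10, 10), (240, 5, 0), (10, 10, 250), (0, 0, 200)]
hist = rgb_histogram(pixels, bins=4)
```

The joint 3-d binning (rather than three separate 1-d histograms) preserves color co-occurrence, which is what makes such histograms useful as appearance features.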

Geometric Features

Temporal Integration
(Figure: laser scans at times k-2, k-1, k, k+1, linked over time.)
 Taking past and future scans into account can improve labeling accuracy
 Match consecutive laser scans using ICP
 Associated laser points are connected in the CRF
 Can perform online filtering or offline smoothing via belief propagation (BP)
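The scan matching step can be sketched as one iteration of 2-D point-to-point ICP: associate each point in the new scan with its nearest neighbor in the previous scan, then solve the closed-form 2-D rigid alignment. The scan data below is synthetic, and a real system would iterate the match/align loop and handle outliers.

```python
import math

# One iteration of 2-D point-to-point ICP between consecutive scans.

def icp_step(src, dst):
    """Nearest-neighbor association followed by the closed-form
    2-D rigid transform (theta, tx, ty) aligning src onto dst."""
    pairs = []
    for p in src:
        q = min(dst, key=lambda d: (d[0] - p[0]) ** 2 + (d[1] - p[1]) ** 2)
        pairs.append((p, q))
    n = len(pairs)
    # Centroids of the matched point sets.
    mx = sum(p[0] for p, _ in pairs) / n
    my = sum(p[1] for p, _ in pairs) / n
    nx = sum(q[0] for _, q in pairs) / n
    ny = sum(q[1] for _, q in pairs) / n
    # Closed-form rotation from centered correspondences.
    s_cross = sum((p[0] - mx) * (q[1] - ny) - (p[1] - my) * (q[0] - nx)
                  for p, q in pairs)
    s_dot = sum((p[0] - mx) * (q[0] - nx) + (p[1] - my) * (q[1] - ny)
                for p, q in pairs)
    theta = math.atan2(s_cross, s_dot)
    tx = nx - (mx * math.cos(theta) - my * math.sin(theta))
    ty = ny - (mx * math.sin(theta) + my * math.cos(theta))
    return theta, tx, ty, pairs

# Second "scan" is the first shifted by (0.1, 0): a pure translation.
scan_a = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0)]
scan_b = [(0.1, 0.0), (1.1, 0.0), (0.1, 1.0)]
theta, tx, ty, pairs = icp_step(scan_a, scan_b)
```

The point pairs returned here are exactly the associations that link laser points of consecutive scans in the temporal CRF.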

Example Trace: Car vs. Others
 Trained on 90 labeled scans
 Inference via filtering in the CRF

7-Class Example Labeling


Proposed Technical Advances
 Integrate recognition results into maps
 Improve results by leveraging web training data and high-level object detectors
 Add object types suited to the target scenario
 Improve CRF training

Situation Awareness via Wearable Sensors
(Device callouts: microphone, camera, light sensors, 2 GB SD card, indicator LEDs)
 Records 4 hours of audio, images (1/sec), GPS, and sensor data (accelerometer, barometric pressure, light intensity, gyroscope, magnetometer)

Soldier Activity Recognition
 Automatic generation of mission summaries:
   Motion type (linger, walk, run, drive, …)
   Environment (inside, outside building)
   Events (conversations, marked via keyword)
 Technical challenges:
   High-dimensional, continuous observations / features
   Different sensor data rates
   Getting labeled training data
   Different persons / environments

Activity Model [Subramanya-Raj-Bilmes-Fox: UAI-06, ISRR-06]
(Figure: a dynamic model with chains of environment states e_{t-1}, e_t (indoor, outdoor, vehicle), activity states a_{t-1}, a_t (walk, run, stop, up/downstairs, drive, elevator, cover), and additional latent states c_{t-1}, c_t, all evolving over time and conditioned on high-dimensional feature vectors computed from the sensor data.)
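The temporal inference in such a model can be sketched as a discrete Bayes filter over the activity state. The activity set, transition matrix, and observation likelihoods below are made-up illustrative numbers, not the learned parameters of the cited model.

```python
# Minimal discrete Bayes filter over an activity state.

ACTIVITIES = ["walk", "run", "stop", "drive"]

# p(a_t | a_{t-1}): strong self-transitions, since activities persist.
TRANS = {a: {b: (0.85 if a == b else 0.05) for b in ACTIVITIES}
         for a in ACTIVITIES}

def bayes_filter_step(belief, likelihood):
    """One predict + update step. `belief` is p(a_{t-1} | z_{1:t-1});
    `likelihood` maps each activity to p(z_t | a_t)."""
    predicted = {b: sum(TRANS[a][b] * belief[a] for a in ACTIVITIES)
                 for b in ACTIVITIES}
    unnorm = {a: likelihood[a] * predicted[a] for a in ACTIVITIES}
    norm = sum(unnorm.values())
    return {a: p / norm for a, p in unnorm.items()}

belief = {a: 1.0 / len(ACTIVITIES) for a in ACTIVITIES}  # uniform prior
# Two observations whose (made-up) likelihoods favor "run".
for _ in range(2):
    belief = bayes_filter_step(
        belief, {"walk": 0.2, "run": 0.7, "stop": 0.05, "drive": 0.05})
most_likely = max(belief, key=belief.get)
```

The actual model is richer: it couples the environment and activity chains and conditions on high-dimensional features, but the predict/update structure above is the core of its online filtering.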

Data Visualization / Summarization
(Screenshot: GPS traces, image sequence (currently in car), and a timeline of soldier activities)

Milestones
 Goals:
   Real-time wearable interface on a cell phone
   Data sharing among soldiers and robots
   Real-time display on a remote laptop

Milestones
 Year 1:
   Real-time data sharing between wearable sensor platforms
   Integration of object recognition into mapping
 Year 2:
   Real-time data sharing between soldiers, robots, and remote laptop
   Detection of specific soldier states / activities (moving, incapacitated, ...)