Conditional Random Fields and their Application to Labeling Objects and Places
Dieter Fox, Stephen Friedman, Lin Liao, Benson Limketkai
University of Washington, Department of Computer Science & Engineering, Robotics and State Estimation Lab

Slide 2: Relational Object Maps (RO-Maps)
► Current maps (topological, occupancy, landmark) do not provide object-level descriptions of environments
► Goal: describe environments in terms of objects (doors, walls, furniture, etc.) and places (hallways, rooms, open spaces)

Slide 3: Relational Object Maps (RO-Maps)

Slide 4: Context is Crucial
► Needed:
 - Probabilistic models to reason about complex spatial constraints
 - Techniques to learn the parameters of such models

Slide 5: Overview
► Conditional Random Fields
► Low-level detection of doors and walls
► High-level place labeling
► Future work

Slide 6: Conditional Random Fields (CRFs) [Lafferty et al., ICML 2001]
► Undirected graphical model
► Introduced for labeling sequence data
► No independence assumption on observations!
► Extremely flexible
(Figure: hidden variables Y, observations X)

Slide 7: Probabilities in CRFs
► Conditional probability defined via clique potentials (non-negative functions over variable values in cliques of the graph)
(Figure: hidden variables Y, observations X)

Slide 8: Probabilities in CRFs
► Conditional probability defined via clique potentials (non-negative functions over variable values in cliques of the graph)
► Partition function Z(x) normalizes the probabilities (necessary since the potentials are not normalized, unlike in directed models)
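In the standard CRF formulation (following Lafferty et al. 2001), the conditional distribution and partition function take the form:

```latex
p(\mathbf{y} \mid \mathbf{x}) = \frac{1}{Z(\mathbf{x})} \prod_{c \in \mathcal{C}} \phi_c(\mathbf{x}, \mathbf{y}_c),
\qquad
Z(\mathbf{x}) = \sum_{\mathbf{y}'} \prod_{c \in \mathcal{C}} \phi_c(\mathbf{x}, \mathbf{y}'_c)
```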

Slide 9: Log-linear Potential Representation
► Typically, potentials are defined via a log-linear model (a linear combination of a weight vector and a feature vector extracted from the variable values)
► Thus the conditional probability is a normalized exponential of a weighted sum of features
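Written out in the standard notation (not copied from the slide image), the log-linear potential and the resulting conditional are:

```latex
\phi_c(\mathbf{x}, \mathbf{y}_c) = \exp\big(\mathbf{w}_c^{\top} \mathbf{f}_c(\mathbf{x}, \mathbf{y}_c)\big)
\quad\Rightarrow\quad
p(\mathbf{y} \mid \mathbf{x}) = \frac{1}{Z(\mathbf{x})} \exp\Big(\sum_{c} \mathbf{w}_c^{\top} \mathbf{f}_c(\mathbf{x}, \mathbf{y}_c)\Big)
```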

Slide 10: Inference in CRFs
► Compute conditional probabilities via local message passing, called belief propagation (BP)
► BP is exact if the network has no loops (tree)
► Corresponds to smoothing for linear-chain CRFs
► General networks: loopy BP (might not converge)
► Can also compute the MAP configuration
► Alternative: sample configurations via MCMC
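For pairwise potentials, the message from node i to node j and the resulting node belief take the textbook sum-product form:

```latex
m_{i \to j}(y_j) = \sum_{y_i} \phi_i(\mathbf{x}, y_i)\, \phi_{ij}(y_i, y_j) \prod_{k \in N(i) \setminus \{j\}} m_{k \to i}(y_i),
\qquad
b_i(y_i) \propto \phi_i(\mathbf{x}, y_i) \prod_{k \in N(i)} m_{k \to i}(y_i)
```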

Slide 11: Discriminative Training of CRFs
► Maximize the conditional likelihood of the labeled data
► Conjugate gradient descent
 - Compute the gradient of the log-likelihood w.r.t. the weights
 - Inference at each maximization step
 - Optional: maximize the conditional pseudo-likelihood instead
► Typically a zero-mean shrinkage prior on the weights
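For a log-linear CRF with a zero-mean Gaussian prior of variance σ² on the weights, the gradient of the penalized conditional log-likelihood is the usual empirical-minus-expected feature counts, which is why each conjugate-gradient step requires inference:

```latex
\nabla_{\mathbf{w}} \Big( \log p(\mathbf{y} \mid \mathbf{x}, \mathbf{w}) - \tfrac{\|\mathbf{w}\|^2}{2\sigma^2} \Big)
= \mathbf{f}(\mathbf{x}, \mathbf{y})
- \mathbb{E}_{p(\mathbf{y}' \mid \mathbf{x}, \mathbf{w})}\big[\mathbf{f}(\mathbf{x}, \mathbf{y}')\big]
- \frac{\mathbf{w}}{\sigma^2}
```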

Slide 12: Overview
► Conditional Random Fields
► Low-level detection of doors and walls
► High-level place labeling
► Future work

Slide 13: Relational Object Maps
► Objects: doors, wall segments, other
 - Built from geometric primitives (line segments)
 - More complex objects can be generated from existing ones via physical aggregation

Slide 14: Relational Object Maps
► Objects: doors, wall segments, other
 - Built from geometric primitives (line segments)
 - More complex objects can be generated from existing ones via physical aggregation
► Relations
 - Spatial
 - Appearance-based

Slide 15: Inference: Labeling Objects
► Gibbs sampling
 - Assign a random label to each line segment
 - At each MCMC step, update the label of some object by sampling from its conditional distribution
► When the label of an object k is changed, the cliques and parameters of objects involving k must be updated
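A minimal sketch of this Gibbs-sampling loop, assuming a hypothetical score function that rebuilds the affected cliques and evaluates the CRF potentials for a proposed labeling; this is an illustration under those assumptions, not the authors' implementation:

```python
import random

LABELS = ["wall", "door", "other"]

def conditional_dist(idx, labels, score_fn):
    """Unnormalized conditional distribution over LABELS for object idx,
    holding all other labels fixed. score_fn is a hypothetical stand-in
    that re-evaluates the CRF potentials (and affected cliques) for a
    complete labeling."""
    scores = []
    for lab in LABELS:
        proposal = labels[:idx] + [lab] + labels[idx + 1:]
        scores.append(score_fn(proposal))
    total = sum(scores)
    return [s / total for s in scores]

def gibbs_label_objects(num_objects, score_fn, num_sweeps=100, seed=0):
    rng = random.Random(seed)
    # Start from a random label for each line segment.
    labels = [rng.choice(LABELS) for _ in range(num_objects)]
    for _ in range(num_sweeps):
        for i in range(num_objects):
            # Resample the label of object i from its conditional distribution.
            probs = conditional_dist(i, labels, score_fn)
            labels[i] = rng.choices(LABELS, weights=probs, k=1)[0]
    return labels
```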

Slide 16: Inference: Labeling Objects
► Goal: estimate the labels (types) of objects
► Complication: the clique structure changes based on the label of an object (physical aggregation)

Slide 17: Experimental Setup
► Maps of five different environments: one of the Allen Center (UW) and four from Radish (the Robotics Data Set Repository)
► Two to three hallways per environment; line segments labeled by hand
► Five-fold cross-validation (i.e., train on the hallways of four environments and test on the hallways of the fifth)

Slide 18: Sample Maps

Slide 19: Results: Confusion Matrix

Truth   | Inferred: Wall | Inferred: Door | Inferred: Other
Wall    | 221 (94%)      | 5 (2%)         | 8 (4%)
Door    | 1 (1%)         | 122 (85%)      | 21 (14%)
Other   | 10 (9%)        | 12 (10%)       | 93 (81%)

Slide 20: Features
► Local
 - Segment length
► Neighborhood
 - Door-door, wall-door
► Spatial
 - Door indentation, alignment with wall
► Global
 - Variance of door widths within a hallway

Slide 21: Typical Results

Slide 22: Results: Average Accuracy Rates
(Table: accuracy per environment for the feature sets Local, Local + Neighborhood, and All)

Slide 23: Worst Case

Slide 24: Shortcomings
► Works for individual hallways only
► MCMC is inefficient
 - Learning: several hours
 - Labeling: minutes
► Idea: detect objects conditioned on areas (and vice versa)

Slide 25: Overview
► Conditional Random Fields
► Low-level detection of doors and walls
► High-level place labeling
► Future work

Slide 26: Place Labeling
► Goal: segment the environment into places
► Place types: room, hallway, doorway, junction, other
► Enables better planning and a natural interface between humans and robots

Goal (figure: map with legend Room, Corridor, Doorway)
Courtesy of Wolfram Burgard

Slide 28: Local Approach Using AdaBoost
► Learn to label individual locations
► Extract laser range-features from the occupancy map (size of area, difference between laser beams, FFTs, axes of ellipse, ...)
► Learn to classify locations using supervised AdaBoost learning

Simple Features
Example features over the laser beams d_i: number of gaps (a gap is a beam difference d > θ), minimum beam length, area, perimeter, sum of beam lengths Σ d_i
Courtesy of Wolfram Burgard
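A rough sketch of how a few such beam features could be computed; the beam-to-point conversion, the gap threshold theta, and the exact feature set are illustrative assumptions, not the original feature code:

```python
import math

def laser_features(beams, theta=0.5):
    """Simple geometric features from one 360-degree laser scan.
    beams: list of range readings d_i, assumed equally spaced in angle;
    theta: gap threshold (an assumed default, not taken from the slides)."""
    n = len(beams)
    angles = [2 * math.pi * i / n for i in range(n)]
    pts = [(d * math.cos(a), d * math.sin(a)) for d, a in zip(beams, angles)]

    # Number of gaps: consecutive beams whose difference exceeds theta.
    gaps = sum(1 for i in range(n) if abs(beams[(i + 1) % n] - beams[i]) > theta)

    # Area (shoelace formula) and perimeter of the scan polygon.
    area = 0.5 * abs(sum(pts[i][0] * pts[(i + 1) % n][1]
                         - pts[(i + 1) % n][0] * pts[i][1] for i in range(n)))
    perimeter = sum(math.dist(pts[i], pts[(i + 1) % n]) for i in range(n))

    return {"num_gaps": gaps,
            "min_beam": min(beams),
            "sum_beams": sum(beams),
            "area": area,
            "perimeter": perimeter}
```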

Combining Features
Observation: there are many simple features f_i.
Problem: each single feature f_i gives poor classification rates.
Solution: combine multiple simple features to form a strong classifier using AdaBoost.
Courtesy of Wolfram Burgard

SA-1 Key Idea of AdaBoost observation observation N BOOSTINGBOOSTING w1h1w1h1. w T h T.. Σ Strong binary classifier H using weak hypotheses h j {1,0} θ Courtesy of Wolfram Burgard

Example Experiment (Building 079, Univ. of Freiburg)
Training map (top) and test map (bottom), labeled into Room, Corridor, Doorway
Classification accuracy: 93.94%
Courtesy of Wolfram Burgard

Slide 33: Voronoi Random Fields
► The local approach does not take neighborhood relations between locations into account
► Neighborhood defined via the Voronoi graph
► Idea: label points on the Voronoi graph using a CRF

Slide 34: Voronoi Random Fields

Slide 35: Features
► Spatial
 - Scan-based [Martinez-Mozos et al. 04]
 - Voronoi graph-based
► Connectivity via the Voronoi graph
 - Type of neighbors
 - Number of neighbors
 - Size of loop

Slide 36: Learning
► Learn decision stumps using AdaBoost
► Feed the decision stumps as binary features into the CRF
► Learn the CRF weights using pseudo-likelihood
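Pseudo-likelihood replaces the full conditional likelihood by a product of per-node conditionals given each node's Markov blanket MB(y_i), which avoids computing the partition function Z(x) during training:

```latex
PL(\mathbf{w}) = \prod_{i} p\big(y_i \mid \mathrm{MB}(y_i), \mathbf{x}, \mathbf{w}\big)
```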

Slide 37: Maps

Slide 38: Maps

Slide 39: Maps

Slide 40: Maps

Slide 41: Experimental Results
► Leave-one-out cross-validation on 4 maps
► Accuracy: percentage of correctly labeled points

Slide 42: Place Labels Induced by VRF

Slide 43: Topological Map Induced by VRF

Slide 44: VRF vs. AdaBoost (map comparison)

Slide 45: VRF vs. AdaBoost (map comparison)

Slide 46: Experimental Results: Edit Distance
► Consistency:
 - Pick a pair of points
 - Compute the shortest path between them
 - Compare the place sequence along the path to ground truth using edit distance
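The comparison could use a standard Levenshtein edit distance between the inferred and ground-truth place sequences along the path; a small sketch follows (the sequence representation is an assumption for illustration):

```python
def edit_distance(inferred, truth):
    """Levenshtein distance between two place-label sequences,
    e.g. ["hallway", "doorway", "room"] vs. ["hallway", "room"]."""
    m, n = len(inferred), len(truth)
    # dp[i][j] = edit distance between inferred[:i] and truth[:j].
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i
    for j in range(n + 1):
        dp[0][j] = j
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if inferred[i - 1] == truth[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,         # deletion
                           dp[i][j - 1] + 1,         # insertion
                           dp[i - 1][j - 1] + cost)  # substitution
    return dp[m][n]
```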

Slide 47: Conclusions
► First steps toward object/place maps
► CRFs provide a powerful and flexible framework for learning and inference
► Relational Markov networks provide a language for reasoning about objects and CRF structures

Slide 48: Next Steps
► Joint place labeling and object detection
► Combine low-level and high-level CRFs
 - k-best style inference to find places
 - Label objects conditioned on places
 - Re-evaluate place hypotheses
► Use visual features
► Joint feature and CRF learning