Presentation transcript:

Basic Steps
1. Compute the x and y image derivatives.
2. Classify each derivative as being caused by either shading or a reflectance change.
3. Set derivatives with the wrong label to zero.
4. Recover the intrinsic images by finding the least-squares solution of the derivatives.

[Figures: original image, x-derivative image, and the per-derivative classification (white = reflectance)]
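A minimal sketch of steps 3–4, assuming the derivatives are finite differences of a log-intensity image and using a generic sparse least-squares solver; all function and variable names here are hypothetical, not taken from the original system.

```python
import numpy as np
import scipy.sparse as sp
from scipy.sparse.linalg import lsqr

def recover_log_image(gx, gy, keep_x, keep_y):
    """Step 3: zero out derivatives with the wrong label (keep_* masks).
    Step 4: find the least-squares image whose finite differences match
    the kept derivatives. gx is (h, w-1), gy is (h-1, w)."""
    h, w = gy.shape[0] + 1, gx.shape[1] + 1
    n = h * w
    idx = np.arange(n).reshape(h, w)
    gx = np.where(keep_x, gx, 0.0)
    gy = np.where(keep_y, gy, 0.0)

    def diff_op(src, dst):
        # Sparse finite-difference operator: (dst pixel) - (src pixel).
        m = len(src)
        rows = np.tile(np.arange(m), 2)
        cols = np.concatenate([src, dst])
        data = np.concatenate([-np.ones(m), np.ones(m)])
        return sp.csr_matrix((data, (rows, cols)), shape=(m, n))

    Dx = diff_op(idx[:, :-1].ravel(), idx[:, 1:].ravel())
    Dy = diff_op(idx[:-1, :].ravel(), idx[1:, :].ravel())
    A = sp.vstack([Dx, Dy]).tocsr()
    b = np.concatenate([gx.ravel(), gy.ravel()])
    # The solution is only defined up to an additive constant.
    return lsqr(A, b)[0].reshape(h, w)

# Round-trip check on a random image: recovered up to a constant offset.
rng = np.random.default_rng(0)
img = rng.random((32, 32))
rec = recover_log_image(np.diff(img, axis=1), np.diff(img, axis=0),
                        np.ones((32, 31), bool), np.ones((31, 32), bool))
```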

Learning the Classifiers
– Combine multiple weak classifiers into a strong classifier using AdaBoost (Freund and Schapire).
– Choose weak classifiers greedily, similar to (Tieu and Viola 2000).
– Train on synthetic images.
– Assume the light direction is from the right.

[Figures: shading training set and reflectance-change training set]
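For concreteness, a hedged sketch of this recipe using scikit-learn's AdaBoost over decision stumps; the synthetic features and labels below are stand-ins for the filter responses the real classifiers see.

```python
import numpy as np
from sklearn.ensemble import AdaBoostClassifier  # default base learner: depth-1 stumps

rng = np.random.default_rng(0)
X_train = rng.normal(size=(2000, 16))                    # stand-in patch features
y_train = (X_train[:, :4].sum(axis=1) > 0).astype(int)   # 1 = reflectance change

# AdaBoost greedily adds the weak classifier that best fixes the current
# weighted errors, then reweights the training examples and repeats.
clf = AdaBoostClassifier(n_estimators=50)
clf.fit(X_train, y_train)
derivative_labels = clf.predict(X_train[:10])
```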

Using Both Color and Gray-Scale Information

[Figure: results without considering gray-scale information]

Some Areas of the Image Are Locally Ambiguous

[Figures: input, shading, and reflectance images. The slide asks: is the change here better explained as shading or as a reflectance change? The two candidate explanations appear only as images on the slide.]

Propagating Information
Ambiguous areas can be disambiguated by propagating information from reliable areas of the image into ambiguous ones.

Propagating Information
– Consider the relationship between neighboring derivatives.
– Use Generalized Belief Propagation to infer the labels.

Setting Compatibilities
– Set compatibilities according to image contours.
– All derivatives along a contour should have the same label; derivatives along an image contour strongly influence each other.

[The slide shows the pairwise compatibility matrices, parameterized by β, for the off-contour and on-contour cases]
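For binary shading/reflectance labels, a standard way to encode this (sketched below; the slide's actual β values did not survive transcription, so the numbers are assumed) is a 2×2 matrix giving probability β that neighboring derivatives agree:

```python
import numpy as np

def compatibility(beta):
    """psi(x_i, x_j) for binary labels: weight beta when the two
    neighboring derivatives take the same label, 1 - beta otherwise."""
    return np.array([[beta, 1.0 - beta],
                     [1.0 - beta, beta]])

psi_generic = compatibility(0.5)   # off-contour: no preference
psi_contour = compatibility(0.99)  # along a contour: strong agreement (value assumed)
```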

Improvements Using Propagation

[Figures: input image; reflectance image with propagation; reflectance image without propagation]

More Results

[Figures: input image, shading image, reflectance image]

Summary
Belief propagation is a feasible way to do inference in some Markov Random Fields. We showed applications of this approach to a number of low-level vision problems, including super-resolution, motion analysis, and shading/reflectance discrimination.

Inference in Markov Random Fields
– Gibbs sampling, simulated annealing
– Iterated conditional modes (ICM)
– Belief propagation
  Application examples: super-resolution, motion analysis, shading/reflectance separation
– Graph cuts
– Variational methods

Gibbs Sampling and Simulated Annealing
– Gibbs sampling: a way to generate random samples from a (potentially very complicated) probability distribution. Fix all dimensions except one; draw from the resulting 1-d conditional distribution. Repeat for all dimensions, and repeat many times.
– Simulated annealing: a schedule for modifying the probability distribution so that, at zero temperature, you draw samples only from the MAP solution.
Reference: Geman and Geman, IEEE PAMI 1984.
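A minimal sketch of both ideas on a toy Ising-style model (binary image denoising); the potentials, parameters, and geometric cooling schedule are all illustrative assumptions, not from the lecture.

```python
import numpy as np

rng = np.random.default_rng(0)

def gibbs_anneal(y, beta=1.5, lam=2.0, sweeps=200, T0=4.0, T_end=0.05):
    """y: noisy +/-1 image. Each pixel is redrawn from its 1-d conditional
    (all other pixels fixed); the temperature is lowered toward the MAP."""
    h, w = y.shape
    x = y.copy()
    for s in range(sweeps):
        T = T0 * (T_end / T0) ** (s / (sweeps - 1))   # geometric cooling
        for i in range(h):
            for j in range(w):
                nb = sum(x[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                         if 0 <= a < h and 0 <= b < w)
                # Energy difference E(x_ij=+1) - E(x_ij=-1) under the Ising model.
                dE = -2.0 * (beta * nb + lam * y[i, j])
                p_plus = 1.0 / (1.0 + np.exp(np.clip(dE / T, -500, 500)))
                x[i, j] = 1.0 if rng.random() < p_plus else -1.0
    return x

truth = np.ones((16, 16)); truth[:, 8:] = -1.0
y = np.where(rng.random(truth.shape) < 0.8, truth, -truth)  # 20% flipped pixels
x_map = gibbs_anneal(y)
```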

Sampling from a 1-d Function
1. Discretize the density function.
2. Compute the distribution function F from the density function.
3. Sample: draw u ~ U(0,1); for k = 1 to n, if F(k) ≥ u, break; return the value in bin k.
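The same three steps in code (a sketch; the Gaussian density and grid are placeholders):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_1d(density, xs, n_samples=1):
    p = density(xs)
    p = p / p.sum()                       # 1. discretized density
    F = np.cumsum(p)                      # 2. distribution function
    u = rng.random(n_samples)             # 3. draw u ~ U(0,1) ...
    k = np.searchsorted(F, u)             # ... first k with F(k) >= u
    return xs[np.minimum(k, len(xs) - 1)]

xs = np.linspace(-3.0, 3.0, 1000)
samples = sample_1d(lambda x: np.exp(-0.5 * x**2), xs, n_samples=5)
```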

Gibbs Sampling

[Figure: a 2-d distribution with the sampler alternating moves along the x1 and x2 axes. Slide by Ce Liu]

Gibbs Sampling and Simulated Annealing
Simulated annealing: you gradually lower the temperature of the probability distribution, ultimately giving zero probability to all but the MAP estimate.
– What's good about it: finds the global MAP solution.
– What's bad about it: takes forever. Gibbs sampling is in the inner loop…

Gibbs Sampling and Simulated Annealing
So you can find the mean value (the MMSE estimate) of a variable by doing Gibbs sampling and averaging over the values that come out of your sampler. You can find the MAP value of a variable by doing Gibbs sampling and gradually lowering the temperature parameter to zero.
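In symbols, the two estimators the slide describes are (a standard formulation, stated here for concreteness rather than taken verbatim from the deck):

```latex
\hat{x}_{\mathrm{MMSE}} = \mathbb{E}[x \mid y] \approx \frac{1}{N} \sum_{t=1}^{N} x^{(t)},
\qquad x^{(t)} \sim p(x \mid y) \ \text{(Gibbs samples)}

\hat{x}_{\mathrm{MAP}} = \arg\max_{x}\, p(x \mid y),
\qquad \text{reached by sampling } p(x \mid y)^{1/T} \text{ as } T \to 0
```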

Inference in Markov Random Fields
– Gibbs sampling, simulated annealing
– Iterated conditional modes (ICM)
– Belief propagation
  Application examples: super-resolution, motion analysis, shading/reflectance separation
– Graph cuts
– Variational methods

Iterated Conditional Modes (ICM)
For each node:
– Condition on all the neighbors.
– Find the mode.
– Repeat.
Compare with Gibbs sampling: ICM only finds a configuration that is a local maximum over a very small region.
Introduced by Besag in 1986; described in Winkler, 1995.
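A matching sketch of ICM on the same toy Ising-style model as the Gibbs example (names and parameters assumed, as before): each visit replaces a pixel by the mode of its 1-d conditional, so every update is greedy and deterministic.

```python
import numpy as np

def icm(y, beta=1.5, lam=2.0, sweeps=10):
    """y: noisy +/-1 image. Each node is set to the mode of its conditional
    given the current neighbors -- no sampling, no temperature."""
    h, w = y.shape
    x = y.copy()
    for _ in range(sweeps):
        for i in range(h):
            for j in range(w):
                nb = sum(x[a, b] for a, b in ((i-1, j), (i+1, j), (i, j-1), (i, j+1))
                         if 0 <= a < h and 0 <= b < w)
                # Mode of the 1-d conditional: the sign that lowers the energy.
                x[i, j] = 1.0 if beta * nb + lam * y[i, j] >= 0 else -1.0
    return x

rng = np.random.default_rng(0)
truth = np.ones((16, 16)); truth[:, 8:] = -1.0
y = np.where(rng.random(truth.shape) < 0.8, truth, -truth)
x_hat = icm(y)  # converges quickly, but only to a nearby local maximum
```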

[Figure from Winkler, 1995]

Region Marginal Probabilities
With local evidence $\phi$, pairwise compatibilities $\psi$, and messages $m$, the region marginal probabilities are

$$b_i(x_i) = \frac{1}{Z}\,\phi_i(x_i)\prod_{j \in N(i)} m_{ji}(x_i)$$

$$b_{ij}(x_i, x_j) = \frac{1}{Z}\,\psi_{ij}(x_i, x_j)\,\phi_i(x_i)\,\phi_j(x_j)\prod_{k \in N(i)\setminus j} m_{ki}(x_i)\prod_{l \in N(j)\setminus i} m_{lj}(x_j)$$

Belief Propagation Equations
The belief propagation equations come from the marginalization constraints: requiring $\sum_{x_j} b_{ij}(x_i, x_j) = b_i(x_i)$ gives the message update

$$m_{ji}(x_i) = \sum_{x_j} \psi_{ij}(x_i, x_j)\,\phi_j(x_j)\prod_{k \in N(j)\setminus i} m_{kj}(x_j)$$
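A compact sketch of these updates for discrete labels; phi, psi, and the three-node chain below are illustrative stand-ins.

```python
import numpy as np

def update_message(j, i, phi, psi, msgs, nbrs):
    """m_{j->i}(x_i): sum over x_j of psi(x_j, x_i) * phi_j(x_j) times
    all messages into j except the one coming from i."""
    prod = phi[j].copy()
    for k in nbrs[j]:
        if k != i:
            prod = prod * msgs[(k, j)]
    m = psi.T @ prod        # marginalize out x_j
    return m / m.sum()      # normalize for numerical stability

def belief(i, phi, msgs, nbrs):
    """Region marginal b_i(x_i): local evidence times all incoming messages."""
    b = phi[i].copy()
    for j in nbrs[i]:
        b = b * msgs[(j, i)]
    return b / b.sum()

# Three-node chain with binary labels and an attractive compatibility.
nbrs = {0: [1], 1: [0, 2], 2: [1]}
phi = {0: np.array([0.9, 0.1]), 1: np.array([0.5, 0.5]), 2: np.array([0.2, 0.8])}
psi = np.array([[0.8, 0.2], [0.2, 0.8]])
msgs = {(j, i): np.ones(2) / 2 for j in nbrs for i in nbrs[j]}
for _ in range(10):  # exact on a tree after a few sweeps
    msgs = {(j, i): update_message(j, i, phi, psi, msgs, nbrs)
            for (j, i) in msgs}
b1 = belief(1, phi, msgs, nbrs)  # marginal over node 1's two labels
```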