Mean-Field Theory and Its Applications in Computer Vision

Vibhav Vineet, Jonathan Warrell, Paul Sturgess, Philip H.S. Torr
Improved Initialisation and Gaussian Mixture Pairwise Terms for Dense Random Fields
Presentation transcript:


Problem Formulation: Grid CRF vs. Dense CRF construction. A grid CRF leads to over-smoothing around object boundaries, whereas a dense CRF is able to recover fine boundaries.
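For reference, the energy both constructions minimise can be written in the standard dense-CRF form, with unary potentials over single pixels and pairwise potentials over pixel pairs:

```latex
E(\mathbf{x}) = \sum_i \psi_u(x_i) + \sum_{i<j} \psi_p(x_i, x_j),
\qquad
\psi_p(x_i, x_j) = \mu(x_i, x_j) \sum_m w^{(m)} k^{(m)}(\mathbf{f}_i, \mathbf{f}_j)
```

In the grid CRF the pairwise sum runs only over adjacent pixels; in the dense CRF it runs over all pixel pairs, which is what enables the long-range interactions discussed next.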

Long-Range Interaction: long-range interactions make it possible to recover the proper flow for objects in optical flow and stereo reconstruction; for example, the arms in the Teddy image are recovered using global interactions where local interactions fail. (Figure: input image, local interaction, global interaction, ground truth.)

Marginal Update: the marginal update over a large neighbourhood contains a very expensive step, O(n²).
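A minimal NumPy sketch of why the naive update is O(n²): every pixel's message accumulates a kernel-weighted sum over every other pixel. All names here are illustrative, and the kernel is a plain Gaussian over per-pixel feature vectors.

```python
import numpy as np

def naive_mean_field_update(Q, feats, mu, unary):
    """One naive mean-field update for a dense CRF.

    Q:     (n, L) current marginals
    feats: (n, d) per-pixel feature vectors
    mu:    (L, L) label compatibility function
    unary: (n, L) unary potentials
    The pairwise message requires O(n^2) kernel evaluations.
    """
    # Gaussian kernel between every pair of pixels: the O(n^2) step
    d2 = ((feats[:, None, :] - feats[None, :, :]) ** 2).sum(-1)
    K = np.exp(-0.5 * d2)
    np.fill_diagonal(K, 0.0)           # exclude self-interaction (j != i)
    msg = K @ Q                        # accumulate Q_j weighted by k(f_i, f_j)
    energy = unary + msg @ mu.T        # apply the label compatibility
    Qn = np.exp(-energy)
    return Qn / Qn.sum(axis=1, keepdims=True)  # normalise per pixel
```

Even on modest images (n in the hundreds of thousands), materialising the n-by-n kernel as done here is infeasible, which motivates the filtering view below.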

Inference in Dense CRF: time complexity increases with neighbourhood size. MCMC takes 36 hours on 50K variables, and graph-cuts based algorithms take hours, which is not practical for vision applications. Filter-based mean-field inference takes 0.2 seconds, opening the possibility of many exciting vision applications.

Efficient Inference: assume a Gaussian pairwise weight together with a label compatibility function. The Gaussian pairwise weight is a mixture of Gaussians, with a spatial component and a bilateral component.
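A sketch of that mixture-of-Gaussians pairwise weight, assuming the common dense-CRF parameterisation: an appearance (bilateral) kernel over position and colour plus a smoothness (spatial) kernel over position only. The parameter names and default values are illustrative.

```python
import numpy as np

def dense_crf_kernel(pos_i, pos_j, rgb_i, rgb_j,
                     w1=1.0, w2=1.0,
                     theta_alpha=60.0, theta_beta=20.0, theta_gamma=3.0):
    """Pairwise weight k(f_i, f_j) as a mixture of two Gaussians.

    bilateral: nearby pixels with similar colour interact strongly
    spatial:   nearby pixels interact regardless of colour
    """
    dp2 = np.sum((pos_i - pos_j) ** 2)   # squared position distance
    dc2 = np.sum((rgb_i - rgb_j) ** 2)   # squared colour distance
    bilateral = w1 * np.exp(-dp2 / (2 * theta_alpha ** 2)
                            - dc2 / (2 * theta_beta ** 2))
    spatial = w2 * np.exp(-dp2 / (2 * theta_gamma ** 2))
    return bilateral + spatial
```

The bilateral component is what lets the dense CRF align labels with image edges, since colour dissimilarity suppresses the interaction across boundaries.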

Bilateral filter: input and output examples, reproduced from [Durand 02].
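To make the bilateral filter concrete, here is a brute-force NumPy version for a grayscale image: each output pixel is a weighted mean of its neighbours, where the weight combines spatial closeness and intensity similarity, so smoothing stops at edges. This is a didactic sketch, not an efficient implementation.

```python
import numpy as np

def bilateral_filter(img, sigma_s=2.0, sigma_r=0.1):
    """Brute-force bilateral filter on a 2-D grayscale image in [0, 1]."""
    h, w = img.shape
    rad = int(3 * sigma_s)               # truncate the spatial Gaussian
    out = np.empty_like(img, dtype=float)
    for y in range(h):
        for x in range(w):
            y0, y1 = max(0, y - rad), min(h, y + rad + 1)
            x0, x1 = max(0, x - rad), min(w, x + rad + 1)
            patch = img[y0:y1, x0:x1]
            yy, xx = np.mgrid[y0:y1, x0:x1]
            ws = np.exp(-((yy - y) ** 2 + (xx - x) ** 2) / (2 * sigma_s ** 2))
            wr = np.exp(-(patch - img[y, x]) ** 2 / (2 * sigma_r ** 2))
            wgt = ws * wr                # spatial weight * range weight
            out[y, x] = (wgt * patch).sum() / wgt.sum()
    return out
```

The range weight `wr` is exactly the mechanism the slides rely on: it is why bilateral-style filtering respects object boundaries while still averaging over a large spatial support.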

Marginal Update: assume a Gaussian pairwise weight.
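Under this assumption the marginal update takes the standard mean-field form for dense CRFs:

```latex
Q_i(x_i = l) \propto \exp\Big( -\psi_u(x_i = l)
  \;-\; \sum_{l'} \mu(l, l') \sum_m w^{(m)} \sum_{j \neq i}
  k^{(m)}(\mathbf{f}_i, \mathbf{f}_j)\, Q_j(l') \Big)
```

The inner sum over all j ≠ i is the expensive O(n²) step highlighted on the next slide.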

How does it work? The marginal update still contains the very expensive O(n²) step.

Message passing from all Xj to all Xi: accumulate weights from all other pixels, excluding the pixel itself. This can be converted into a Gaussian filtering step.
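The slide's point can be sketched as follows: for a purely spatial Gaussian kernel, the O(n²) sum over all j ≠ i equals blurring each label's marginal map with a separable Gaussian (linear cost per label) and then subtracting each pixel's own contribution. This is a simplification: the actual dense-CRF kernels are high-dimensional (bilateral), and the real speed-ups use approximate high-dimensional filtering, but the self-subtraction trick is the same.

```python
import numpy as np

def _gauss_kernel(sigma, truncate=3.0):
    r = int(truncate * sigma + 0.5)
    x = np.arange(-r, r + 1)
    return np.exp(-x ** 2 / (2 * sigma ** 2))   # unnormalised, k(0) = 1

def message_passing(Q, sigma=2.0):
    """sum_{j != i} k(p_i, p_j) Q_j(l) for a spatial Gaussian kernel.

    Q: (H, W, L) marginals on a 2-D pixel grid.
    Filters each label map separably (rows then columns), then removes
    the self-interaction k(0) * Q_i = Q_i so the sum excludes j = i.
    """
    k = _gauss_kernel(sigma)
    msg = np.empty_like(Q)
    for l in range(Q.shape[-1]):
        tmp = np.apply_along_axis(np.convolve, 1, Q[:, :, l], k, mode='same')
        tmp = np.apply_along_axis(np.convolve, 0, tmp, k, mode='same')
        msg[:, :, l] = tmp - Q[:, :, l]          # subtract self-contribution
    return msg
```

Separable filtering turns the quadratic pairwise sum into two 1-D passes per label map, which is the core of why filter-based mean-field inference runs in a fraction of a second.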

Efficient Filtering Steps: we now discuss how to perform the filtering step efficiently.