Lecture 9: State Space Gradient Descent, Gibbs Sampler with Simulated Annealing


Slide 1 (title): State Space Gradient Descent, Gibbs Sampler with Simulated Annealing

Slide 2: Continuous Space Gradient Descent

Minimize an energy E(a, b, c) over continuous parameters by stepping downhill along the gradient. The step size has to be small: 0.01 might not work, while a smaller step begins to work.
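A minimal Python sketch of the step-size point. The quadratic energy below is made up (the lecture's E(a, b, c) is not shown on the slide); with this particular energy a step of 0.01 stalls because it oscillates along the stiffest direction, while a ten-times-smaller step settles near the minimum.

import numpy as np

TARGET = np.array([1.0, -2.0, 0.5])
SCALE = np.array([1.0, 10.0, 100.0])

def E(p):
    # Hypothetical smooth energy over p = (a, b, c); a stand-in for the
    # lecture's E(a, b, c), which the slide does not spell out.
    return np.sum(SCALE * (p - TARGET) ** 2)

def grad_E(p):
    # Analytic gradient of the quadratic above.
    return 2.0 * SCALE * (p - TARGET)

def run(step, iters=200):
    p = np.zeros(3)
    for _ in range(iters):
        p = p - step * grad_E(p)   # gradient-descent update with the given step size
    return E(p)

for step in (0.01, 0.001):
    print(f"step size {step}: final E = {run(step):.3g}")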

Slide 3: State Space Gradient Descent

For each pixel (x, y):
    For each possible state S:
        if F(x, y) = 0, E_0 = ...
        if F(x, y) = 1, E_1 = ...
        ...
        if F(x, y) = 255, E_255 = ...
    Select the state S with minimum E
    Let F(x, y) = S
Repeat until there is no change in E

Slide 4: State Space Gradient Descent

[Figure: the initial labeling F_0(x, y), set at RANDOM, and the updated labeling F_1(x, y).]

Slide 5: Energy means Global Energy

[Diagram: pixel (x, y) with its neighbors (x-1, y) and (x, y-1).]

A change at (x, y) means a change in the E of (x, y), (x-1, y), and (x, y-1). Clique: the neighborhood affected by the change.

Slide 6: State Space Gradient Descent

[Diagram: the pixels involved when (x, y) changes: (x, y-1), (x-1, y), (x-1, y-1), (x, y).]

Slide 7: State Space Gradient Descent

[Figure: the sequence of labelings F_0(x, y), F_1(x, y), ..., F_t(x, y) and their energies E_0(x, y), E_1(x, y), ...]

Slide 8: State Space Gradient Descent

Greedy: always takes the best step. Problem: it gets stuck at a local minimum.

Slide 9: Gradient Descent Algorithm

1. Initialize F_0(x, y) = random.
2. For each pixel (x, y):
       For each state S:
           if F(x, y) = 0, E_0 = ...
           if F(x, y) = 1, E_1 = ...
           ...
           if F(x, y) = 255, E_255 = ...
       Choose the state S with minimum E and set F(x, y) = S.
3. Repeat step 2 until E is stable (no longer decreases).
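A Python sketch of the algorithm of slides 3 and 9. The energy is an assumption for illustration: a squared data term against an observed image G plus a quadratic smoothness term with the four neighbors (the clique idea of slide 5); the lecture's exact energy is not reproduced here.

import numpy as np

def local_energy(F, G, x, y, s, lam=1.0):
    # Hypothetical energy of the cliques containing pixel (x, y) when its
    # state is s: data term plus smoothness with the four neighbors.
    e = (float(s) - float(G[x, y])) ** 2
    for dx, dy in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < F.shape[0] and 0 <= ny < F.shape[1]:
            e += lam * (float(s) - float(F[nx, ny])) ** 2
    return e

def state_space_gradient_descent(G, states=range(256), max_sweeps=50):
    rng = np.random.default_rng(0)
    F = rng.integers(0, 256, size=G.shape)            # 1. F_0(x, y) = random
    for _ in range(max_sweeps):                       # 3. repeat step 2 ...
        changed = False
        for x in range(G.shape[0]):                   # 2. for each pixel (x, y)
            for y in range(G.shape[1]):
                E = [local_energy(F, G, x, y, s) for s in states]
                best = states[int(np.argmin(E))]      #    state with minimum E
                if best != F[x, y]:
                    F[x, y] = best
                    changed = True
        if not changed:                               # ... until E is stable
            break
    return F

Each update only looks at the cliques containing (x, y) (slide 5), so the global energy never increases; like any greedy scheme, it can stop at a local minimum (slide 8).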

Slide 10: Gibbs Sampler

1. Start with the temperature T high.
2. Initialize F_0(x, y) = random.
3. For each pixel (x, y):
       For each state S:
           if F(x, y) = 0, E_0 = ..., P_0 = ...
           if F(x, y) = 1, E_1 = ..., P_1 = ...
           ...
           if F(x, y) = 255, E_255 = ..., P_255 = ...
       Normalize: P_i = P_i / sum_i(P_i).
4. Sample a state S from the pdf P_i and set F(x, y) = S.
5. Reduce T: T = T * (cooling factor).
6. Repeat steps 3-5 while E is not stable.
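A Python sketch of the annealed Gibbs sampler above, using P_i proportional to exp(-E_i / T) as on slide 12. The local energy, the starting temperature, and the cooling factor are all illustrative assumptions (the slide does not give the cooling factor).

import numpy as np

def local_energy(F, G, x, y, s, lam=1.0):
    # Same kind of hypothetical clique energy as in the previous sketch.
    e = (float(s) - float(G[x, y])) ** 2
    for dx, dy in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        nx, ny = x + dx, y + dy
        if 0 <= nx < F.shape[0] and 0 <= ny < F.shape[1]:
            e += lam * (float(s) - float(F[nx, ny])) ** 2
    return e

def gibbs_with_annealing(G, states=np.arange(256), T0=10.0, cooling=0.95, sweeps=100):
    rng = np.random.default_rng(0)
    T = T0                                            # 1. start with T high
    F = rng.integers(0, 256, size=G.shape)            # 2. F_0(x, y) = random
    for _ in range(sweeps):                           # 6. repeat steps 3-5
        for x in range(G.shape[0]):                   # 3. for each pixel (x, y)
            for y in range(G.shape[1]):
                E = np.array([local_energy(F, G, x, y, s) for s in states])
                P = np.exp(-(E - E.min()) / T)        #    P_i proportional to exp(-E_i / T)
                P /= P.sum()                          #    normalize to a pdf
                F[x, y] = rng.choice(states, p=P)     # 4. sample the new state from P
        T *= cooling                                  # 5. reduce the temperature
    return F

The shift by E.min() before exponentiating only avoids underflow; it cancels in the normalization. At high T the P_i are nearly uniform and the update is close to a random walk; as T shrinks, the mass concentrates on the minimum-energy state and the update approaches the greedy step of slide 9 (see slide 13).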

Slide 11: Gibbs Sampler

Three issues:
1) From E, how do we get P?
2) How do we sample from the pdf?
3) Why reduce T, and what is T for?

Slide 12: Gibbs Sampler: Find P

P_i is proportional to exp(-E_i / T): lower energy means higher probability, and dividing by the sum over all states is the normalizing factor that turns the weights into a pdf. To sample a state, draw Rand(0..1) = .43 and take the state whose cumulative-probability interval contains the draw, e.g. F(x, y) = 3.
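A small sketch of both steps on this slide. The energies below are made up; with these values, the exp(-E_i / T) weights, the normalizing factor, and the cumulative-distribution lookup for the draw 0.43 happen to land on state 3, as in the slide's example.

import numpy as np

def energies_to_pdf(E, T):
    # P_i proportional to exp(-E_i / T); dividing by the sum is the normalizing factor.
    w = np.exp(-np.asarray(E, dtype=float) / T)
    return w / w.sum()

def sample_state(P, u):
    # Walk the cumulative distribution and return the first state whose
    # cumulative probability exceeds the uniform draw u.
    return int(np.searchsorted(np.cumsum(P), u))

E = [5.0, 4.0, 3.0, 1.0, 2.0]      # hypothetical energies for states 0..4
P = energies_to_pdf(E, T=1.0)
print(np.round(P, 3))              # -> [0.012 0.032 0.086 0.637 0.234]
print(sample_state(P, u=0.43))     # Rand(0..1) = .43 falls in state 3's interval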

Slide 13: Gibbs Sampler: Why T?

T high: the sampler behaves like a random walk.
T low (approaching 0): it behaves like gradient descent.

Slide 14: Gibbs Sampler: Why T?

Analyze three states with E(0) = 2, E(1) = 3, E(2) = 4.

T = 100:  exp(-E/T) = .98, .97, .96                          normalized P = .33, .33, .33
T = 10:   exp(-E/T) = .82, .74, .67                          normalized P = .36, .33, .31
T = 0.1:  exp(-E/T) = 2.06*10^-9, 9.3*10^-14, 4.2*10^-18     normalized P = .999, .00001, ~2*10^-9

T controls the shape of the pdf: high T makes it nearly uniform, low T concentrates it on the lowest-energy state.
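The table can be reproduced with a few lines, assuming P_i proportional to exp(-E_i / T) as on slide 12; the third row only comes out as listed if its temperature is T = 0.1, and the printed values match the slide's up to rounding.

import numpy as np

E = np.array([2.0, 3.0, 4.0])                 # E(0), E(1), E(2) from the slide
for T in (100.0, 10.0, 0.1):
    w = np.exp(-E / T)                        # unnormalized exp(-E/T)
    P = w / w.sum()                           # normalized probabilities
    print(f"T = {T:g}: exp(-E/T) =", ["%.3g" % v for v in w],
          " P =", ["%.3g" % v for v in P])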

Slide 15: Why does slowly reducing T lead to the global minimum?
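The slide poses this as a question; one way to build intuition is a toy experiment. The sketch below uses a made-up one-dimensional energy with a shallow local minimum and a deeper global one, and simple Metropolis-style neighbor moves (not the full Gibbs resampling of slide 10), so the only thing being compared is the cooling rate. In runs of this sketch the slowly cooled chains end at the global minimum far more often than the quickly cooled ones; the formal guarantee requires an even slower, logarithmically decreasing temperature schedule.

import numpy as np

def energy(s):
    # Made-up 1-D energy on states 0..60: a shallow local minimum at s = 10
    # (E = 1) and a deeper global minimum at s = 50 (E = 0).
    return min((s - 10) ** 2 / 100 + 1, (s - 50) ** 2 / 100)

def anneal(cooling, steps=10_000, T0=5.0, rng=None):
    if rng is None:
        rng = np.random.default_rng()
    s, T = 10, T0                                         # start inside the local minimum
    for _ in range(steps):
        s_new = min(60, max(0, s + rng.choice((-1, 1))))  # propose a neighboring state
        dE = energy(s_new) - energy(s)
        if dE <= 0 or rng.random() < np.exp(-dE / T):     # Metropolis acceptance rule
            s = s_new
        T *= cooling                                      # geometric cooling
    return s

rng = np.random.default_rng(0)
for label, cooling in (("fast cooling 0.99", 0.99), ("slow cooling 0.9995", 0.9995)):
    finals = [anneal(cooling, rng=rng) for _ in range(50)]
    frac = np.mean([abs(f - 50) <= 2 for f in finals])
    print(f"{label}: fraction of runs ending near the global minimum = {frac:.2f}")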