Fast Pattern Matching

Presentation transcript:

Fast Pattern Matching

Fast Pattern Matching: Presentation Plan
–Pattern Matching: Definitions
–Classic Pattern Matching Algorithms
–Fast pattern matching algorithms

Pattern Matching: Definition Pattern Matching: –The problem of locating a specific pattern inside raw data. Pattern: –A collection of strings described in some formal language.

Image Pattern Matching: Definition Pattern Matching: –Finding occurrences of a particular pattern in an image. Pattern: –Typically a 2D image fragment. –Much smaller than the image.

Pattern Matching: Variations Detecting a set of patterns in a single image: –A set of distinct patterns. –A set of transformations of a single pattern. Finding a particular pattern in a set of images: –Example: frames in a video sequence.

Pattern Matching: Main Tasks Search in Spatial Domain: –The pattern may appear in any location in the image. Search in Transformation Domain: –The pattern may be subjected to any transformation (in a transformation group).

Image Similarity Measures Image Similarity Measure: –A function that assigns a nonnegative real value to two given images: d(·,·) ≥ 0. –A small measure → high similarity. –Can be combined with thresholding.

Image Similarity Measure: The Euclidean Distance d_E(I1,I2) = sqrt( Σ_x,y (I1(x,y) − I2(x,y))² ) Pros: –Can be computed fast by convolution. –Allows reduced representations of linear subspaces using Principal Component Analysis (PCA) approaches, since PCA preserves Euclidean distance. Cons: –Not in accord with human perception. –May change drastically when a small transformation is applied.

Pattern Matching: Applications Image Compression Video Compression Optical Character Recognition (OCR) Medical Image Registration Fingerprint Identification …

Pattern Matching: Real Time Applications Robot Vision Vehicle tracking Surveillance Industrial Inspection: –Automatic Part Inspection –Automatic Circuit Inspection

Fast Pattern Matching
–Pattern Matching: Definition
–Classical Pattern Matching Algorithms
–Fast pattern matching algorithms:
  –Search in Spatial Domain
  –Search in Transformation Domain

The “Naïve” Approach Scan the entire image, pixel by pixel. For each pixel, evaluate the similarity between its local neighborhood and the pattern.

Naïve Approach using Euclidean Distance Given: –A k×k pattern P(x,y). –An n×n image I(x,y). For each pixel (x,y), we compute the distance: d²(x,y) = Σ_{i,j=0..k−1} ( I(x+i, y+j) − P(i,j) )² Complexity: O(n²k²).
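
As an illustration, here is a minimal NumPy sketch of the naïve scan; the function and variable names are illustrative and not part of the original slides:

import numpy as np

def naive_ssd_map(image, pattern):
    """Squared Euclidean distance between the k x k pattern and every
    k x k window of an n x n image: O(n^2 * k^2) operations overall."""
    n = image.shape[0]
    k = pattern.shape[0]
    out = np.empty((n - k + 1, n - k + 1))
    for y in range(n - k + 1):
        for x in range(n - k + 1):
            window = image[y:y + k, x:x + k]
            out[y, x] = np.sum((window - pattern) ** 2)
    return out

# Windows whose distance falls below a chosen threshold are reported as matches:
# matches = np.argwhere(naive_ssd_map(image, pattern) < threshold)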

Improvement: FFT Convolution can be applied rapidly using the FFT. Complexity: O(log n) operations per pixel; over all pixels: O(n² log n).
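
A hedged sketch of the FFT speed-up: expanding ||w − p||² = Σw² − 2·(w·p) + Σp² lets the cross term be computed for all windows at once by FFT-based correlation (SciPy is assumed; names are illustrative):

import numpy as np
from scipy.signal import fftconvolve

def ssd_map_fft(image, pattern):
    """SSD between the pattern and all windows via
    ||w - p||^2 = sum(w^2) - 2*(w . p) + sum(p^2)."""
    k = pattern.shape[0]
    # Cross-correlation with the pattern = convolution with the flipped pattern.
    cross = fftconvolve(image, pattern[::-1, ::-1], mode='valid')
    # Sum of squared pixels inside every window (box filter over image^2).
    win_sq = fftconvolve(image ** 2, np.ones((k, k)), mode='valid')
    return win_sq - 2.0 * cross + np.sum(pattern ** 2)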

Naïve and FFT Approaches: Performance
Performance table for a 1024×1024 image, on a 1.8 GHz PC:

                              Naïve       FFT
  Time Complexity             O(n²k²)     O(n² log n)
  Space                       n²          n²
  Integer Arithmetic          Yes         No
  Run time for 16×16 pattern  … Sec       3.5 Sec
  Run time for 32×32 pattern  4.86 Sec    3.5 Sec
  Run time for 64×64 pattern  … Sec       3.5 Sec

Far too long for real-time applications!

Normalized Grayscale Correlation NGC: –A similarity measure, based on a normalized cross-correlation function. –Maps two given images to [0,1] (absolute value). –Identical images are mapped to 1. Given two images I and T, NGC is computed as follows: NGC(I,T) = | Σ (I − mean(I)) (T − mean(T)) | / sqrt( Σ (I − mean(I))² · Σ (T − mean(T))² )

Pattern Matching using NGC: –Create a correlation map: shift the pattern over the image and, for each position, compute NGC(pattern, window). –Find peaks in the correlation map.
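
A direct (unoptimized) NumPy sketch of building the correlation map; the names and the peak threshold are illustrative:

import numpy as np

def ngc_map(image, pattern):
    """Normalized grayscale correlation (absolute value) of the pattern
    against every window; values lie in [0, 1], identical windows give 1."""
    k = pattern.shape[0]
    n = image.shape[0]
    p = pattern - pattern.mean()
    p_norm = np.sqrt(np.sum(p ** 2))
    out = np.zeros((n - k + 1, n - k + 1))
    for y in range(n - k + 1):
        for x in range(n - k + 1):
            w = image[y:y + k, x:x + k]
            w = w - w.mean()
            denom = np.sqrt(np.sum(w ** 2)) * p_norm
            if denom > 0:
                out[y, x] = abs(np.sum(w * p)) / denom
    return out

# Peaks of the map above a threshold (e.g. 0.9) give candidate match locations.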

Pattern Matching using NGC: Evaluation Pros: –Very accurate: by interpolating the correlation map, a pattern can be located to 1/16-pixel accuracy. –Invariant to linear changes in brightness. Cons: –Computationally expensive: matching a k×k pattern with an n×n image takes O(n²k²). –Limited tolerance to variations in scale, orientation and illumination.

Multi Resolution Approach Pattern Matching using a Gaussian Pyramid First suggested by: Burt & Adelson (1984)

Multi Resolution Approach: Scale Invariant Search The same pattern is searched in each pyramid level. Problems: –Computationally expensive. –Limited scale resolution. Adelson, Anderson, Bergen, Burt, Ogden, “Pyramid methods in image processing” (1984)
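
A rough sketch of the multi-resolution idea: build a Gaussian pyramid of the image and correlate the same fixed-size pattern at every level. SciPy's gaussian_filter is used for the blur; the number of levels, the threshold, and the reuse of ngc_map from the sketch above are illustrative choices:

import numpy as np
from scipy.ndimage import gaussian_filter

def gaussian_pyramid(img, levels):
    """Blur-and-subsample pyramid; level 0 is the original image."""
    pyr = [img]
    for _ in range(levels - 1):
        img = gaussian_filter(img, sigma=1.0)[::2, ::2]
        pyr.append(img)
    return pyr

def multiscale_search(image, pattern, levels=4, thresh=0.9):
    """A hit at pyramid level L corresponds to a pattern roughly 2**L times
    larger in the original image, so scale resolution is limited to octaves."""
    hits = []
    for lvl, img in enumerate(gaussian_pyramid(image, levels)):
        if min(img.shape) < max(pattern.shape):
            break
        corr = ngc_map(img, pattern)               # from the NGC sketch above
        for y, x in np.argwhere(corr > thresh):
            hits.append((lvl, y * 2 ** lvl, x * 2 ** lvl))
    return hits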

Fast Pattern Matching
–Pattern Matching: Definition
–Classical Pattern Matching Algorithms
–Fast pattern matching algorithms:
  –Search in Spatial Domain
  –Search in Transformation Domain

Gradient Descent Search in an Image Pyramid A technique for fast search in the spatial domain. Suggested by: MacLean & Tsotsos (2000). MacLean, Tsotsos, “Fast pattern recognition using gradient-descent search in an image pyramid” (2000)

Multi Resolution Schemes: Gradient Descent Search Create a pyramid for the pattern and the image. Compute NGC between the top-level image and pattern. –Find candidates using thresholding. –For each candidate, estimate its location in the next (finer) level. –Refine the location by correlation gradient descent, using the estimated location as a starting point. Iterate the search until the finest level is reached. MacLean, Tsotsos, “Fast pattern recognition using gradient-descent search in an image pyramid” (2000)

Gradient Descent Search: Evaluation Pros: –Full NGC is performed only in top level (small). –The pattern pyramid is created off-line. Cons: –All inherent NGC limitations: Only 8-10% invariance to scale and rotation. MacLean, Tsotsos, “Fast pattern recognition using gradient-descent search in an image pyramid” (2000)

Heuristics: Focus of Attention Assumptions on the location of the pattern within the image. Used for motion estimation in video compression: –Divide a frame into blocks. –In the next frame, search each block within its surrounding area. –Code only the motion vectors for each block.
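
For the video-compression use of focus of attention, a minimal block-matching sketch; the block size, search radius, and SSD as the block distance are assumptions, not from the slides:

import numpy as np

def block_motion(prev, curr, block=16, radius=8):
    """Each block of the current frame is searched only within a small
    neighborhood of its position in the previous frame; the best SSD
    offset becomes that block's motion vector."""
    h, w = curr.shape
    vectors = {}
    for by in range(0, h - block + 1, block):
        for bx in range(0, w - block + 1, block):
            blk = curr[by:by + block, bx:bx + block]
            best, best_dv = np.inf, (0, 0)
            for dy in range(-radius, radius + 1):
                for dx in range(-radius, radius + 1):
                    y, x = by + dy, bx + dx
                    if 0 <= y <= h - block and 0 <= x <= w - block:
                        cand = prev[y:y + block, x:x + block]
                        d = np.sum((blk - cand) ** 2)
                        if d < best:
                            best, best_dv = d, (dy, dx)
            vectors[(by, bx)] = best_dv
    return vectors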

Pattern Matching using Projection Kernels Y. Hel-Or, H. Hel-Or, "Real Time Pattern Matching Using Projection Kernels“ (2002) A new technique for fast search in the spatial domain Suggested by: Y. Hel-Or and H. Hel-Or (2002)

Motivation Euclidean distance as a similarity measure. Fast computation of a lower bound on the distance. Fast rejection of non-matching windows.

Distance Measure in Sub-space Given: –A k×k pattern p. –A k×k image window w. Both w and p can be represented as vectors in R^(k·k), and the distance between them is the Euclidean norm d(p,w) = ||p − w||. Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002)

Distance Measure in Sub-space (cont.) Assume that p and w are not given; however, we know their projection values onto a unit vector u. From the Cauchy-Schwarz inequality: ||p − w||² ≥ ( u^T p − u^T w )² Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002)

Distance Measure in Sub-space (cont.) If we project p and w onto two orthonormal vectors u and v, we can tighten the lower bound: ||p − w||² ≥ ( u^T(p − w) )² + ( v^T(p − w) )² Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002)

Distance Measure in Sub-space (cont.) By projecting onto a set of mutually orthonormal vectors U = {u1, …, ur}: ||p − w||² ≥ Σ_i ( u_i^T(p − w) )² When U forms a full basis, the bound is tight and equals the Euclidean distance. Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002)
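
A small numeric check of this bound; a random orthonormal basis stands in for the projection kernels, and 64 = 8·8 dimensions is an arbitrary choice:

import numpy as np

rng = np.random.default_rng(0)
p = rng.random(64)                          # pattern as a vector in R^(k*k)
w = rng.random(64)                          # window as a vector in R^(k*k)
U = np.linalg.qr(rng.random((64, 64)))[0]   # orthonormal columns

true_sq_dist = np.sum((p - w) ** 2)
lb = 0.0
for i in range(64):
    lb += np.dot(U[:, i], p - w) ** 2       # one more projection tightens the bound
    assert lb <= true_sq_dist + 1e-9        # always a valid lower bound
print(np.isclose(lb, true_sq_dist))         # True: a full basis recovers the distance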

Choosing Projection Kernels A good choice of projection kernels U can expedite the distance calculations: –Projection onto U should be fast to apply. –For natural images, U should have a high probability of being nearly parallel to the vectors p − w, so that the first few kernels capture a large proportion of the distance. Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002)

The Walsh-Hadamard Kernels Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002) The first 2⁸ kernel vectors, displayed in order of increasing spatial frequency.

Projection Kernels: Walsh-Hadamard Properties: –Fast transform calculations: the basis vectors contain only ±1, so transform calculations require only integer additions and subtractions. –The first few vectors capture a high proportion of the image energy. Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002)
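
One way (an assumption, using SciPy's hadamard helper) to generate the ±1 kernels; note that hadamard() returns them in natural order, not the increasing-frequency (sequency) order shown above:

import numpy as np
from scipy.linalg import hadamard

k = 8                            # window side; must be a power of two
H = hadamard(k)                  # k x k matrix with entries +1/-1
# The k*k 2D Walsh-Hadamard kernels are outer products of the 1D rows:
kernels = [np.outer(H[i], H[j]) for i in range(k) for j in range(k)]
# Projecting a window onto a kernel needs only additions and subtractions:
# proj = np.sum(window * kernels[m])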

The Walsh-Hadamard Kernels (cont.) [Figure: lower bound on the distance between pattern and window vs. the number of projection kernels used, for the Walsh-Hadamard basis and the standard basis.] Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002)

Efficient Transformation Computation The fast transformation is due to: –Computations for one window are reused for its neighboring windows. –The WH kernels have a recursive structure: the projection onto one kernel is reused for the next kernel. –Projection onto the first few WH kernels is usually sufficient. Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002)

The Walsh-Hadamard Tree (1D) Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002) A tree for computing projections onto the first 8 WH basis vectors: –The root contains the original signal. –A node contains intermediate computations. –The i-th leaf contains a vector of the projection values of all signal windows onto the i-th WH kernel. Tree height = log₂(8) = 3.
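
A 1D sketch of such a tree (own implementation, not the authors' code): each level combines a node with a shifted copy of itself, so work is shared across neighboring windows and across kernels; the leaf order here follows the +/- branching rather than sequency order:

import numpy as np

def wh_tree_projections(signal, k):
    """Projections of every length-k window of `signal` onto all k
    Walsh-Hadamard kernels (k must be a power of two)."""
    nodes = [np.asarray(signal, dtype=float)]   # root: the original signal
    shift = 1
    while shift < k:                            # log2(k) levels
        nxt = []
        for s in nodes:
            nxt.append(s[:-shift] + s[shift:])  # '+' child
            nxt.append(s[:-shift] - s[shift:])  # '-' child
        nodes, shift = nxt, shift * 2
    # nodes[i][x] = projection of the window starting at x onto the i-th kernel
    return nodes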

The Walsh-Hadamard Tree (2D) Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002) In 2D, the projections are performed in a similar manner: –Height of the tree: log(k) → 2·log(k). –Number of leaves: k → k².

The Pattern Matching Algorithm
1) Project all image windows onto the first WH kernel. –Find a lower bound on the distance between the pattern and each window.
2) Reject all windows whose lower bound exceeds a given threshold.
3) Project only the remaining windows onto the next kernel. –Tighten the lower bounds of the remaining windows.
4) Return to step (2), unless either: –All kernels have been processed. –The number of remaining windows has reached a predefined value.
Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002)
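
A simplified sketch of this rejection loop; projections are computed by brute force here rather than with the WH tree, the kernels are assumed to come from a generator like the hadamard sketch above, and the threshold and names are illustrative:

import numpy as np

def match_with_rejection(image, pattern, kernels, thresh, min_candidates=1):
    """Accumulate each window's lower bound one kernel at a time and drop
    windows whose bound already exceeds the threshold."""
    k = pattern.shape[0]
    n = image.shape[0]
    lower = {(y, x): 0.0
             for y in range(n - k + 1) for x in range(n - k + 1)}
    for kern in kernels:
        u = kern / np.linalg.norm(kern)          # orthonormalize the +/-1 kernel
        p_proj = np.sum(pattern * u)
        survivors = {}
        for (y, x), lb in lower.items():
            w_proj = np.sum(image[y:y + k, x:x + k] * u)
            lb += (w_proj - p_proj) ** 2         # tighten the lower bound
            if lb <= thresh:                     # keep only plausible windows
                survivors[(y, x)] = lb
        lower = survivors
        if len(lower) <= min_candidates:
            break
    return list(lower.keys())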

Performance
Performance table for a 1024×1024 image and a 32×32 pattern, performed on a 1.8 GHz PC:

                      Average ops/pixel   Space      Integer Arithmetic   Run Time
  Naïve               O(k²)               n²         Yes                  4.86 Sec
  FFT                 O(log n)            n²         No                   3.5 Sec
  Projection Kernels  O(log k)            n² log k   Yes                  78 MSec

Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002)

Example Image (256×256) Pattern (16×16) Initially: all windows are candidates. Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002)

Example (cont.) Image Pattern After the 1st projection: 563 candidates. Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002)

Example (cont.) Image Pattern After the 2nd projection: 16 candidates. Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002)

Example (cont.) Image Pattern After the 3rd projection: 1 candidate. Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002)

Tolerance to Noise Results of the algorithm when applied to a noisy image (figure panels: original; noise level = 40; detected patterns): requires a larger rejection threshold → more candidates → longer running time. Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002)

Illumination Invariance The DC of all windows is given by projection on the first WH kernel. Y. Hel-Or, H. Hel-Or, "Real Time Pattern Matching Using Projection Kernels“ (2002) We can disregard the DC by skipping the first kernel. The rest of the process continues as before.

Illumination Invariance (cont.) Results of the algorithm (without DC); only 5 projections were performed (figure panels: original image; illumination gradient added; detected patterns). Y. Hel-Or, H. Hel-Or, “Real Time Pattern Matching Using Projection Kernels” (2002)

P. Matching using Projection Kernels: Summary Fast projection computations. Fast rejection of non-pattern windows. Invariant to: –Noise –Illumination variations Y. Hel-Or, H. Hel-Or, "Real Time Pattern Matching Using Projection Kernels“ (2002)

Fast Pattern Matching
–Pattern Matching: Definition
–Classical Pattern Matching Algorithms
–Fast pattern matching algorithms:
  –Search in Spatial Domain
  –Search in Transformation Domain

Search in Transformation Domain Represent a k×k pattern P as a point in R^(k·k). Let T(α)P be the transformation T(α) applied to P. The set { T(α)P : for all α } forms an orbit in R^(k·k).

Transformation Domain: Example A pattern under 2D rotations in Euclidean space. Sampled at equal rotation angles. Projected on its three most significant components. Similarly perceived patterns may be distant in Pattern Space!

Search in Transformation Domain: Problem Formulation Given: –A distance metric d(Q,P). –A k×k image window W. We want to find the orbit distance: δ(W,P) = min_α d(W, T(α)P) If δ(W,P) < threshold, W matches P.
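
A brute-force sketch of the orbit distance for the rotation group, with the orbit sampled at equal angles; scipy.ndimage.rotate and the sampling density are assumptions:

import numpy as np
from scipy.ndimage import rotate

def orbit_distance(window, pattern, n_angles=36):
    """delta(W, P) = min over sampled alpha of ||W - T(alpha)P||."""
    best = np.inf
    for a in np.linspace(0.0, 360.0, n_angles, endpoint=False):
        tp = rotate(pattern, angle=a, reshape=False, order=1, mode='nearest')
        best = min(best, np.sqrt(np.sum((window - tp) ** 2)))
    return best

# W matches P when orbit_distance(W, P) < threshold.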

Search in Transformation Domain: Approaches –Orbit Simplification –Dimensionality Reduction –Fast Search –Orbit Decomposition

Orbit Simplification: Transformation Invariant Functions First suggested by Hu (1961). Find a function over the pattern space that remains constant under some group of transformations. M. Hu, “Pattern recognition by moment invariants” (1961)

Transformation Invariant Functions (cont.) Create a blob image (by thresholding). A blob enhances the low-frequency components: –Size –Orientation –Low-level shape Hu derived a set of functions based on the blob's central moments. The functions' outputs are independent of: –Translation –Rotation –Reflection M. Hu, “Pattern recognition by moment invariants” (1961)
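
For concreteness, the first two of Hu's invariants computed from a binary blob (own NumPy sketch; Hu's full set has seven such functions):

import numpy as np

def first_hu_invariants(blob):
    """phi1 and phi2 from the blob's normalized central moments; both are
    unchanged by translation, rotation and reflection of the blob."""
    ys, xs = np.nonzero(blob)
    m00 = float(len(xs))
    xbar, ybar = xs.mean(), ys.mean()            # centroid: removes translation
    def mu(p, q):                                # central moments
        return np.sum((xs - xbar) ** p * (ys - ybar) ** q)
    def eta(p, q):                               # normalized central moments
        return mu(p, q) / m00 ** ((p + q) / 2 + 1)
    phi1 = eta(2, 0) + eta(0, 2)
    phi2 = (eta(2, 0) - eta(0, 2)) ** 2 + 4 * eta(1, 1) ** 2
    return phi1, phi2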

Dimensionality Reduction Reduce the dimensionality of the pattern space: –Create sparse representations using Wavelet and DCT transforms: the energy of natural patterns is concentrated in a few coefficients. –Principal Component Analysis (PCA): find a reduced linear basis for the pattern space. Search the reduced space.
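
A minimal PCA sketch via SVD; the number of retained components d is an illustrative parameter, and rows of `patterns` are assumed to be vectorized image patterns:

import numpy as np

def pca_reduce(patterns, d):
    """Project vectorized patterns onto the d most significant principal
    components; Euclidean distances are approximately preserved."""
    mean = patterns.mean(axis=0)
    X = patterns - mean
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    basis = Vt[:d]                    # top-d principal directions
    return X @ basis.T, basis, mean   # reduced coordinates + the model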

Fast Exhaustive Search in the Transformation Domain Apply an exhaustive search in some of the transformation domains. Requires a fast search technique. Example: –Fast search for different translations and scales using: Pyramidal representation. Fast implementation of convolution.

Pattern Matching using Orbit Decomposition A new technique for fast search in a pattern orbit Suggested by: Y. Hel-Or and H. Hel-Or (2002) Y. Hel-Or, H. Hel-Or, “Generalized Pattern Matching using Orbit Decomposition”

Orbit Distance Recall: –d(Q,P) is a distance metric, where Q,P ∈ R^n. –We want to find the orbit distance: δ(Q,P) = min_α d(Q, T(α)P) But: –In the general case, δ(Q,P) is not a metric: it does not necessarily satisfy the triangle inequality δ(Q,R) ≤ δ(Q,P) + δ(P,R). Y. Hel-Or, H. Hel-Or, “Generalized Pattern Matching using Orbit Decomposition”

Orbit Distance (cont.) Observe: –If the distance d(Q,P) is transformation invariant, i.e. d(Q,P) = d(T(α)Q, T(α)P), then: 1) δ(Q,P) is a metric. 2) The point-to-orbit distance is equal to the orbit-to-orbit distance. Y. Hel-Or, H. Hel-Or, “Generalized Pattern Matching using Orbit Decomposition”

Orbit Decomposition In practice T(α) is sampled into T(αᵢ) = T_Δα(i), i = 1,2,… The sampled orbit T_Δα(i)P can be divided into two sub-orbits, T_2Δα(i)P and T_2Δα(i)P′, where P′ = T_Δα(1)P. Y. Hel-Or, H. Hel-Or, “Generalized Pattern Matching using Orbit Decomposition”

Orbit Decomposition (cont.) T2(i)PT2(i)PT2(i)P’T2(i)P’ Q T(i)PT(i)P   (Q,P) P P’  2  (Q,P)  2  (Q,P’) P P’ Y. Hel-Or, H. Hel-Or, “Generalized Pattern Matching using Orbit Decomposition”

Orbit Decomposition (cont.) T2(i)PT2(i)PT2(i)P’T2(i)P’ Q  2  (Q,P)  2  (Q,P’) P P’ Y. Hel-Or, H. Hel-Or, “Generalized Pattern Matching using Orbit Decomposition”  2  (P,P’)  2  is a metric  2  (P,P’) can be calculated in advance We can save calculations using the triangle inequality constraint

Orbit Decomposition (cont.) The decomposition can be applied recursively: Y. Hel-Or, H. Hel-Or, “Generalized Pattern Matching using Orbit Decomposition”
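
A loose sketch of how such a recursive decomposition can prune the search: each node stores a representative transformed pattern and the largest distance ("radius") to the patterns below it, so the triangle inequality can reject whole sub-orbits. This is an illustrative metric-tree-style variant, not the authors' exact scheme:

import numpy as np

class OrbitNode:
    """One node of an orbit tree: a representative pattern plus the largest
    distance from it to any pattern in its subtree (precomputed off-line)."""
    def __init__(self, rep, radius, children=()):
        self.rep, self.radius, self.children = rep, radius, list(children)

def orbit_search(query, node, threshold,
                 dist=lambda a, b: np.linalg.norm(a - b)):
    """Smallest distance from `query` to any pattern in the subtree, or None.
    By the triangle inequality, every pattern X below this node satisfies
    dist(query, X) >= dist(query, node.rep) - node.radius."""
    d = dist(query, node.rep)
    if d - node.radius > threshold:          # prune: nothing below can match
        return None
    best = d if d < threshold else None
    for child in node.children:
        r = orbit_search(query, child, threshold, dist)
        if r is not None and (best is None or r < best):
            best = r
    return best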

Orbit Tree Y. Hel-Or, H. Hel-Or, “Generalized Pattern Matching using Orbit Decomposition” Threshold = 100

Example Y. Hel-Or, H. Hel-Or, “Generalized Pattern Matching using Orbit Decomposition”

Example (cont.) Y. Hel-Or, H. Hel-Or, “Generalized Pattern Matching using Orbit Decomposition”

Example (cont.) Y. Hel-Or, H. Hel-Or, “Generalized Pattern Matching using Orbit Decomposition”

Example (cont.) Y. Hel-Or, H. Hel-Or, “Generalized Pattern Matching using Orbit Decomposition”

Example (cont.) Y. Hel-Or, H. Hel-Or, “Generalized Pattern Matching using Orbit Decomposition”

Example (cont.) Y. Hel-Or, H. Hel-Or, “Generalized Pattern Matching using Orbit Decomposition”

Example (cont.) Y. Hel-Or, H. Hel-Or, “Generalized Pattern Matching using Orbit Decomposition”

Rejection Rate [Figure: % of image pixels remaining vs. number of distance calculations.] The average number of distance computations per pixel is 2.868. Y. Hel-Or, H. Hel-Or, “Generalized Pattern Matching using Orbit Decomposition”

P. Matching using Orbit Decomposition: Summary Orbit distance is a metric –When point distance is transformation invariant. Fast search in orbit distance space –Using recursive orbit decomposition. Fast rejection of distant patterns.