Principle of Locality for Statistical Shape Analysis
Paul Yushkevich

Outline
– Classification
– Shape variability with components
– Principle of Locality
– Feature selection for classification
– Inter-scale residuals

Classification
Given training data belonging to two or more classes, construct a classifier. A classifier assigns a class label to each point in feature space. Classifiers can be:
– Parametric vs. non-parametric
– ML vs. MAP
– Linear vs. non-linear

Common Classifiers
– Fisher Linear Discriminant: linear, parametric (Gaussian model with pooled covariance)
– K-Nearest Neighbor, Parzen Windows: non-linear, non-parametric
– Support Vector Machines: non-parametric, linear
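The k-nearest-neighbor family listed above can be sketched in a few lines. This is a minimal, self-contained illustration (the data and function name are invented for this sketch, not taken from the talk): classify a query point by majority vote among its k closest training points.

```python
# Minimal k-nearest-neighbor classifier sketch (non-linear, non-parametric).
# The toy data and the function name are invented for illustration.
from collections import Counter
import math

def knn_classify(train, k, x):
    """train: list of (point, label) pairs; x: query point."""
    # Sort training pairs by Euclidean distance from the query point.
    by_dist = sorted(train, key=lambda pl: math.dist(pl[0], x))
    # Majority vote among the k nearest neighbors.
    votes = Counter(label for _, label in by_dist[:k])
    return votes.most_common(1)[0][0]

# Two toy classes in the plane.
train = [((0.0, 0.0), "A"), ((0.1, 0.2), "A"), ((0.2, 0.1), "A"),
         ((1.0, 1.0), "B"), ((0.9, 1.1), "B"), ((1.1, 0.9), "B")]
print(knn_classify(train, 3, (0.05, 0.1)))  # "A"
print(knn_classify(train, 3, (1.0, 0.95)))  # "B"
```

Unlike the Fisher discriminant, no distributional model is fit; the training set itself is the model, which is what "non-parametric" means on this slide.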

Examples
[Figure: decision regions for a Fisher classifier, a non-linear MLE classifier, and a 3-nearest-neighbor classifier.]

Shape Variability
Multiple (independent) components:
– Local and global
– Coarse and fine

Principle of Locality
A heuristic about biological shape:
– Each component affects a region of the object
– Each component affects a range of scales

Feature Selection
If we limit classification to just the right subset of features, we can drastically improve the results.

Feature Selection Example
[Figure panels:]
– Underlying distributions
– Training set sampled from the distributions
– Classifier constructed in 2 dimensions
– Classifier constructed in the relevant y-dimension
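The effect shown in this example can be reproduced with a deterministic toy case (all coordinates below are invented for illustration, not taken from the talk): a query point whose relevant coordinate clearly matches class A is misclassified by 1-NN when an irrelevant noise feature is included, and classified correctly when restricted to the relevant dimension.

```python
# Toy illustration: an irrelevant feature can flip the nearest neighbor.
# Feature 0 is relevant (class A near 0, class B near 2);
# feature 1 is pure noise. All values are invented for this sketch.
import math

def nn_classify(train, x, dims):
    """1-nearest-neighbor using only the feature indices in `dims`."""
    nearest = min(train, key=lambda pl: math.dist(
        [pl[0][d] for d in dims], [x[d] for d in dims]))
    return nearest[1]

train = [((0.0, 3.0), "A"), ((2.0, 0.0), "B")]
x = (0.5, 0.0)  # relevant coordinate is near class A

print(nn_classify(train, x, [0, 1]))  # noise dimension misleads: "B"
print(nn_classify(train, x, [0]))     # relevant dimension only: "A"
```

This is the same phenomenon the slide plots at larger scale: irrelevant dimensions dilute the distance metric, so discarding them improves the classifier.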

Feature Selection Example
[Figure: expected error rate of classification in p dimensions vs. classification in the relevant dimensions, plotted against training sample size; panels for p = 2 and p = 4 with one relevant dimension, and p = 4 and p = 8 with two relevant dimensions.]

Synthetic Shapes

Feature Selection in Shape
– Relevant features should form a small set of 'windows'
– The order of features is important
– Example: Fisher classification rate over all possible windows
[Figure axes: window position vs. window size]

Feature Search Algorithm
– Look for the best set of feature windows and the best classifier simultaneously
– Similar to SVMs
– Minimize: E_separation + E_Nfeature + E_Nwindows
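The composite objective above can be sketched as code. The concrete penalty terms and weights below are assumptions for illustration (the slide names the three terms but does not specify them); a "window" is taken to mean a maximal run of consecutive selected features, following the previous slide.

```python
# Hypothetical sketch of E = E_separation + E_Nfeature + E_Nwindows.
# The weights alpha/beta and the exact form of each term are assumptions;
# the slide only names the three energy terms.

def count_windows(mask):
    """Number of contiguous runs of 1s in a binary feature-selection mask."""
    runs, prev = 0, 0
    for m in mask:
        if m and not prev:
            runs += 1
        prev = m
    return runs

def objective(e_separation, mask, alpha=0.1, beta=0.5):
    """Penalize poor class separation, many features, and many windows."""
    return e_separation + alpha * sum(mask) + beta * count_windows(mask)

# Two candidate subsets with equal separation error and feature count:
# the contiguous one uses fewer windows, so it scores lower (better).
scattered = [1, 0, 1, 0, 1, 0]   # 3 features in 3 windows
contiguous = [1, 1, 1, 0, 0, 0]  # 3 features in 1 window
print(objective(0.2, scattered))   # 0.2 + 0.3 + 1.5 = 2.0
print(objective(0.2, contiguous))  # 0.2 + 0.3 + 0.5 = 1.0
```

The window penalty is what encodes the Principle of Locality: among feature subsets that separate the classes equally well, the search prefers few, contiguous windows.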