Optimized Nearest Neighbor Methods: Cam-Weighted Distance vs. Statistical Confidence
Robert R. Puckett

Cam-Weighted Distance
- Deforms the underlying distribution by a transformation.
- Simulates strengthening and weakening effects between prototypes.
- The k nearest neighbors are used to estimate the parameters of the distribution.
- The inverse transform then provides a "cam-weighted distance".
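The steps above can be sketched in code. This is an illustrative simplification, not the estimator from Zhou and Chen's paper: here `a` (neighborhood scale), `b` (asymmetry), and the orientation vector are hypothetical stand-ins estimated from simple neighbor statistics, and the distance is scaled by an inverse cam factor a + b*cos(theta).

```python
import numpy as np

def estimate_cam_params(prototype, neighbors):
    """Illustrative parameter estimation from the k nearest neighbors of a
    prototype (not the paper's exact procedure): a sets the overall scale,
    b the directional asymmetry, and orientation the deformation direction."""
    diffs = np.asarray(neighbors, dtype=float) - np.asarray(prototype, dtype=float)
    dists = np.linalg.norm(diffs, axis=1)
    mean_vec = diffs.mean(axis=0)
    a = dists.mean()                       # overall neighborhood scale
    norm = np.linalg.norm(mean_vec)
    b = min(norm, 0.9 * a)                 # cap b so a + b*cos(theta) stays positive
    orientation = mean_vec / norm if norm > 0 else np.zeros_like(mean_vec)
    return a, b, orientation

def cam_weighted_distance(x, prototype, a, b, orientation):
    """Scale the Euclidean distance by the cam factor: directions aligned with
    the orientation (cos(theta) near 1) shrink the effective distance."""
    diff = np.asarray(x, dtype=float) - np.asarray(prototype, dtype=float)
    d = np.linalg.norm(diff)
    if d == 0:
        return 0.0
    cos_theta = float(diff @ orientation) / d   # orientation is a unit vector
    return d / (a + b * cos_theta)
```

With neighbors all lying on the positive x-axis, the orientation points along x, and a query in that direction receives a smaller (strengthened) distance than one on the opposite side.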

Statistical Confidence
- Confidence is proportional to the majority vote among the neighbors.
- When confidence is low, a larger k is chosen.
- This is an alternative to globally increasing the k value: the algorithm selectively increases k only when the confidence falls below some threshold.
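A minimal sketch of the selective-k idea, assuming a plain majority fraction as the confidence proxy (Wang et al. derive a proper statistical confidence; the threshold and step size below are hypothetical):

```python
import numpy as np
from collections import Counter

def adaptive_knn_predict(X_train, y_train, x, k_start=3, k_max=15, conf_threshold=0.7):
    """Start with a small k; if the majority fraction among the k nearest
    neighbors falls below conf_threshold, grow k and re-vote, rather than
    increasing k globally for every query."""
    X = np.asarray(X_train, dtype=float)
    y = np.asarray(y_train)
    order = np.argsort(np.linalg.norm(X - np.asarray(x, dtype=float), axis=1))
    k = k_start
    while True:
        label, count = Counter(y[order[:k]]).most_common(1)[0]
        confidence = count / k           # simple proxy for statistical confidence
        if confidence >= conf_threshold or k >= k_max:
            return label, confidence
        k += 2                           # keep k odd for binary problems
```

Queries deep inside a class region stop at the initial k; only ambiguous queries near the decision boundary pay for a larger neighborhood.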

Goals
- Implement the NN base system
- Cam-NN add-on
- Statistical-confidence add-on
- Create a hybrid method
- Test against a dataset
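One way to structure the base system so both add-ons can be layered on is a classifier with a pluggable distance function; this scaffold (class and parameter names are hypothetical, not from the project) would accept a cam-weighted distance or be wrapped by a confidence-driven loop:

```python
import numpy as np
from collections import Counter

class KNNBase:
    """Minimal k-NN base system with a pluggable distance function, so a
    cam-weighted metric or a confidence-based add-on can replace the default."""

    def __init__(self, k=3, distance=None):
        self.k = k
        # default to Euclidean distance; add-ons may substitute their own
        self.distance = distance or (lambda a, b: np.linalg.norm(a - b))

    def fit(self, X, y):
        self.X = np.asarray(X, dtype=float)
        self.y = np.asarray(y)
        return self

    def predict(self, x):
        x = np.asarray(x, dtype=float)
        dists = [self.distance(x, p) for p in self.X]
        nearest = np.argsort(dists)[:self.k]       # indices of the k closest prototypes
        return Counter(self.y[nearest]).most_common(1)[0][0]
```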

Schedule
Main milestones:
- Software development
- Dataset generation
- Analysis
- Report writing

References
- Duda, R. O., Hart, P. E., and Stork, D. G. (2001). Pattern Classification, 2nd ed. New York: Wiley.
- Wang, J., Neskovic, P., et al. (2006). "Neighborhood size selection in the k-nearest-neighbor rule using statistical confidence." Pattern Recognition 39(3).
- Zhou, C. Y. and Chen, Y. Q. (2006). "Improving nearest neighbor classification with cam weighted distance." Pattern Recognition 39(4).