DATA MINING Ronald Westra Dep. Mathematics

Presentation transcript:

DATA MINING: from data to information. Ronald Westra, Dep. Mathematics, Maastricht University. 7 December 2006.

SUPPORT VECTOR MACHINES: Theory

Support Vector Machines

[A series of slides titled "Support Vector Machines"; the slide images are not included in this transcript.]

The VC dimension (Vapnik-Chervonenkis dimension) is a measure of the capacity of a statistical classification algorithm. Consider a classification model f with parameter vector θ. The model f shatters a set of data points if, for every possible assignment of labels to those points, there exists a θ such that f classifies all of them correctly. The VC dimension of f is the largest h such that some set of h data points can be shattered by f.
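The definition above can be checked by brute force for small point sets. The sketch below (not from the slides; it uses a perceptron as the linear-separability test, which converges whenever a separating hyperplane exists) verifies the classic fact that linear classifiers in the plane shatter 3 points in general position but cannot shatter 4, so their VC dimension is 3:

```python
import itertools
import numpy as np

def linearly_separable(points, labels, max_epochs=10000):
    """Test whether some linear classifier (with bias) realizes this
    labeling, using the perceptron rule; the epoch cap handles the
    non-separable case, where the perceptron never converges."""
    X = np.hstack([np.asarray(points, float), np.ones((len(points), 1))])
    y = np.where(np.asarray(labels), 1.0, -1.0)
    w = np.zeros(X.shape[1])
    for _ in range(max_epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (w @ xi) <= 0:   # misclassified (or on the boundary)
                w += yi * xi
                errors += 1
        if errors == 0:              # all points correct: separable
            return True
    return False

def shattered(points):
    """A set is shattered iff every one of the 2^n labelings is realizable."""
    return all(
        linearly_separable(points, labels)
        for labels in itertools.product([False, True], repeat=len(points))
    )

three = [(0, 0), (1, 0), (0, 1)]          # three points in general position
four  = [(0, 0), (1, 0), (0, 1), (1, 1)]  # the XOR labeling defeats this set

print(shattered(three))  # True
print(shattered(four))   # False
```

Since some 3-point set is shattered but no labeling test succeeds for every 4-point set (the XOR labeling already fails here), the maximum h in the definition is 3 for linear classifiers in two dimensions.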


The End