Sparsity-Eager SVM for Audio Classification. Kamelia Aryafar & Ali Shokoufandeh, Department of Computer Science, Drexel University, Philadelphia, PA. Presented by Ali Shokoufandeh, 11 October 2014.

Connection to Paul Kantor. I have known Paul since my graduate-school days at Rutgers. As a junior faculty member, I was part of an NSF project managed by Paul, and he was an unofficial mentor during my early career as an assistant professor. I also had the honor of collaborating with Paul on numerous research problems.

Goal. Build a robust supervised audio classifier by fusing SVM and sparse-regression classifiers: – Avoid over-fitting. – Obtain better generalization on new test examples. – Provide scalability in terms of classification complexity. The subspace model assumption [Donoho 06]: for a sufficiently large number of training instances per class, an unseen instance lies in the subspace spanned by the training examples of its class. D. L. Donoho. "Compressed sensing". IEEE Transactions on Information Theory, 52:1289–1306, 2006.

Support Vector Machine (SVM). SVM is a maximum-margin threshold classifier, h_w(x) = sign(wᵀx), consistent with the training examples. If the training data is not linearly separable, a soft margin can be used: – Define a hinge function: ℓ(w; (x, y)) = max(0, 1 − y·wᵀx). – Find the w that minimizes the regularized empirical loss: min_w (1/n) Σᵢ max(0, 1 − yᵢ·wᵀxᵢ) + λ‖w‖₂².  (1)

What is known: SVM provides an efficient maximum-margin classifier, but it can suffer from over-fitting.
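As a minimal sketch of the soft-margin objective above, the following trains a linear SVM by batch subgradient descent on the regularized hinge loss. The toy data, learning rate, and variable names are illustrative assumptions, not part of the original slides.

```python
import numpy as np

def train_svm(X, y, lam=0.01, lr=0.1, epochs=200):
    """Minimize (1/n) * sum_i max(0, 1 - y_i * w.x_i) + lam * ||w||^2
    by batch subgradient descent."""
    n, d = X.shape
    w = np.zeros(d)
    for _ in range(epochs):
        margins = y * (X @ w)
        active = margins < 1              # points inside the margin
        # subgradient of the hinge term is -y_i * x_i on active points
        grad = -(y[active, None] * X[active]).sum(axis=0) / n + 2 * lam * w
        w -= lr * grad
    return w

def predict(w, X):
    return np.sign(X @ w)

# toy data: two well-separated Gaussian blobs, labels in {-1, +1}
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(+2, 1, (50, 2)), rng.normal(-2, 1, (50, 2))])
y = np.hstack([np.ones(50), -np.ones(50)])
w = train_svm(X, y)
acc = (predict(w, X) == y).mean()
```

The ℓ₂ penalty here is the standard soft-margin regularizer from equation (1); the fusion slides below change exactly this term.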

Sparse Regression. We are given a matrix X whose columns are training samples from k classes (r samples per class). We would like to classify an unknown sample f by finding its sparsest representation over the training samples: min_α ‖α‖₁ subject to Xα = f. The sparse-approximation classifier outputs its prediction as the class whose training samples best reconstruct f: ĉ = argmin_c ‖f − Xα_c‖₂, where α_c retains only the coefficients belonging to class c.
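The sparse-approximation classifier can be sketched as follows: an ℓ₁-regularized (lasso) relaxation of the program above, solved by iterative soft-thresholding, followed by the per-class residual rule. The solver choice, toy dictionary, and parameter values are assumptions for illustration.

```python
import numpy as np

def ista(X, f, mu=0.05, steps=500):
    """Lasso relaxation min 0.5*||X a - f||^2 + mu*||a||_1,
    solved by iterative soft-thresholding (ISTA)."""
    L = np.linalg.norm(X, 2) ** 2          # Lipschitz constant of the gradient
    a = np.zeros(X.shape[1])
    for _ in range(steps):
        g = X.T @ (X @ a - f)              # gradient of the quadratic term
        z = a - g / L
        a = np.sign(z) * np.maximum(np.abs(z) - mu / L, 0.0)  # soft threshold
    return a

def src_predict(X, labels, f):
    """Keep only each class's coefficients and predict the class
    with the smallest reconstruction residual."""
    a = ista(X, f)
    classes = np.unique(labels)
    residuals = [np.linalg.norm(f - X @ np.where(labels == c, a, 0.0))
                 for c in classes]
    return classes[int(np.argmin(residuals))]

# toy dictionary: r = 5 samples per class for k = 2 classes, columns are samples
rng = np.random.default_rng(1)
base = {0: rng.normal(0, 1, 8), 1: rng.normal(0, 1, 8)}
cols, labels = [], []
for c in (0, 1):
    for _ in range(5):
        v = base[c] + 0.05 * rng.normal(0, 1, 8)   # perturbed class prototype
        cols.append(v / np.linalg.norm(v))
        labels.append(c)
X = np.column_stack(cols)
labels = np.array(labels)
probe = base[1] / np.linalg.norm(base[1])          # unseen sample from class 1
pred = src_predict(X, labels, probe)
```

The per-class residual rule is what makes this a classifier rather than a plain regression: the class whose span contains the probe wins.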

Fusion: ℓ₁-SVM. Change the objective function of the ℓ₂-SVM while avoiding the curse of dimensionality and over-fitting: replace the ‖w‖₂² regularizer with a sparsity-promoting ℓ₁ penalty, min_w (1/n) Σᵢ max(0, 1 − yᵢ·wᵀxᵢ) + λ‖w‖₁. Classifier: h_w(x) = sign(wᵀx) with the resulting sparse weight vector w. K. Aryafar, S. Jafarpour, and A. Shokoufandeh. "Automatic musical genre classification using sparsity-eager support vector machines", ICPR 2012.

Experiments. The dataset contains 1886 songs and comprises nine music genres: pop, rock, folk-country, alternative, jazz, electronic, blues, rap/hip-hop, and funk soul/R&B.

Generalization. The model can be generalized to multi-modal classification: K. Aryafar and A. Shokoufandeh. "Multimodal sparsity-eager support vector machines for music classification", ICMLA, 2014.
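One simple way to realize a multi-modal extension is early fusion: normalize each modality's feature block and concatenate them with a mixing weight before training the ℓ₁-SVM. This sketch is an assumption for illustration only; the exact multimodal formulation appears in the cited ICMLA paper, and the function name, modality names, and weight `beta` here are hypothetical.

```python
import numpy as np

def fuse(audio_feats, lyric_feats, beta=0.5):
    """Illustrative early fusion (an assumption, not the paper's exact
    formulation): l2-normalize each modality block row-wise, then
    concatenate with mixing weight beta."""
    a = audio_feats / (np.linalg.norm(audio_feats, axis=1, keepdims=True) + 1e-12)
    t = lyric_feats / (np.linalg.norm(lyric_feats, axis=1, keepdims=True) + 1e-12)
    return np.hstack([beta * a, (1.0 - beta) * t])

# hypothetical feature matrices: 4 songs, 6 audio dims, 3 lyric dims
audio = np.random.default_rng(3).normal(0, 1, (4, 6))
lyrics = np.random.default_rng(4).normal(0, 1, (4, 3))
Z = fuse(audio, lyrics)   # fused features, one row per song
```

The fused matrix can then be fed to the same sparsity-eager trainer, which is what makes this extension cheap: only the feature construction changes, not the classifier.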

Final Thoughts. Fusing ideas from two distinct classes of methods (in this case, SVMs and sparse regression) made multi-class classification more accurate and added scalability in terms of classification complexity.