Oral Presentation Applied Machine Learning Course YOUR NAME

DATASET Describe your dataset: number of datapoints, dimension of the data, classes. Show example images from each class.
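The dataset summary asked for above can be computed in a few lines. This is a minimal sketch using scikit-learn's built-in digits dataset as a stand-in for your own data; swap in your own `X` and `y` arrays.

```python
import numpy as np
from sklearn.datasets import load_digits

# Stand-in data: replace with your own feature matrix X and labels y.
digits = load_digits()
X, y = digits.data, digits.target

n_points, n_dims = X.shape                       # number of datapoints, dimension
classes, counts = np.unique(y, return_counts=True)  # classes and per-class counts

print(f"datapoints: {n_points}, dimension: {n_dims}")
print(f"classes: {len(classes)}, per-class counts: {dict(zip(classes, counts))}")
```

Each image in `X` is a flattened pixel vector, so `n_dims` is the number of pixels per image.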

PCA Results (2 slides) Describe the results of applying PCA to your dataset. Show the best projections. Show/discuss the eigenvalues. Show/discuss the eigenvectors.
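The quantities these two slides call for (projections, eigenvalues, eigenvectors) all come out of one PCA fit. A minimal sketch with scikit-learn, again using the digits data as a placeholder:

```python
from sklearn.datasets import load_digits
from sklearn.decomposition import PCA

X = load_digits().data

pca = PCA(n_components=10)
Z = pca.fit_transform(X)           # the first 2 columns of Z give the best 2D projection

eigenvalues = pca.explained_variance_   # one value per component, for the eigenvalue slide
eigenvectors = pca.components_          # one row per principal direction (same dim as an image)

print("variance explained by first 2 PCs:",
      pca.explained_variance_ratio_[:2].sum())
```

Since each eigenvector has the same dimension as an input image, it can be reshaped and displayed as an image ("eigenimage") for the eigenvector slide.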

Result of Clustering (2 slides) Show the best and worst clustering results. Show results with semi-supervised clustering. Discuss the effect of hyperparameters.
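One way to produce the best/worst comparison is to sweep a hyperparameter (here the number of clusters k) and rank the runs with an internal score such as the silhouette. For the semi-supervised variant, one assumed approach is shown below: seeding K-Means with class means computed from a small labelled subset instead of random centroids. This is a sketch, not the only way to do semi-supervised clustering.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.datasets import load_digits
from sklearn.metrics import silhouette_score

X, y = load_digits(return_X_y=True)

# Sweep a hyperparameter (k) and keep the best / worst run by silhouette score.
scores = {}
for k in (2, 10, 30):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    scores[k] = silhouette_score(X, labels)

best_k = max(scores, key=scores.get)
worst_k = min(scores, key=scores.get)

# Semi-supervised sketch (assumed approach): initialise the centroids with the
# class means of a few labelled examples per class, then run ordinary K-Means.
seed_means = np.stack([X[y == c][:10].mean(axis=0) for c in range(10)])
semi = KMeans(n_clusters=10, init=seed_means, n_init=1).fit(X)
```

Plotting the clusterings for `best_k` and `worst_k` in the PCA projection gives the two required figures.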

Result of Classification (2 slides) Show the results of classification. Compare the classifiers. Discuss the effect of hyperparameters. Add qualitative plots to highlight some results, such as cases of overfitting.
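The classifier comparison can be set up as below: hold out a test set, fit each classifier, and report train versus test accuracy. A large gap between the two is the overfitting signal the slide asks you to discuss. The choice of kNN and SVM here is illustrative; use whichever classifiers your project covers.

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

results = {}
for name, clf in [("kNN", KNeighborsClassifier(n_neighbors=3)),
                  ("SVM", SVC(C=1.0))]:
    clf.fit(X_tr, y_tr)
    results[name] = clf.score(X_te, y_te)
    # Train accuracy far above test accuracy indicates overfitting.
    print(name, "train:", clf.score(X_tr, y_tr), "test:", results[name])
```

Re-running the loop over a hyperparameter grid (e.g. `n_neighbors` or `C`) gives the hyperparameter-effect plot.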

Result of Regression (2-3 slides) Explain the metric chosen and show example metric values for your images. Show the results of regression. Compare the regression algorithms. Discuss the effect of hyperparameters. Add qualitative plots to highlight some results, such as cases of poor fit.
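The regression comparison follows the same pattern as classification: pick a metric (mean squared error below, but use whatever metric you chose for your images), fit each algorithm, and vary a hyperparameter such as the ridge penalty. The target here is a hypothetical noisy linear signal standing in for your per-image target values.

```python
import numpy as np
from sklearn.linear_model import LinearRegression, Ridge
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

# Hypothetical data: replace X and y with your image features and target values.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 64))
y = X @ rng.normal(size=64) + rng.normal(scale=0.5, size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

mses = {}
for name, reg in [("linear", LinearRegression()),
                  ("ridge a=1", Ridge(alpha=1.0)),
                  ("ridge a=100", Ridge(alpha=100.0))]:
    reg.fit(X_tr, y_tr)
    mses[name] = mean_squared_error(y_te, reg.predict(X_te))
    print(name, "test MSE:", mses[name])
```

Scatter plots of predicted versus true values make good qualitative slides: points far from the diagonal are the poor-fit cases the template asks you to highlight.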