ITWS-4600/ITWS-6600/MATP-4450/CSCI-4960


Labs: (SVM, Multi-Dimensional Scaling, Dimension Reduction), Factor Analysis, Random Forest
Peter Fox
Data Analytics – ITWS-4600/ITWS-6600/MATP-4450/CSCI-4960
Group 3 Lab 2, November 2, 2018

If you did not complete the SVM labs: group3/lab1_svm{1,11}.R, lab1_svm{12,13}.R, lab1_svm_rpart1.R
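For anyone catching up on the SVM labs, the core call is short; this is a minimal sketch on the built-in iris data (illustrative only, not one of the lab1_svm scripts):

```r
# Minimal e1071 SVM classifier, in the spirit of the lab1_svm scripts
library(e1071)
data(iris)
m <- svm(Species ~ ., data = iris, kernel = "radial")
# Confusion table on the training data (optimistic, but shows the workflow)
table(predicted = predict(m, iris), actual = iris$Species)
```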

And MDS, DR: lab1_mds{1,3}.R, lab1_dr{1,4}.R
http://www.statmethods.net/advstats/mds.html
http://gastonsanchez.com/blog/how-to/2013/01/23/MDS-in-R.html
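Classical (metric) MDS takes only a few lines in base R; a sketch on the built-in swiss data (an illustration, not one of the lab1_mds scripts):

```r
# Classical MDS: distances -> 2-D configuration
data(swiss)
d <- dist(scale(swiss))        # Euclidean distances on standardized variables
fit <- cmdscale(d, k = 2)      # project the 47 provinces into 2 dimensions
plot(fit[, 1], fit[, 2], type = "n", xlab = "Dim 1", ylab = "Dim 2")
text(fit[, 1], fit[, 2], labels = rownames(swiss), cex = 0.7)
```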

randomForest
> library(e1071)
> library(rpart)
> library(mlbench) # etc.
> data(kyphosis)
> require(randomForest) # or library(randomForest)
> fitKF <- randomForest(Kyphosis ~ Age + Number + Start, data=kyphosis)
> print(fitKF) # view results
> importance(fitKF) # importance of each predictor
# what else can you do?
data(swiss) # fertility? lab2_rf1.R
data(Glass, package="mlbench") # Type ~ <what>?
data(Titanic) # Survived ~ .
Find: Mileage ~ Price + Country + Reliability + Type
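One way to tackle that last prompt, assuming the formula refers to rpart's car.test.frame data (the one dataset with exactly these columns); Reliability has missing values, so na.action is needed:

```r
# Random forest regression of Mileage on the other car attributes
library(rpart)          # provides the car.test.frame data
library(randomForest)
data(car.test.frame)
fitCar <- randomForest(Mileage ~ Price + Country + Reliability + Type,
                       data = car.test.frame,
                       na.action = na.omit,   # Reliability contains NAs
                       importance = TRUE)
print(fitCar)
importance(fitCar)      # which predictors matter most for Mileage?
```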

Try example_exploratoryFactorAnalysis.R on dataset_exploratoryFactorAnalysis.csv (on website):
http://rtutorialseries.blogspot.com/2011/10/r-tutorial-series-exploratory-factor.html (this was the example on courses in the lecture)
http://www.statmethods.net/advstats/factor.html
http://stats.stackexchange.com/questions/1576/what-are-the-differences-between-factor-analysis-and-principal-component-analysi
Do these: lab2_fa{1,2,4,5}.R
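Base R's factanal() covers the core of these labs; a sketch on the built-in attitude data (illustrative, not one of the lab2_fa scripts — the choice of 2 factors and varimax rotation is an assumption):

```r
# Maximum-likelihood exploratory factor analysis
data(attitude)
fa <- factanal(attitude, factors = 2, rotation = "varimax",
               scores = "regression")
print(fa$loadings, cutoff = 0.4)   # which variables load on which factor
head(fa$scores)                    # factor scores for the first rows
```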

Factor Analysis
library(psych) # provides iqitems, ability, irt.fa(), score.irt()
data(iqitems)
# data(ability)
ability.irt <- irt.fa(ability)
ability.scores <- score.irt(ability.irt, ability)
data(attitude)
cor(attitude)
# Compute eigenvalues and eigenvectors of the correlation matrix
pfa.eigen <- eigen(cor(attitude))
pfa.eigen$values
# set a value for the number of factors (for clarity)
factors <- 2
# Extract and transform two components
pfa.eigen$vectors[, 1:factors] %*%
  diag(sqrt(pfa.eigen$values[1:factors]), factors, factors)
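The eigen-decomposition approach on the slide should agree, up to arbitrary column signs, with what prcomp() gives on the standardized data; a quick self-contained check:

```r
# Unrotated principal-component "loadings" two ways
data(attitude)
pfa.eigen <- eigen(cor(attitude))
factors <- 2
loadings.eigen <- pfa.eigen$vectors[, 1:factors] %*%
  diag(sqrt(pfa.eigen$values[1:factors]), factors, factors)
# prcomp on scaled data: rotation columns are the same eigenvectors,
# sdev are the square roots of the same eigenvalues
pc <- prcomp(attitude, scale. = TRUE)
loadings.pc <- pc$rotation[, 1:factors] %*% diag(pc$sdev[1:factors])
# should be TRUE (eigenvector signs are arbitrary, hence abs())
all.equal(abs(loadings.eigen), abs(unname(loadings.pc)))
```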

Glass
data(Glass, package="mlbench")
index <- 1:nrow(Glass)
testindex <- sample(index, trunc(length(index)/3))
testset <- Glass[testindex, ]
trainset <- Glass[-testindex, ]
cor(testset[, -10]) # cor() needs numeric columns; Type (column 10) is a factor
Factor Analysis?
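From that split, one route to the "Factor Analysis?" question is to first look at the components of the nine numeric predictors; a sketch (the seed is hypothetical, added only for reproducibility):

```r
# PCA on the numeric Glass predictors, after the train/test split
library(mlbench)
data(Glass)
set.seed(123)                       # hypothetical seed, for reproducibility
index <- 1:nrow(Glass)
testindex <- sample(index, trunc(length(index)/3))
trainset <- Glass[-testindex, ]
# Type (column 10) is the class factor, so drop it before PCA
pca <- prcomp(trainset[, -10], scale. = TRUE)
summary(pca)                        # variance explained per component
```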