Mehdi Ghayoumi Kent State University Computer Science Department Summer 2015 Exposition on Cyber Infrastructure and Big Data.

Machine Learning

Science is a systematic enterprise that builds and organizes knowledge in the form of testable explanations and predictions about nature and the universe.

Why “Learn”? Machine learning is programming computers to optimize a performance criterion using example data or past experience. There is no need to “learn” to calculate payroll. Learning is used when:
– Human expertise does not exist (navigating on Mars)
– Humans are unable to explain their expertise (speech recognition)
– The solution changes over time (routing on a computer network)
– The solution needs to be adapted to particular cases (user biometrics)

What is Machine Learning? Optimizing a performance criterion using example data or past experience.
Role of statistics: inference from a sample.
Role of computer science: efficient algorithms to
– Solve the optimization problem
– Represent and evaluate the model for inference

Apply a prediction function to a feature representation of the image to get the desired output, e.g. f(image of an apple) = “apple”, f(image of a tomato) = “tomato”, f(image of a cow) = “cow”.

y = f(x)
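A minimal sketch of what such a prediction function y = f(x) might look like: a linear scoring rule over a feature vector. The weights, bias, and class names below are invented for illustration, not anything from the slides.

```python
def f(x):
    # hypothetical "learned" weights and bias for a 2-d feature vector x
    w, b = (2.0, -1.0), 0.5
    score = sum(wi * xi for wi, xi in zip(w, x)) + b
    return "apple" if score > 0 else "not apple"

print(f((1.0, 0.0)))  # score 2.5  -> "apple"
print(f((0.0, 2.0)))  # score -1.5 -> "not apple"
```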

Training: training images → training image features → (together with the training labels) → learned model. Prediction: new image → image features → learned model → prediction.
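The training/prediction pipeline can be sketched as two functions. The feature extractor is a stand-in and the "model" is a trivial majority-class predictor; both are invented purely to show the train → model → predict structure.

```python
from collections import Counter

def extract_features(image):
    # stand-in: pretend each "image" is already a tuple of numbers
    return image

def train(training_images, training_labels):
    features = [extract_features(im) for im in training_images]  # unused by this toy model
    # trivial learned model: always predict the most common training label
    most_common = Counter(training_labels).most_common(1)[0][0]
    return lambda x: most_common

model = train([(0, 1), (1, 0), (1, 1)], ["cat", "dog", "dog"])
print(model(extract_features((0, 0))))  # -> "dog"
```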

Levels of supervision: unsupervised, “weakly” supervised, fully supervised.

SVM, neural networks, Naïve Bayes, logistic regression, decision trees, K-nearest neighbor, RBMs, etc.

Definition of Learning A computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E.

Design a Learning System. We shall use handwritten character recognition as an example to illustrate the design issues and approaches.

Step 0: Let's treat the learning system as a black box.

Step 1: Collect Training Examples (Experience).
– Without examples, our system will not learn.

Step 2: Representing Experience
– Choose a representation scheme for the experience/examples. The sensor input is represented by an n-d vector, called the feature vector, X = (x1, x2, x3, …, xn), e.g.
(1,1,0,1,1,1,1,1,1,1,0,0,0,0,1,1,1,1,1,0, …, 1)
(1,1,1,1,1,1,1,1,1,1,0,0,1,1,1,1,1,1,1,0, …, 1)

Choose a representation scheme for the experience/examples. The sensor input is represented by an n-d vector, called the feature vector, X = (x1, x2, x3, …, xn). To represent the experience, we need to know what X is, so we need a corresponding vector D, which will record our knowledge (experience) about X. The experience E is a pair of vectors: E = (X, D).

– Assuming our system is to recognise the 10 digits only, D can be a 10-d binary vector, D = (d0, d1, d2, d3, d4, d5, d6, d7, d8, d9), each component corresponding to one digit: if X is digit 5, then d5 = 1 and all others are 0; if X is digit 9, then d9 = 1 and all others are 0.
X = (1,1,0,1,1,1,1,1,1,1,0,0,0,0,1,1,1,1,1,0, …, 1); D = (0,0,0,0,0,1,0,0,0,0)
X = (1,1,1,1,1,1,1,1,1,1,0,0,1,1,1,1,1,1,1,0, …, 1); D = (0,0,0,0,0,0,0,0,1,0)
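The 10-d target vector D described above can be built with a small helper (a sketch; the function name is our own):

```python
def one_hot(digit, n_classes=10):
    # D = (d0, ..., d9): 1 at the position of the digit, 0 elsewhere
    d = [0] * n_classes
    d[digit] = 1
    return d

print(one_hot(5))  # -> [0, 0, 0, 0, 0, 1, 0, 0, 0, 0]
```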

Step 3: Choose a Representation for the Black Box
– We need to choose a function F to approximate the black box. For a given X, the value F(X) gives the classification of X. There is considerable flexibility in choosing F.

– F will be a function of some adjustable parameters, or weights, W = (w1, w2, w3, …, wN), which the learning algorithm can modify or learn; the system then computes F(W, X).

Step 4: Learning/Adjusting the Weights. We need a learning algorithm to adjust the weights so that the experience/prior knowledge E = (X, D) from the training data is learned into the system: F(W, X) = D.

The system computes F(W, X) for the input X, compares it with the target D from E = (X, D), and uses the error, Error = D - F(W, X), to adjust W.
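One way to realize this "compute the error, adjust W" loop is a delta-rule (LMS) update. The linear form of F, the toy data, and the learning rate below are assumptions for illustration, not the slides' specific method.

```python
def F(W, X):
    # assume a simple linear model for the black box
    return sum(w * x for w, x in zip(W, X))

def adjust(W, X, D, lr=0.1):
    error = D - F(W, X)                       # Error = D - F(W, X)
    return [w + lr * error * x for w, x in zip(W, X)]

W = [0.0, 0.0]
training_data = [([1, 0], 1), ([0, 1], 0), ([1, 1], 1)]
for _ in range(50):                           # repeated passes over the training set
    for X, D in training_data:
        W = adjust(W, X, D)

print(round(F(W, [1, 0]), 2))  # close to the target 1
```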

Step 5: Use/Test the System
– Once learning is completed, all parameters are fixed. When an unknown input X is presented to the system, the system computes its answer according to F(W, X).



Bayes Rule. Thomas Bayes (c. 1701 – 7 April 1761) was an English statistician, philosopher and Presbyterian minister, known for having formulated a specific case of the theorem that bears his name: Bayes' theorem. Bayes never published what would eventually become his most famous accomplishment; his notes were edited and published after his death by Richard Price.
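As a reminder of what the theorem states, P(A|B) = P(B|A)·P(A) / P(B), here is a small worked example; the sensitivity, specificity, and prevalence figures are invented.

```python
p_A = 0.01              # P(condition): 1% prevalence (invented figure)
p_B_given_A = 0.90      # P(positive test | condition)
p_B_given_not_A = 0.05  # P(positive test | no condition)

# total probability of a positive test
p_B = p_B_given_A * p_A + p_B_given_not_A * (1 - p_A)
# Bayes' theorem: posterior probability of the condition given a positive test
p_A_given_B = p_B_given_A * p_A / p_B
print(round(p_A_given_B, 3))  # -> 0.154
```

Even with a fairly accurate test, the low prior drags the posterior down, which is exactly the kind of reasoning Bayes' rule makes explicit.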

Maximum Likelihood (ML)

The Euclidean Distance Classifier
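A minimal sketch of the Euclidean distance classifier: assign each input to the class whose mean feature vector is nearest. The toy data and class names are invented.

```python
import math

def fit_means(X, y):
    # compute the mean feature vector of each class
    sums, counts = {}, {}
    for xi, yi in zip(X, y):
        s = sums.setdefault(yi, [0.0] * len(xi))
        for j, v in enumerate(xi):
            s[j] += v
        counts[yi] = counts.get(yi, 0) + 1
    return {c: [v / counts[c] for v in s] for c, s in sums.items()}

def predict(means, x):
    # nearest class mean under Euclidean distance
    return min(means, key=lambda c: math.dist(x, means[c]))

means = fit_means([(0, 0), (0, 1), (4, 4), (5, 5)], ["a", "a", "b", "b"])
print(predict(means, (1, 1)))  # -> "a"
```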

Cell structures:
– Cell body
– Dendrites
– Axon
– Synaptic terminals

Entropy (disorder, impurity) of a set of examples S, relative to a binary classification, is

Entropy(S) = -p1 log2(p1) - p0 log2(p0),

where p1 is the fraction of positive examples in S and p0 is the fraction of negatives.
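The entropy definition above can be computed directly (a sketch for 0/1 labels):

```python
import math

def entropy(labels):
    # labels: a list of 0/1 class labels making up the set S
    p1 = sum(labels) / len(labels)  # fraction of positive examples
    p0 = 1 - p1
    # by convention 0 * log2(0) = 0, hence the p > 0 guard
    return -sum(p * math.log2(p) for p in (p0, p1) if p > 0)

print(entropy([1, 1, 0, 0]))  # -> 1.0 (maximally impure)
print(entropy([1, 1, 1, 1]))  # -> 0.0 (pure)
```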

An SVM is an abstract learning machine which will learn from a training data set and attempt to generalize and make correct predictions on novel data.
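One concrete way to train such a machine is sketched below as a tiny linear SVM fit by subgradient descent on the hinge loss. The toy data, learning rate, and regularization strength are all invented, and real uses would rely on a library implementation rather than this sketch.

```python
def train_svm(X, y, lr=0.1, lam=0.01, epochs=200):
    # y labels must be in {-1, +1}
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            if margin < 1:  # point misclassified or inside the margin
                w = [wj + lr * (yi * xj - lam * wj) for wj, xj in zip(w, xi)]
                b += lr * yi
            else:           # only apply regularization shrinkage
                w = [wj - lr * lam * wj for wj in w]
    return w, b

X = [(2, 2), (3, 3), (-2, -2), (-3, -1)]
y = [1, 1, -1, -1]
w, b = train_svm(X, y)
# all training points should end up on the correct side of the boundary
print(all((sum(wj * xj for wj, xj in zip(w, x)) + b) * yi > 0
          for x, yi in zip(X, y)))
```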

Clustering: partition unlabeled examples into disjoint subsets (clusters), such that:
– Examples within a cluster are similar
– Examples in different clusters are different
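As one concrete clustering method, here is a minimal k-means sketch; k, the toy points, and the fixed initial centers are illustrative assumptions.

```python
import math

def kmeans(points, centers, iters=10):
    for _ in range(iters):
        # assignment step: each point joins its nearest center's cluster
        clusters = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)), key=lambda k: math.dist(p, centers[k]))
            clusters[i].append(p)
        # update step: move each center to the mean of its cluster
        centers = [
            tuple(sum(col) / len(cl) for col in zip(*cl)) if cl else ctr
            for cl, ctr in zip(clusters, centers)
        ]
    return centers, clusters

pts = [(0, 0), (0, 1), (9, 9), (10, 10)]
centers, clusters = kmeans(pts, centers=[(0, 0), (10, 10)])
print(centers)  # -> [(0.0, 0.5), (9.5, 9.5)]
```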


Original data

(1) Centering & whitening process
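Of the centering & whitening step, the centering part (plus per-feature unit-variance scaling) can be sketched as below. Full whitening would additionally decorrelate the features via an eigendecomposition of the covariance matrix, which is omitted here for brevity.

```python
import math

def center_and_scale(X):
    # subtract each column's mean, then divide by its standard deviation
    n, d = len(X), len(X[0])
    means = [sum(row[j] for row in X) / n for j in range(d)]
    stds = [math.sqrt(sum((row[j] - means[j]) ** 2 for row in X) / n)
            for j in range(d)]
    return [[(row[j] - means[j]) / stds[j] for j in range(d)] for row in X]

Z = center_and_scale([[1, 10], [3, 30]])
print(Z)  # each column now has mean 0 and variance 1
```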

(2) FastICA algorithm

Thank you!