Machine Learning Lecture 1


Machine Learning Lecture 1 - Dr. Alper Özpınar

Textbook
Main Textbook: Introduction to Machine Learning - Ethem Alpaydın (3rd Edition)
Supporting Materials:
Neural Networks and Learning Machines - Simon Haykin
Pattern Recognition and Machine Learning (Information Science and Statistics) - Christopher M. Bishop
Machine Learning - Tom M. Mitchell
December 28, 2018

Weekly Plan
Lecture 1: Introduction to Machine Learning; Supervised, Unsupervised, Reinforcement Learning
Lecture 2: Classification, Regression, Clustering, Overfitting, Underfitting, Decision Trees, K-means
Lecture 3: Statistical Learning, Histograms, Density Functions
Lecture 4: Bayesian Decision Theory
Lecture 5: Parametric Methods
Lecture 6: Decision Risk; Maximum A Posteriori Estimation, Maximum Likelihood Estimation, Bias-Variance
Lecture 7: Python and Machine Learning Tools
Lecture 8: Support Vector Machines, Kernel Machines
Lecture 9: Multiple Learners: Boosting, Bagging, Cascading
Lecture 10: Hidden Markov Models
Lecture 11: Introduction to Artificial Neural Networks, Multilayer Perceptrons
Lecture 12: Introduction to Artificial Neural Networks, Backpropagation
Lecture 13: Deep Learning Applications: CNN - Convolutional Neural Networks
Lecture 14: Deep Learning Applications: RNN - Recurrent Neural Networks

Artificial Intelligence and Machine Learning
Artificial intelligence is a branch of computer science that aims to create intelligent machines. It has become an essential part of the technology industry. Research associated with artificial intelligence is highly technical and specialized. The core problems of artificial intelligence include programming computers for traits such as: knowledge, reasoning, learning, problem solving, perception, planning, and the ability to manipulate and move objects.

Automation and Intelligence (diagram: Rules, "intelligence", Decisions)

Artificial Intelligence and Machine Learning

Definition of Intelligence
Intelligence is a biological capability of living organisms: the brain, the nervous system, and nerve cells. The human brain has 19-23 billion neurons in the cerebral cortex.

Number of Neurons (Total / Cerebral Cortex)
Sponge: 0 / 0
Rat: 200 Million / 18 Million
Capuchin monkey: 3.5 Billion / 650 Million
Human: 85-100 Billion / 19-23 Billion
African Elephant: 250 Billion / 11 Billion

Number of Neurons and Artificial Intelligence

A Little Bit of History
Turing Machine, Enigma Machine
Alan Turing (1912-1954)

AI Hive

AI and Machine Learning Business

How Can You Get Started


Big Data
Widespread use of personal computers and wireless communication leads to "big data". We are both producers and consumers of data. Data is not random; it has structure, e.g., customer behavior. We need "big theory" to extract that structure from data for (a) understanding the process and (b) making predictions for the future.

Why "Learn"?
Machine learning is programming computers to optimize a performance criterion using example data or past experience. There is no need to "learn" to calculate payroll. Learning is used when:
Human expertise does not exist (navigating on Mars)
Humans are unable to explain their expertise (speech recognition)
The solution changes in time (routing on a computer network)
The solution needs to be adapted to particular cases (user biometrics)

What We Talk About When We Talk About "Learning"
Learning general models from data of particular examples. Data is cheap and abundant (data warehouses, data marts); knowledge is expensive and scarce.
Example in retail: from customer transactions to consumer behavior: people who bought "Blink" also bought "Outliers" (www.amazon.com).
Build a model that is a good and useful approximation to the data.

Data Mining
Retail: Market basket analysis, customer relationship management (CRM)
Finance: Credit scoring, fraud detection
Manufacturing: Control, robotics, troubleshooting
Medicine: Medical diagnosis
Telecommunications: Spam filters, intrusion detection
Bioinformatics: Motifs, alignment
Web mining: Search engines
...

What is Machine Learning?
Optimize a performance criterion using example data or past experience.
Role of statistics: inference from a sample.
Role of computer science: efficient algorithms to (1) solve the optimization problem and (2) represent and evaluate the model for inference.

Applications
Learning Associations
Supervised Learning: Classification, Regression
Unsupervised Learning
Reinforcement Learning

Learning Associations
Basket analysis: P(Y | X) is the probability that somebody who buys X also buys Y, where X and Y are products/services.
Example: P(chips | beer) = 0.7
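The estimate above can be computed directly from transaction counts. A minimal sketch, with made-up baskets (all product names and transactions are illustrative):

```python
# Estimate P(Y | X) = (# baskets with X and Y) / (# baskets with X).
# The transactions below are invented illustration data.
transactions = [
    {"beer", "chips"},
    {"beer", "chips", "salsa"},
    {"beer", "bread"},
    {"chips"},
    {"beer", "chips", "bread"},
]

def conditional_prob(transactions, x, y):
    """P(y | x): fraction of baskets containing x that also contain y."""
    with_x = [t for t in transactions if x in t]
    if not with_x:
        return 0.0
    return sum(1 for t in with_x if y in t) / len(with_x)

print(conditional_prob(transactions, "beer", "chips"))  # 3 of 4 beer baskets contain chips -> 0.75
```

With more data, such estimates feed association rules like P(chips | beer) = 0.7 on the slide.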

Classification
Example: Credit scoring. Differentiating between low-risk and high-risk customers from their income and savings.
Discriminant: IF income > θ1 AND savings > θ2 THEN low-risk ELSE high-risk
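The discriminant on this slide is just a pair of threshold tests. A minimal sketch with hypothetical threshold values (in practice θ1 and θ2 would be learned from data):

```python
# Toy credit-scoring discriminant from the slide:
# IF income > theta1 AND savings > theta2 THEN low-risk ELSE high-risk.
# The threshold values below are hypothetical, not fitted.
THETA1 = 30_000  # income threshold (illustrative)
THETA2 = 5_000   # savings threshold (illustrative)

def credit_risk(income, savings, theta1=THETA1, theta2=THETA2):
    return "low-risk" if income > theta1 and savings > theta2 else "high-risk"

print(credit_risk(45_000, 8_000))  # low-risk: both thresholds exceeded
print(credit_risk(45_000, 2_000))  # high-risk: savings too low
```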

Classification: Applications
Also known as pattern recognition
Face recognition: Pose, lighting, occlusion (glasses, beard), make-up, hair style
Character recognition: Different handwriting styles
Speech recognition: Temporal dependency
Medical diagnosis: From symptoms to illnesses
Biometrics: Recognition/authentication using physical and/or behavioral characteristics: face, iris, signature, etc.
Outlier/novelty detection

Face Recognition
Training examples of a person vs. test images (ORL dataset, AT&T Laboratories, Cambridge UK)

Regression
Example: Price of a used car
x: car attributes; y: price
y = g(x | θ), where g() is the model and θ its parameters
Linear case: y = wx + w0
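The linear case y = wx + w0 can be fit by ordinary least squares in closed form. A sketch on invented car-age/price data (the numbers are illustrative only):

```python
# Fit y = w*x + w0 by least squares: w = cov(x, y) / var(x), w0 = mean(y) - w*mean(x).
# Car ages and prices below are made-up illustration data.
xs = [1, 2, 3, 4, 5]                  # car age in years
ys = [18.0, 16.1, 13.9, 12.2, 10.0]   # price in $1000s

n = len(xs)
mx = sum(xs) / n
my = sum(ys) / n
w = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
w0 = my - w * mx

print(f"y = {w:.2f}*x + {w0:.2f}")  # slope is negative: price falls with age
```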

Regression Applications
Navigating a car: angle of the steering wheel
Kinematics of a robot arm: joint angles α1 = g1(x, y), α2 = g2(x, y) to reach a point (x, y)
Response surface design

Supervised Learning: Uses
Prediction of future cases: Use the rule to predict the output for future inputs
Knowledge extraction: The rule is easy to understand
Compression: The rule is simpler than the data it explains
Outlier detection: Exceptions that are not covered by the rule, e.g., fraud

Unsupervised Learning
Learning "what normally happens"
No output
Clustering: Grouping similar instances
Example applications: customer segmentation in CRM, image compression (color quantization), bioinformatics (learning motifs)
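Clustering (covered later in the course with k-means) can be sketched in one dimension; the data points and initial centers below are invented for illustration:

```python
# A minimal 1-D k-means sketch: alternate assignment and center-update steps.
# Data and initial centers are made-up illustration values.
data = [1.0, 1.2, 0.8, 8.0, 8.3, 7.9]

def kmeans_1d(data, centers, iters=10):
    for _ in range(iters):
        clusters = {c: [] for c in range(len(centers))}
        for x in data:  # assignment step: nearest center
            i = min(range(len(centers)), key=lambda i: abs(x - centers[i]))
            clusters[i].append(x)
        centers = [sum(v) / len(v) if v else centers[i]  # update step: cluster means
                   for i, v in clusters.items()]
    return centers

print(kmeans_1d(data, centers=[0.0, 10.0]))  # centers near 1.0 and 8.07
```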

Reinforcement Learning
Learning a policy: a sequence of outputs
No supervised output, but delayed reward
Credit assignment problem
Examples: game playing, robot in a maze, multiple agents with partial observability, ...

Learning a Class from Examples
Class C of a "family car"
Prediction: Is car x a family car?
Knowledge extraction: What do people expect from a family car?
Output: Positive (+) and negative (–) examples
Input representation: x1: price, x2: engine power
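A family-car class of this kind can be represented as an axis-aligned rectangle over (price, engine power). A sketch with hypothetical bounds (the numbers are illustrative, not learned):

```python
# Axis-aligned rectangle hypothesis:
# h(x) = 1 (family car) if p1 <= price <= p2 AND e1 <= engine power <= e2, else 0.
# The bounds below are hypothetical, chosen for illustration.
def make_rectangle_h(p1, p2, e1, e2):
    def h(price, power):
        return 1 if (p1 <= price <= p2 and e1 <= power <= e2) else 0
    return h

h = make_rectangle_h(p1=15_000, p2=30_000, e1=90, e2=200)
print(h(22_000, 150))  # 1: inside the rectangle -> positive
print(h(80_000, 400))  # 0: outside -> negative
```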

Training set X = {x^t, r^t}, t = 1, ..., N, where r^t = 1 if x^t is a positive example and r^t = 0 if negative

Class C: (p1 ≤ price ≤ p2) AND (e1 ≤ engine power ≤ e2)

Hypothesis class H. Error of h ∈ H on the training set X: E(h | X) = Σ_t 1(h(x^t) ≠ r^t)

S, G, and the Version Space
Most specific hypothesis S; most general hypothesis G. Any h ∈ H between S and G is consistent, and these hypotheses make up the version space (Mitchell, 1997).

Margin: Choose the h with the largest margin.

VC Dimension
N points can be labeled in 2^N ways as +/–. H shatters N points if there exists h ∈ H consistent with every one of these labelings: VC(H) = N.
An axis-aligned rectangle shatters 4 points only.
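The claim that an axis-aligned rectangle shatters 4 points can be checked by brute force: for every labeling, take the bounding box of the positives as the candidate rectangle and test that it excludes all negatives. A sketch on one illustrative 4-point "diamond" configuration:

```python
from itertools import product

# Check shattering by axis-aligned rectangles: a labeling is realizable iff
# the bounding box of the positive points contains no negative point.
points = [(0, 1), (1, 0), (0, -1), (-1, 0)]  # illustrative diamond layout

def shattered(points):
    for labels in product([0, 1], repeat=len(points)):
        pos = [p for p, l in zip(points, labels) if l == 1]
        neg = [p for p, l in zip(points, labels) if l == 0]
        if not pos:
            continue  # an empty rectangle handles the all-negative labeling
        x1 = min(p[0] for p in pos); x2 = max(p[0] for p in pos)
        y1 = min(p[1] for p in pos); y2 = max(p[1] for p in pos)
        if any(x1 <= p[0] <= x2 and y1 <= p[1] <= y2 for p in neg):
            return False  # bounding box of positives swallows a negative
    return True

print(shattered(points))  # True: this 4-point set is shattered
```

Adding a 5th point inside the diamond breaks shattering: labeling the center negative and the rest positive is not realizable.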

Probably Approximately Correct (PAC) Learning
How many training examples N should we have, such that with probability at least 1 − δ, h has error at most ε? (Blumer et al., 1989)
Each strip covers probability at most ε/4.
Pr that one instance misses a strip: 1 − ε/4
Pr that N instances miss a strip: (1 − ε/4)^N
Pr that N instances miss any of the 4 strips: at most 4(1 − ε/4)^N
Require 4(1 − ε/4)^N ≤ δ; using (1 − x) ≤ exp(−x), this gives 4 exp(−εN/4) ≤ δ, so N ≥ (4/ε) log(4/δ)
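The final bound can be evaluated numerically; a sketch (using the natural logarithm, following the (1 − x) ≤ exp(−x) step; the helper name is ours):

```python
import math

# Sample-size bound from the slide: N >= (4/eps) * log(4/delta).
def pac_sample_size(eps, delta):
    return math.ceil((4 / eps) * math.log(4 / delta))

# Example: error at most 0.1 with probability at least 0.95 (delta = 0.05).
print(pac_sample_size(0.1, 0.05))  # a few hundred examples suffice
```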

Noise and Model Complexity
Use the simpler model because it is:
Simpler to use (lower computational complexity)
Easier to train (lower space complexity)
Easier to explain (more interpretable)
Better at generalizing (lower variance; Occam's razor)

Multiple Classes, Ci, i = 1, ..., K
Train K hypotheses hi(x), i = 1, ..., K, where hi(x) = 1 if x ∈ Ci and hi(x) = 0 otherwise.

Regression
Empirical error on the training set: E(g | X) = (1/N) Σ_t [r^t − g(x^t)]²

Model Selection & Generalization
Learning is an ill-posed problem; data is not sufficient to find a unique solution
The need for inductive bias: assumptions about H
Generalization: How well a model performs on new data
Overfitting: H more complex than C or f
Underfitting: H less complex than C or f

Triple Trade-Off
There is a trade-off between three factors (Dietterich, 2003): complexity of H, c(H); training set size, N; and generalization error, E, on new data.
As N increases, E decreases.
As c(H) increases, E first decreases and then increases.

Cross-Validation
To estimate generalization error, we need data unseen during training. We split the data into:
Training set (50%)
Validation set (25%)
Test (publication) set (25%)
Resampling when there is little data
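The 50/25/25 split can be sketched as a shuffled partition of example indices (the helper below is illustrative, not a standard library function):

```python
import random

# Shuffle the examples, then cut into 50% train / 25% validation / 25% test.
def split_data(items, seed=0):
    items = list(items)
    random.Random(seed).shuffle(items)  # fixed seed for reproducibility
    n = len(items)
    n_train, n_val = n // 2, n // 4
    train = items[:n_train]
    val = items[n_train:n_train + n_val]
    test = items[n_train + n_val:]
    return train, val, test

train, val, test = split_data(range(100))
print(len(train), len(val), len(test))  # 50 25 25
```

With little data, resampling schemes such as k-fold cross-validation reuse the same examples across folds instead of a single fixed split.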

Dimensions of a Supervised Learner
Model: g(x | θ)
Loss function: E(θ | X) = Σ_t L(r^t, g(x^t | θ))
Optimization procedure: θ* = arg min_θ E(θ | X)

Resources: Datasets
UCI Repository: http://www.ics.uci.edu/~mlearn/MLRepository.html
Statlib: http://lib.stat.cmu.edu/

Thank You