INTRODUCTION TO Machine Learning


Lecture Slides for INTRODUCTION TO Machine Learning, ETHEM ALPAYDIN © The MIT Press, 2004. alpaydin@boun.edu.tr, http://www.cmpe.boun.edu.tr/~ethem/i2ml

CHAPTER 1: Introduction

What Is Machine Learning? Machine learning is programming computers to optimize a performance criterion using example data or past experience.

Why do we need Machine Learning? With advances in computer technology, we currently have the ability to store and process large amounts of data, as well as to access it from physically distinct locations over a computer network. Google => special mail list.

Data Mining. The application of machine learning methods to large databases is called data mining. Finance: banks analyze credit applications; fraud detection; the stock market. Manufacturing: optimization, control, troubleshooting. Also medicine, telecommunications, and the World Wide Web.

Artificial Intelligence. Machine learning is not just a database problem; it is also a part of artificial intelligence. To be intelligent, a system that is in a changing environment should have the ability to learn. If the system can learn and adapt to such changes, the system designer need not foresee and provide solutions for all possible situations.

A.I. (Artificial Intelligence). Director: Steven Spielberg. David is an experimental robot boy, the first robot able to give genuine love, adopted as an experiment by an employee of an electronics company and his wife. A robot with true love is accepted by neither humans nor other robots, so David, taking his toy bear Teddy with him, sets out on a journey to find his own place. He finally discovers that between humans and machines there is an unbridgeable gulf, yet also common ground. Gradually David comes to be treated as their own child and is loved, but after a series of accidents he can no longer continue his warm family life...

Pattern Recognition. Recognizing faces: this is a task we do effortlessly; every day we recognize family members and friends by looking at their faces or at their photographs, despite differences in pose, lighting, hair style, and so forth. But we do it unconsciously and are unable to explain how we do it. Because we cannot explain our expertise, we cannot write the corresponding computer program.

Examples of Machine Learning Applications: learning associations; classification (supervised); regression (supervised); unsupervised learning; reinforcement learning.

Learning Associations. Association rules, e.g. P(chips | beer) = 0.7: 70 percent of customers who buy beer also buy chips.

Classification. Classification (discrimination): e.g. savings vs. income => low-risk / high-risk. Prediction. Pattern recognition: optical character recognition, face recognition, medical diagnosis, speech recognition. Knowledge extraction. Compression. Outlier detection: e.g. fraud.

Regression: supervised learning.

Supervised Learning. In supervised learning, the aim is to learn a mapping from the input to an output whose correct values are provided by a supervisor.

Unsupervised Learning. In unsupervised learning, there is no such supervisor and we only have input data. The aim is to find the regularities in the input.

Unsupervised Learning. Density estimation (in statistics). Clustering: image compression; bioinformatics (DNA => RNA => protein; the DNA in our genome is the “blueprint of life” and is a sequence of the bases A, G, C, and T; motif discovery).

Reinforcement Learning. In such a case, a single action is not important on its own; what is important is the policy, the sequence of correct actions needed to reach the goal. Example: game playing.

Why “Learn”? Machine learning is programming computers to optimize a performance criterion using example data or past experience. There is no need to “learn” to calculate payroll. Learning is used when: human expertise does not exist (navigating on Mars); humans are unable to explain their expertise (speech recognition); the solution changes over time (routing on a computer network); the solution needs to be adapted to particular cases (user biometrics).

What We Talk About When We Talk About “Learning”. Learning general models from data of particular examples. Data is cheap and abundant (data warehouses, data marts); knowledge is expensive and scarce. Example in retail, from customer transactions to consumer behavior: people who bought “Da Vinci Code” also bought “The Five People You Meet in Heaven” (www.amazon.com). Build a model that is a good and useful approximation to the data.

Data Mining. Retail: market basket analysis, customer relationship management (CRM). Finance: credit scoring, fraud detection. Manufacturing: optimization, troubleshooting. Medicine: medical diagnosis. Telecommunications: quality of service optimization. Bioinformatics: motifs, alignment. Web mining: search engines. ...

What is Machine Learning? Optimize a performance criterion using example data or past experience. Role of statistics: inference from a sample. Role of computer science: efficient algorithms to solve the optimization problem and to represent and evaluate the model for inference.

Learning. Definition: a computer program is said to learn from experience E with respect to some class of tasks T and performance measure P, if its performance at tasks in T, as measured by P, improves with experience E.

Learning Problem. Learning: improving with experience at some task. Improve over task T, with respect to performance measure P, based on experience E. Example: learn to play checkers. T: play checkers. P: percentage of games won in a tournament. E: opportunity to play against itself.

Learning to play checkers. T: play checkers. P: percentage of games won. What experience? What exactly should be learned? How shall it be represented? What specific algorithm to learn it?

Type of Training Experience. Direct or indirect? Direct: board state -> correct move. Indirect: outcome of a complete game (credit assignment problem). Teacher or not? Teacher selects board states, or the learner can select board states. Is the training experience representative of the performance goal? Training: playing against itself; performance evaluated: playing against the world champion.

Choose Target Function. ChooseMove: B -> M (board state -> move): maps a legal board state to a legal move. Evaluate: B -> V (board state -> board value): assigns a numerical score to any given board state, such that better board states obtain a higher score. Select the best move by evaluating all successor states of legal moves and picking the one with the maximal score.

Possible Definition of Target Function V. If b is a final board state that is won, then V(b) = 100. If b is a final board state that is lost, then V(b) = -100. If b is a final board state that is drawn, then V(b) = 0. If b is not a final board state, then V(b) = V(b'), where b' is the best final board state that can be achieved starting from b and playing optimally until the end of the game. This gives correct values but is not operational.

State Space Search. V(b) = ? For the player to move, V(b) = max_i V(b_i), where each legal move m_i leads from b to successor state b_i (e.g. m2: b -> b2, m3: b -> b3).

State Space Search. V(b1) = ? For the opponent to move, V(b1) = min_i V(b_i), taken over the successor states b_i of b1 (e.g. m5: b1 -> b5, m6: b1 -> b6).

Final Board States. Black wins: V(b) = -100. Red wins: V(b) = 100. Draw: V(b) = 0.
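
A minimal Python sketch of the recursive evaluation described on the slides above (max over own moves, min over opponent moves, ±100/0 at final states). The game itself is not specified here; successors, is_final and final_value are hypothetical callables supplied by the caller, and the toy usage at the bottom is purely illustrative.

```python
def V(board, successors, is_final, final_value, maximizing=True):
    """Recursive minimax value of a board under optimal play."""
    if is_final(board):
        return final_value(board)          # +100 win, -100 loss, 0 draw
    children = [V(b, successors, is_final, final_value, not maximizing)
                for b in successors(board)]
    return max(children) if maximizing else min(children)

# Toy usage on a trivial "game": states are integers, final once >= 3.
print(V(0,
        successors=lambda b: [b * 2 + 1, b * 2 + 2],
        is_final=lambda b: b >= 3,
        final_value=lambda b: 100 if b % 2 else -100))
```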

Depth-First Search.

Breadth-First Search.
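
The two slides above name the search strategies without showing code. A generic sketch, assuming a hypothetical successors(state) function: the only difference between the two strategies is whether the frontier is used as a queue (breadth-first) or a stack (depth-first).

```python
from collections import deque

def search(start, successors, is_goal, breadth_first=True):
    """Uninformed graph search; BFS with a queue, DFS with a stack."""
    frontier = deque([start])
    visited = {start}
    while frontier:
        state = frontier.popleft() if breadth_first else frontier.pop()
        if is_goal(state):
            return state
        for nxt in successors(state):
            if nxt not in visited:
                visited.add(nxt)
                frontier.append(nxt)
    return None

# Illustrative usage on an implicit graph where successors(n) = [2n, 2n+1].
print(search(1, lambda n: [2 * n, 2 * n + 1] if n < 8 else [],
             lambda n: n == 6))
```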

Number of Board States. Tic-Tac-Toe: #board states = 1 + 9 + ... + 9!/(2!·4!·3!) + ... ≈ 6045. 4x4 checkers (no queens): #board states < 8·7·6·5·2²/(2!·2!) = 1680. Regular checkers (8x8 board, 8 pieces each): #board states < 32!·2^16/(8!·8!·16!) ≈ 5.07×10^17.
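
A quick sanity check of the upper-bound arithmetic quoted above, using the same formulas as the slide:

```python
from math import factorial

small = 8 * 7 * 6 * 5 * 2**2 // (factorial(2) * factorial(2))
big = factorial(32) * 2**16 // (factorial(8) * factorial(8) * factorial(16))
print(small)          # 1680
print(f"{big:.3e}")   # roughly 5.07e+17
```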

Choose Representation of Target Function. Candidates: table look-up; collection of rules; neural networks; polynomial function of board features. Trade-off in choosing an expressive representation: approximation accuracy vs. the number of training examples needed to learn the target function.

Design Choices. Determine the type of training experience: games against experts, games against self, a table of correct moves. Determine the target function: Board -> Move, or Board -> Value. Determine the representation of the learned function: polynomial, linear function of six features, artificial neural network. Determine the learning algorithm: gradient descent, linear programming.
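
A minimal sketch of one representation/algorithm pair from this list: a linear function of board features trained by a gradient-descent-style (LMS) weight update, in the spirit of Mitchell's checkers example. The feature values and learning rate below are illustrative assumptions, not values from the slides.

```python
def v_hat(weights, features):
    """Linear evaluation: V_hat(b) = w0 + sum_i w_i * x_i(b)."""
    return weights[0] + sum(w * x for w, x in zip(weights[1:], features))

def lms_update(weights, features, v_train, lr=0.01):
    """One LMS step: w_i <- w_i + lr * (V_train(b) - V_hat(b)) * x_i."""
    error = v_train - v_hat(weights, features)
    weights[0] += lr * error              # bias term uses x_0 = 1
    for i, x in enumerate(features, start=1):
        weights[i] += lr * error * x
    return weights

weights = [0.0] * 7                       # w0 plus six feature weights
features = [12, 11, 0, 1, 3, 2]           # hypothetical board features
weights = lms_update(weights, features, v_train=100.0)
print(weights)
```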

Distribution of Applicants. [figure: distributions of good customers and bad customers; Cw = 38] Assume we want to minimize classification error: what is the optimal decision boundary?

Distribution of Accepted Customers. [figure: distributions of good customers and bad customers among accepted applicants; Cw = 43] What is the optimal decision boundary?

Target Function. Customer record: income, owns house, credit history, age, employed, accept.
$40000, yes, good, 38, full-time, yes
$25000, no, excellent, 25, part-time, no
$50000, no, poor, 55, unemployed, no
Possible target functions: T: customer data -> accept/reject; T: customer data -> probability of being a good customer; T: customer data -> expected utility/profit.

Learning methods. Decision rules: IF income < $30,000 THEN reject. Bayesian network: P(good | income, credit history, ...). Neural network. Nearest neighbor: make the same decision as for the customer in the database who is most similar to the applicant.
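
A minimal nearest-neighbor sketch for the credit example above. The feature encoding and distance are illustrative assumptions (numeric fields only, unscaled Euclidean distance, 1-NN), not the slide's prescribed method.

```python
import math

database = [
    # ((income, age), accept?)
    ((40000, 38), True),
    ((25000, 25), False),
    ((50000, 55), False),
]

def nearest_neighbor_decision(applicant):
    """Return the decision of the most similar stored customer."""
    _, decision = min(database,
                      key=lambda rec: math.dist(rec[0], applicant))
    return decision

print(nearest_neighbor_decision((42000, 40)))   # True: closest record accepted
```

In practice the features would be scaled so that income does not dominate the distance.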

Learning Problem Examples: Obstacle-Avoidance Behavior of a Mobile Robot. Task T: navigate the robot safely through an environment. Performance measure P: ? Experience E: ? Target function: ?

Performance Measure P. P: maximize time until collision with an obstacle. P: maximize distance travelled until collision with an obstacle. P: minimize rotational velocity, maximize translational velocity. P: minimize the error between the control action of a human operator and the robot controller in the same situation.

Training Experience E. Direct: monitor a human operator and use her control actions as training data: E = { <perception_i, action_i> }. Indirect: operate the robot in the real world or in a simulation; reward desirable states, penalize undesirable states, e.g. V(b) = +1 if v > 0.5 m/s; V(b) = +2 if ω < 10 deg/s; V(b) = -100 if bumper state = 1. Question: internal or external reward?

Target Function. Choose action, A: perception -> action; sonar readings s1(t)...sn(t) -> <v, ω>. Evaluate perception/state, V: s1(t)...sn(t) -> V(s1(t)...sn(t)). Problem: states are only partially observable, therefore the world seems non-deterministic. Markov decision process: the successor state s(t+1) is a probabilistic function of the current state s(t) and action a(t). Evaluate state/action pairs, V: s1(t)...sn(t), a(t) -> V(s1(t)...sn(t), a(t)).

Learning Methods. Neural networks: require direct training experience. Reinforcement learning: works with indirect training experience. Evolutionary algorithms.

Evolutionary Algorithms. [figure: a population of genotypes (bit strings such as 10011, 01001) is mapped by a coding scheme into phenotype space, where fitness f is evaluated; selection, recombination, and mutation then produce the next population.]
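
A toy Python sketch of that selection/recombination/mutation loop on bit strings. The fitness function (count of 1-bits) and all parameter values are illustrative assumptions, not taken from the slide.

```python
import random

def fitness(genotype):
    return sum(genotype)                  # toy fitness: number of 1-bits

def evolve(pop_size=20, length=5, generations=30, p_mut=0.05):
    pop = [[random.randint(0, 1) for _ in range(length)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]    # selection: keep the fitter half
        children = []
        while len(children) < pop_size - len(parents):
            a, b = random.sample(parents, 2)
            cut = random.randrange(1, length)            # one-point crossover
            child = a[:cut] + b[cut:]
            child = [bit ^ (random.random() < p_mut) for bit in child]  # mutation
            children.append(child)
        pop = parents + children
    return max(pop, key=fitness)

print(evolve())                           # typically converges to all 1s
```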

Applications. Association. Supervised learning: classification, regression. Unsupervised learning. Reinforcement learning.

Learning Associations. Basket analysis: P(Y | X) is the probability that somebody who buys X also buys Y, where X and Y are products/services. Example: P(chips | beer) = 0.7.
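
A minimal sketch of estimating such a conditional probability from transaction data. The basket data below is made up for illustration only.

```python
baskets = [
    {"beer", "chips"},
    {"beer", "chips", "salsa"},
    {"beer"},
    {"chips", "cola"},
    {"beer", "chips"},
]

def conditional_prob(y, x, baskets):
    """Estimate P(y | x) as the fraction of x-baskets that also contain y."""
    with_x = [b for b in baskets if x in b]
    return sum(y in b for b in with_x) / len(with_x)

print(conditional_prob("chips", "beer", baskets))   # 3/4 = 0.75
```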

Classification. Example: credit scoring. Differentiating between low-risk and high-risk customers from their income and savings. Discriminant: IF income > θ1 AND savings > θ2 THEN low-risk ELSE high-risk.
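
The slide's discriminant written as code. In a learned classifier the thresholds θ1 and θ2 would be fitted from data; the values below are illustrative assumptions.

```python
THETA1 = 30000   # income threshold (assumed, not from the slide)
THETA2 = 10000   # savings threshold (assumed, not from the slide)

def credit_risk(income, savings):
    """Apply the two-threshold discriminant from the slide."""
    if income > THETA1 and savings > THETA2:
        return "low-risk"
    return "high-risk"

print(credit_risk(42000, 15000))   # low-risk
```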

Classification: Applications (aka pattern recognition). Face recognition: pose, lighting, occlusion (glasses, beard), make-up, hair style. Character recognition: different handwriting styles. Speech recognition: temporal dependency; use of a dictionary or the syntax of the language. Sensor fusion: combining multiple modalities, e.g. visual (lip image) and acoustic, for speech. Medical diagnosis: from symptoms to illnesses. ...

Face Recognition. [figure: training examples of a person and test images; AT&T Laboratories, Cambridge UK, http://www.uk.research.att.com/facedatabase.html]

Regression. Example: price of a used car. x: car attributes; y: price. y = g(x | θ), where g(·) is the model and θ its parameters; for a linear model, y = w·x + w0.
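
A minimal least-squares sketch for the linear model y = w·x + w0. The data points are synthetic and purely illustrative (e.g. car age vs. price), not from the slides.

```python
xs = [1, 3, 5, 8, 10]             # e.g. car age in years (made up)
ys = [9.0, 7.6, 6.1, 4.0, 2.9]    # e.g. price in $1000s (made up)

n = len(xs)
mean_x = sum(xs) / n
mean_y = sum(ys) / n
w = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys)) / \
    sum((x - mean_x) ** 2 for x in xs)
w0 = mean_y - w * mean_x
print(f"y = {w:.2f} * x + {w0:.2f}")
```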

Regression Applications. Navigating a car: angle of the steering wheel (CMU NavLab). Kinematics of a robot arm: given the end-effector position (x, y), find the joint angles α1 = g1(x, y) and α2 = g2(x, y). Response surface design.

Supervised Learning: Uses. Prediction of future cases: use the rule to predict the output for future inputs. Knowledge extraction: the rule is easy to understand. Compression: the rule is simpler than the data it explains. Outlier detection: exceptions that are not covered by the rule, e.g. fraud.

Unsupervised Learning. Learning “what normally happens”; no output. Clustering: grouping similar instances. Example applications: customer segmentation in CRM; image compression (color quantization); bioinformatics (learning motifs).
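
A tiny k-means sketch; k-means is not named on the slide, but it is one standard way to do the clustering described above. One-dimensional toy data, k = 2.

```python
import random

def kmeans(points, k=2, iters=20):
    """Alternate between assigning points to the nearest center
    and recomputing each center as its cluster mean."""
    centers = random.sample(points, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:
            i = min(range(k), key=lambda j: abs(p - centers[j]))
            clusters[i].append(p)
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers

print(kmeans([1.0, 1.2, 0.8, 5.0, 5.3, 4.9]))   # two cluster centers
```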

Reinforcement Learning. Learning a policy: a sequence of outputs. No supervised output, but delayed reward (credit assignment problem). Examples: game playing, a robot in a maze. Complications: multiple agents, partial observability, ...
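
A minimal tabular Q-learning sketch; Q-learning is not named on the slide, but it is one standard way to learn a policy from delayed reward. The environment is a toy corridor of five states with reward only at the right end; all parameters are illustrative.

```python
import random

n_states, actions = 5, [-1, +1]           # move left / move right
Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
alpha, gamma, epsilon = 0.5, 0.9, 0.1

for _ in range(500):                      # episodes
    s = 0
    while s != n_states - 1:
        # epsilon-greedy action selection
        if random.random() < epsilon:
            a = random.choice(actions)
        else:
            a = max(actions, key=lambda act: Q[(s, act)])
        s2 = min(max(s + a, 0), n_states - 1)
        r = 1.0 if s2 == n_states - 1 else 0.0      # delayed reward at the goal
        best_next = max(Q[(s2, b)] for b in actions)
        Q[(s, a)] += alpha * (r + gamma * best_next - Q[(s, a)])
        s = s2

policy = [max(actions, key=lambda act: Q[(s, act)]) for s in range(n_states)]
print(policy)   # expect mostly +1, i.e. always move toward the goal
```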

Scout Robots. 16 sonar sensors, laser range scanner, odometry, differential drive, simulator, API in C.

LEGO Mindstorms. Touch sensor, light sensor, rotation sensor, video cam, motors.

Issues in Machine Learning. What algorithms can approximate functions well, and when? How does the number of training examples influence accuracy? How does the complexity of the hypothesis representation impact it? How does noisy data influence accuracy? What are the theoretical limits of learnability?

Machine vs. Robot Learning.
Robot learning: embedded learning; data distribution not homogeneous; mostly on-line; qualitative and sparse feedback; time is crucial; hardware is a priority; empirical proof.
Machine learning: learning in a vacuum; statistically well-behaved data; mostly off-line; informative feedback; computational time not an issue; hardware does not matter; convergence proof.

Learning in Robotics. Behavioral adaptation: adjust the parameters of individual behaviors according to some direct feedback signal (e.g. adaptive control). Evolutionary adaptation: application of artificial evolution to robotic systems. Sensor adaptation: adapt the perceptual system to the environment (e.g. classification of different contexts, recognition). Learning complex, deliberative behaviors: unsupervised learning based on sparse feedback from the environment; credit assignment problem (e.g. reinforcement learning).

Resources: Datasets. UCI Repository: http://www.ics.uci.edu/~mlearn/MLRepository.html ; UCI KDD Archive: http://kdd.ics.uci.edu/summary.data.application.html ; Statlib: http://lib.stat.cmu.edu/ ; Delve: http://www.cs.utoronto.ca/~delve/

Resources: Journals. Journal of Machine Learning Research (www.jmlr.org); Neural Computation; Neural Networks; IEEE Transactions on Neural Networks; IEEE Transactions on Pattern Analysis and Machine Intelligence; Annals of Statistics; Journal of the American Statistical Association; ...

Resources: Conferences. International Conference on Machine Learning (ICML), ICML05: http://icml.ais.fraunhofer.de/ ; European Conference on Machine Learning (ECML), ECML05: http://ecmlpkdd05.liacc.up.pt/ ; Neural Information Processing Systems (NIPS), NIPS05: http://nips.cc/ ; Uncertainty in Artificial Intelligence (UAI), UAI05: http://www.cs.toronto.edu/uai2005/ ; Computational Learning Theory (COLT), COLT05: http://learningtheory.org/colt2005/ ; International Joint Conference on Artificial Intelligence (IJCAI), IJCAI05: http://ijcai05.csd.abdn.ac.uk/ ; International Conference on Neural Networks (Europe), ICANN05: http://www.ibspan.waw.pl/ICANN-2005/ ; ...