Mehdi Ghayoumi, MSB rm 132. Office hours: Thursday 11-12 AM. Machine Learning.

Presentation transcript:


THE NAÏVE BAYES CLASSIFIER. In the naïve Bayes classification scheme, the required estimate of the pdf at a point x = [x(1), ..., x(l)]^T ∈ R^l is given as p(x|ω_i) = ∏_{j=1}^{l} p(x(j)|ω_i). That is, the components of the feature vector x are assumed to be statistically independent.
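The product form above can be sketched directly. This is a minimal illustration assuming Gaussian marginals per feature; the function name `naive_bayes_pdf` is my own, not from the slides:

```python
import numpy as np

def naive_bayes_pdf(x, means, variances):
    """Naive Bayes estimate of p(x | class): the product of univariate
    Gaussian pdfs, one per feature, reflecting the independence assumption."""
    coef = 1.0 / np.sqrt(2 * np.pi * variances)
    exponent = -((x - means) ** 2) / (2 * variances)
    return np.prod(coef * np.exp(exponent))
```

With l features, the l-dimensional density is just the product of l one-dimensional densities, which is far cheaper to estimate than a full joint pdf.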

Machine Learning Example. Generate a set X1 that consists of N1 = 50 five-dimensional data vectors stemming from two classes, ω1 and ω2. The classes are modeled by Gaussian distributions with means m1 = [0,0,0,0,0]^T and m2 = [1,1,1,1,1]^T and respective covariance matrices S1 and S2.
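Generating such a set might look like the sketch below. The covariance matrices S1 and S2 were not preserved from the original slide, so identity matrices stand in as placeholders, and the even 25/25 split between classes is an assumption:

```python
import numpy as np

rng = np.random.default_rng(0)
l, N1 = 5, 50
m1, m2 = np.zeros(l), np.ones(l)
S1 = S2 = np.eye(l)  # placeholder covariances; the slide's S1, S2 are not shown here

# Draw half the vectors from each class and stack them into X1.
X1 = np.vstack([rng.multivariate_normal(m1, S1, N1 // 2),
                rng.multivariate_normal(m2, S2, N1 - N1 // 2)])
y1 = np.array([0] * (N1 // 2) + [1] * (N1 - N1 // 2))  # class labels
```

A test set X2 would be generated the same way with fresh draws.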

Machine Learning Step 1. Classify the points of the test set X2 using the naive Bayes classifier, where for a given x, p(x|ω_i) is estimated as the product ∏_{j=1}^{l} p(x(j)|ω_i), with each marginal p(x(j)|ω_i) a univariate Gaussian whose mean and variance are estimated from X1.
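A sketch of this step, assuming equal class priors; the helper names `fit_naive_bayes` and `predict_naive_bayes` are illustrative, not from the slides:

```python
import numpy as np

def fit_naive_bayes(X, y):
    """Per-class, per-feature ML estimates of mean and variance."""
    params = {}
    for c in np.unique(y):
        Xc = X[y == c]
        params[c] = (Xc.mean(axis=0), Xc.var(axis=0))
    return params

def predict_naive_bayes(X, params):
    """Assign each point to the class maximizing the summed log of
    univariate Gaussian likelihoods (equal priors assumed)."""
    def log_lik(x, mu, var):
        return np.sum(-0.5 * np.log(2 * np.pi * var) - (x - mu) ** 2 / (2 * var))
    classes = sorted(params)
    return np.array([max(classes, key=lambda c: log_lik(x, *params[c])) for x in X])
```

Working in log space avoids underflow when the per-feature likelihoods are multiplied.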

Machine Learning Step 2. Compute the ML estimates of m1, m2, S1, and S2 using X1. Employ the ML estimates in the Bayesian classifier
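For step 2, the ML estimates and the resulting Bayesian classifier can be sketched as follows, again assuming equal priors; the function names are illustrative:

```python
import numpy as np

def fit_gaussian_ml(X, y):
    """ML estimates of class means m_i and full covariance matrices S_i."""
    est = {}
    for c in np.unique(y):
        Xc = X[y == c]
        m = Xc.mean(axis=0)
        S = (Xc - m).T @ (Xc - m) / len(Xc)  # ML (biased) covariance estimate
        est[c] = (m, S)
    return est

def predict_bayes(X, est):
    """Bayesian classifier with full multivariate Gaussian class
    conditionals; constant terms and equal priors cancel."""
    def log_lik(x, m, S):
        d = x - m
        return -0.5 * (np.log(np.linalg.det(S)) + d @ np.linalg.solve(S, d))
    classes = sorted(est)
    return np.array([max(classes, key=lambda c: log_lik(x, *est[c])) for x in X])
```

The only difference from step 1 is that the full covariance matrices are estimated instead of per-feature variances.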

Machine Learning Step 3. Compare the results obtained in steps 1 and 2. The resulting classification errors, naive_error and Bayes_ML_error, show that the naive classification scheme outperforms the standard ML-based scheme.

Machine Learning The techniques that are built around the optimal Bayesian classifier rely on the estimation of the pdf functions describing the data distribution in each class.

Machine Learning The focus is on the direct design of a discriminant function/decision surface that separates the classes in some optimal sense, according to an adopted criterion.

Machine Learning Machine learning involves adaptive mechanisms that enable computers to learn from experience, learn by example and learn by analogy. Learning capabilities can improve the performance of an intelligent system over time. The most popular approaches to machine learning are artificial neural networks and genetic algorithms. This lecture is dedicated to neural networks.

Cell structures: cell body, dendrites, axon, synaptic terminals. Machine Learning

Networks of processing units (neurons) with connections (synapses) between them. Large number of neurons: ~10^10. Large connectivity: each neuron connects to ~10^5 others. Parallel processing. Distributed computation/memory. Robust to noise and failures. Machine Learning

Understanding the Brain. Levels of analysis (Marr, 1982): 1. Computational theory 2. Representation and algorithm 3. Hardware implementation. Reverse engineering: from hardware to theory. Parallel processing: SIMD vs. MIMD. Machine Learning

Real Neural Learning Synapses change size and strength with experience. Hebbian learning: When two connected neurons are firing at the same time, the strength of the synapse between them increases. “Neurons that fire together, wire together.” Machine Learning
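Hebb's rule can be written as a one-line weight update. This is a minimal sketch; the function name and the learning rate `eta` are my own choices, not from the slides:

```python
import numpy as np

def hebbian_update(w, x, y, eta=0.1):
    """Hebb's rule: strengthen each weight in proportion to the
    correlation of pre-synaptic activity x and post-synaptic activity y."""
    return w + eta * y * x
```

When x and y fire together (both positive), the connecting weight grows, which is exactly the "fire together, wire together" principle.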

Neural Network Learning. Learning approach based on modeling adaptation in biological neural systems. Perceptron: initial algorithm for learning simple neural networks (single layer), developed in the 1950s. Backpropagation: more complex algorithm for learning multi-layer neural networks, developed in the 1980s. Machine Learning

Perceptron Learning Algorithm. First neural network learning model, from the 1950s. Simple and limited (single-layer models). Basic concepts are similar for multi-layer models, so this is a good learning tool. Still used in many current applications. Machine Learning

Perceptron Node – Threshold Logic Unit. [Diagram: inputs x1, x2, ..., xn with weights w1, w2, ..., wn feeding a threshold unit that outputs z.] Machine Learning

[Same threshold-unit diagram: inputs x1, ..., xn, weights w1, ..., wn, output z.] Learn weights such that an objective function is maximized. What objective function should we use? What learning algorithm should we use? Machine Learning
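The threshold logic unit itself is a few lines of code. A minimal sketch, assuming a firing threshold theta of 0 as in the worked examples that follow:

```python
import numpy as np

def tlu(x, w, theta=0.0):
    """Threshold logic unit: fire (output 1) iff the weighted
    sum of the inputs reaches the threshold theta."""
    return 1 if np.dot(w, x) >= theta else 0
```

Everything the perceptron learning algorithm does is adjust the weights w so that this unit fires correctly on the training set.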

Perceptron Learning Algorithm. [Diagram: two-input perceptron with inputs x1, x2 and output z, alongside a training table with columns x1, x2, t.] Machine Learning

First Training Instance. Input (.8, .3): net = .8*.4 + .3*(-.2) = .26, so z = 1. Machine Learning

Second Training Instance. Input (.4, .1): net = .4*.4 + .1*(-.2) = .14, so z = 1. Machine Learning
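The two worked instances can be reproduced in a few lines, using the initial weights (.4, -.2) from the slides. The target values t from the slide's training table were not preserved, so the weight update is shown as a parameterized rule rather than applied:

```python
import numpy as np

w = np.array([0.4, -0.2])                 # initial weights from the slides
for x in (np.array([0.8, 0.3]), np.array([0.4, 0.1])):
    net = np.dot(w, x)
    z = 1 if net >= 0 else 0              # threshold at 0
    print(f"net = {net:.2f} -> z = {z}")  # nets are .26 and .14; both fire

def perceptron_update(w, x, t, z, c=0.1):
    """Classic perceptron rule: w <- w + c*(t - z)*x, with learning rate c.
    The slide's targets t are not shown here, so t is left as a parameter."""
    return w + c * (t - z) * x
```

When z matches t the update is zero; otherwise the weights move in the direction that corrects the output.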

Thank you!