CS 4100 Artificial Intelligence Prof. C. Hafner Class Notes April 3, 2012.


Term Project Presentations
Thursday, April 12 Groups:
Tuesday, April 17 Groups:

Naive Bayes Classifiers: Our next example of machine learning
- A supervised learning method
- By making an independence assumption, we can explore a simple subset of Bayesian nets, such that it is easy to estimate the CPTs from sample data
- Uses a technique called "maximum likelihood estimation"
  – Given a set of correctly classified representative examples
  – Q: What estimates of the conditional probabilities maximize the likelihood of the data that was observed?
  – A: The estimates that reflect the sample proportions
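The idea above can be sketched in a few lines of Python. This is my own minimal illustration (not from the notes): maximum likelihood estimation here just means counting, so the prior P(C) and each conditional P(X = v | C) are set to the observed sample proportions. The example data at the bottom is invented for illustration.

```python
# Minimal sketch of a Naive Bayes classifier whose CPTs are estimated
# by maximum likelihood, i.e. as sample proportions. Illustrative only.
from collections import Counter, defaultdict

def fit_naive_bayes(examples):
    """examples: list of (feature_dict, class_label) pairs."""
    class_counts = Counter(label for _, label in examples)
    n = len(examples)
    # Prior: P(C = c) = (# examples of class c) / (total # examples)
    priors = {c: k / n for c, k in class_counts.items()}
    # Conditionals: P(X_i = v | C = c) = count(v, c) / count(c)
    cond_counts = defaultdict(Counter)
    for feats, label in examples:
        for attr, value in feats.items():
            cond_counts[(attr, label)][value] += 1
    def cond_prob(attr, value, label):
        return cond_counts[(attr, label)][value] / class_counts[label]
    return priors, cond_prob

def classify(feats, priors, cond_prob):
    # Pick the class maximizing P(C) * product_i P(x_i | C)
    # (the independence assumption lets us multiply the conditionals).
    best, best_score = None, -1.0
    for c, p in priors.items():
        score = p
        for attr, value in feats.items():
            score *= cond_prob(attr, value, c)
        if score > best_score:
            best, best_score = c, score
    return best

# Invented toy data: 3 of the 4 examples are Juniors.
data = [({"major": "Science"}, "Junior"),
        ({"major": "Arts"}, "Junior"),
        ({"major": "Science"}, "NonJunior"),
        ({"major": "Science"}, "Junior")]
priors, cp = fit_naive_bayes(data)
print(priors["Junior"])                  # 0.75 -- the sample proportion
print(classify({"major": "Science"}, priors, cp))  # Junior
```

Note that pure sample proportions assign probability zero to any value never seen with a class; in practice one would smooth the counts, but that is beyond what this slide covers.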

(Worked example on the slide: a table counting # Juniors and # Non-Juniors in the sample; the figures did not survive transcription.)

Class Exercise: Naive Bayes Classifier with multi-valued variables
- Major (the class variable): Science, Arts, Social Science
- Student characteristics: Gender (M, F), Race/Ethnicity (W, B, H, A), International (T/F)
- What do the conditional probability tables look like?
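As a hint for the exercise, the table sizes can be worked out mechanically. The sketch below (my own illustration, assuming Major is the class being predicted from the student characteristics) just enumerates how many entries each CPT needs: one table P(Major), plus one table P(feature | Major) per feature, each of size |feature values| x |Major values|.

```python
# Sketch: sizes of the CPTs for the class exercise, assuming Major is
# the class variable and the other three variables are features.
values = {
    "Major": ["Science", "Arts", "SocialScience"],   # class variable
    "Gender": ["M", "F"],
    "Race": ["W", "B", "H", "A"],
    "International": [True, False],
}

# P(Major) has one entry per value of Major.
prior_entries = len(values["Major"])

# P(Feature | Major) has |Feature| * |Major| entries.
cpt_entries = {f: len(v) * len(values["Major"])
               for f, v in values.items() if f != "Major"}

print(prior_entries)   # 3
print(cpt_entries)     # {'Gender': 6, 'Race': 12, 'International': 6}
```

Since each column of a CPT must sum to 1, the number of *free* parameters is smaller (e.g. P(Gender | Major) has 6 entries but only 3 free parameters), which is exactly what makes the naive model cheap to estimate.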

Perceptron Learning Algorithm and BackProp

Perceptron Learning (Supervised)
- Assign random weights (or set all to 0)
- Cycle through the input data until the change is below a target threshold
- Let α be the "learning coefficient"
- For each input:
  – If the perceptron gives the correct answer, do nothing
  – If the perceptron says yes when the answer should be no, decrease the weights on all units that "fired" by α
  – If the perceptron says no when the answer should be yes, increase the weights on all units that "fired" by α
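The update rule above can be sketched as a short program. This is a hedged illustration, not the exact algorithm from the slide: it assumes a threshold perceptron over 0/1 inputs, starts with all weights 0 (one of the two options on the slide), and additionally treats the bias as a weight on an always-on input, which the slide does not mention.

```python
# Sketch of the perceptron learning rule: adjust only the weights of
# units that "fired", by +alpha or -alpha depending on the error.
def train_perceptron(data, alpha=0.1, epochs=100):
    """data: list of (inputs, target) with inputs a tuple of 0/1, target 0 or 1."""
    n = len(data[0][0])
    w = [0.0] * n            # start with all weights 0 (as in the slide)
    bias = 0.0               # extra assumption: bias as an always-on weight
    for _ in range(epochs):
        changed = False
        for x, target in data:
            fired = 1 if sum(wi * xi for wi, xi in zip(w, x)) + bias > 0 else 0
            if fired == target:
                continue                 # correct answer: do nothing
            changed = True
            # says no when answer is yes -> increase weights of fired units;
            # says yes when answer is no -> decrease them.
            delta = alpha if target == 1 else -alpha
            for i in range(n):
                if x[i]:                 # only units that "fired" change
                    w[i] += delta
            bias += delta
        if not changed:                  # no change this pass: converged
            break
    return w, bias

# Learn logical OR (linearly separable, so the rule converges).
or_data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w, b = train_perceptron(or_data)
print([1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else 0
       for x, _ in or_data])            # [0, 1, 1, 1]
```

On a non-separable target such as XOR the loop would simply exhaust its epochs without converging, which is the limitation that motivates the multi-layer networks and backpropagation mentioned in the slide title.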