Bayes Net Inference/Learning Lab

Causal ordering

Which of the following Bayes Nets exhibits a causal ordering of the variables?

Both graphs contain the variables: Battery dead, No oil, No gas, Lights work, Gas gauge, Oil light, Car won't start.

[Graph A and Graph B: edge structures shown in the original slide figure]

Enumeration

Compute the following probabilities for the Bayes Net below (the chain A -> B -> C), using enumeration:

P(A):      +a: .6    -a: .4
P(B|A):    +a+b: .6  +a-b: .4  -a+b: .7  -a-b: .3
P(C|B):    +b+c: .8  +b-c: .2  -b+c: .3  -b-c: .7

Compute:
  P(+c, -b, -a)
  P(+c, -b)
  P(-a | +c, -b)
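As a sketch of how these three queries decompose (assuming the chain structure A -> B -> C implied by the CPTs): the first is a single entry of the full joint, the second sums the joint over the hidden variable A, and the third normalizes by the evidence probability.

```python
# Inference by enumeration for the chain A -> B -> C.
# CPT values are taken from the tables on the slide above.

P_A = {'+a': 0.6, '-a': 0.4}
P_B_given_A = {('+a', '+b'): 0.6, ('+a', '-b'): 0.4,
               ('-a', '+b'): 0.7, ('-a', '-b'): 0.3}
P_C_given_B = {('+b', '+c'): 0.8, ('+b', '-c'): 0.2,
               ('-b', '+c'): 0.3, ('-b', '-c'): 0.7}

def joint(a, b, c):
    """Full joint P(a, b, c) via the chain factorization P(a)P(b|a)P(c|b)."""
    return P_A[a] * P_B_given_A[(a, b)] * P_C_given_B[(b, c)]

# P(+c, -b, -a): a single joint entry.
p_cba = joint('-a', '-b', '+c')

# P(+c, -b): sum the joint over the hidden variable A.
p_cb = sum(joint(a, '-b', '+c') for a in P_A)

# P(-a | +c, -b): normalize by the evidence probability.
p_a_given_cb = p_cba / p_cb
```

The same pattern (multiply CPT entries along the factorization, sum out hidden variables, normalize for conditionals) is what "enumeration" means in general; it just becomes expensive as the number of hidden variables grows.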

Gibbs Sampling

Query of interest: P(-b | +d). Initial random sample: {-a, -b, +c, +d}

Network: A and B are parents of C; C is the parent of D.

P(+a) = .6
P(+b) = .2
P(+c | A, B):   +a+b: .5   +a-b: .6   -a+b: .2   -a-b: .3
P(+d | C):      +c: .7     -c: .6

Step 1: Variable chosen: A, r = 0.5. Sample generated: ______
Step 2: Variable chosen: D, r = 0.6. Sample generated: ______
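A Gibbs step resamples one variable from its distribution conditioned on its Markov blanket. As a sketch for Step 1 (resampling A, with the current sample {-a, -b, +c, +d}): A's Markov blanket here is {B, C}, so P(A | -b, +c) is proportional to P(A) * P(+c | A, -b). The convention assumed below (not stated on the slide) is that the drawn number r selects +a when r < P(+a | Markov blanket).

```python
# One Gibbs step for variable A, given the current sample (-a, -b, +c, +d).
# CPT values are from the slide; only the entries needed for b = -b, c = +c
# are listed.

P_a = {'+a': 0.6, '-a': 0.4}
P_c_given_ab = {('+a', '-b'): 0.6, ('-a', '-b'): 0.3}

# Unnormalized P(A | -b, +c) ∝ P(A) * P(+c | A, -b)
unnorm = {a: P_a[a] * P_c_given_ab[(a, '-b')] for a in P_a}
z = sum(unnorm.values())
post = {a: p / z for a, p in unnorm.items()}  # normalized P(A | -b, +c)

# Assumed convention: sample '+a' if r falls below P(+a | Markov blanket).
r = 0.5
new_a = '+a' if r < post['+a'] else '-a'
```

Note that D does not enter A's conditional: D is outside A's Markov blanket, so P(+d | +c) cancels in the normalization.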

Maximum Likelihood Estimation

How many parameters need to be learned for this Bayes Net (variables A, B, C; edge structure shown in the original slide figure)?

Determine the maximum likelihood estimates for each of the parameters of this Bayes Net, using the dataset below.

A    B    C    Count
+a   +b   +c    20
+a   +b   -c    25
+a   -b   +c    55
+a   -b   -c    40
-a   +b   +c     5
-a   +b   -c    75
-a   -b   +c    40
-a   -b   -c   120
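Whatever the edge structure turns out to be, each maximum-likelihood estimate is just a relative frequency of the relevant counts. As a sketch (the specific parameters you need depend on the graph, which is in the original figure), here are two representative estimates: the prior P(+a) and the conditional P(+b | +a).

```python
# MLE as relative frequency, using the count table from the slide.
counts = {
    ('+a', '+b', '+c'): 20, ('+a', '+b', '-c'): 25,
    ('+a', '-b', '+c'): 55, ('+a', '-b', '-c'): 40,
    ('-a', '+b', '+c'): 5,  ('-a', '+b', '-c'): 75,
    ('-a', '-b', '+c'): 40, ('-a', '-b', '-c'): 120,
}

total = sum(counts.values())
n_a = sum(v for (a, _, _), v in counts.items() if a == '+a')
n_ab = sum(v for (a, b, _), v in counts.items() if a == '+a' and b == '+b')

p_a = n_a / total          # MLE of P(+a): count(+a) / total
p_b_given_a = n_ab / n_a   # MLE of P(+b | +a): count(+a,+b) / count(+a)
```

Conditionals simply restrict the denominator to the rows matching the conditioning event; every CPT entry in the lab follows this same pattern.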

Naïve Bayes Spam Classifier

Below is a dataset of Spam and Ham messages. Draw the graph of a Bayes Net for an NBC for these messages. Clearly indicate what each variable means, and the edges between variables. You do not need to show parameters.

Message                       Label
lose 15 lbs for summer        Spam
lose weight fast              Spam
gain 15 lbs of muscle fast    Spam
visit this summer             Ham
borrow a 15 lbs weight        Ham
fast muscle tissue            Ham
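One common setup for a text NBC (a sketch of one choice, not the only valid answer to the exercise) uses a single class variable (Spam/Ham) as the parent, with one child variable per vocabulary word. Extracting that vocabulary from the dataset:

```python
# Build the vocabulary for the NBC: each distinct word becomes a
# feature variable, i.e. a child node of the class node in the graph.
messages = [
    ("lose 15 lbs for summer", "Spam"),
    ("lose weight fast", "Spam"),
    ("gain 15 lbs of muscle fast", "Spam"),
    ("visit this summer", "Ham"),
    ("borrow a 15 lbs weight", "Ham"),
    ("fast muscle tissue", "Ham"),
]

vocab = sorted({w for text, _ in messages for w in text.split()})
```

The naive-Bayes assumption is exactly what the graph encodes: the word variables are conditionally independent of one another given the class.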

Laplace Smoothing

What is the number of classes K for your NBC?

Use Laplace smoothing over these messages (the same messages as before) to estimate the parameters of your NBC for spam detection.
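As a sketch of add-one (Laplace) smoothing under a multinomial bag-of-words model (one common variant; the lab may intend a different event model), the smoothed word likelihood is P(w | Spam) = (count(w, Spam) + 1) / (total Spam tokens + |V|):

```python
from collections import Counter

# Same messages as on the slide above.
messages = [
    ("lose 15 lbs for summer", "Spam"),
    ("lose weight fast", "Spam"),
    ("gain 15 lbs of muscle fast", "Spam"),
    ("visit this summer", "Ham"),
    ("borrow a 15 lbs weight", "Ham"),
    ("fast muscle tissue", "Ham"),
]

vocab = {w for text, _ in messages for w in text.split()}
spam_tokens = [w for text, label in messages if label == "Spam"
               for w in text.split()]
spam_counts = Counter(spam_tokens)

def p_word_given_spam(w):
    # Add-one smoothing: every vocabulary word gets one pseudo-count,
    # so unseen words get nonzero probability instead of zeroing out
    # the whole product.
    return (spam_counts[w] + 1) / (len(spam_tokens) + len(vocab))

p_fast = p_word_given_spam("fast")  # (2 + 1) / (14 + 15)
```

The same formula with the Ham token counts gives the Ham-side parameters; the smoothed class prior would analogously add one pseudo-count per class.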