
Executive Summary: Bayes Nets
Andrew W. Moore
Carnegie Mellon University, School of Computer Science

Note to other teachers and users of these slides: Andrew would be delighted if you found this source material useful in giving your own lectures. Feel free to use these slides verbatim, or to modify them to fit your own needs. PowerPoint originals are available. If you make use of a significant portion of these slides in your own lecture, please include this message, or the following link to the source repository of Andrew's tutorials:
Comments and corrections gratefully received.

Slide 2: Ice-Cream and Sharks Spreadsheet

Recent Dow-Jones Change | Ice-Creams Sold Today | Shark Attacks Today
UP      | 3500 | 4
STEADY  |   41 | 0
UP      | 2300 | 5
DOWN    | 3400 | 4
UP      |   18 | 0
STEADY  |  105 | 0
STEADY  |      |
UP      |   70 | 0

Slide 3: A Simple Bayes Net

A single "Common Cause" node with arrows to the two effects:
Common Cause -> Shark Attacks
Common Cause -> Ice-Cream Sales

Slide 4: A Bigger Bayes Net

Nodes: Number of People on Beach; Shark Attacks; Ice-Cream Sales; Fish Migration Info; Is it a Weekend?; Weather.

These arrows can be interpreted:
- Causally (then it's called an influence diagram).
- Probabilistically (if I know the values of my parent nodes, then knowing the other nodes above me provides no extra information: conditional independence).
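Stated as a formula (a standard identity for Bayes nets, not text taken from the slide), the probabilistic reading says the joint distribution over all the nodes factors into one term per node, each conditioned only on that node's parents:

$$P(X_1, \dots, X_n) \;=\; \prod_{i=1}^{n} P\bigl(X_i \mid \mathrm{Parents}(X_i)\bigr)$$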

Slide 5: The Guts of a Bayes Net

Nodes: Number of People on Beach; Shark Attacks; Ice-Cream Sales; Fish Migration Info; Is it a Weekend?; Weather.

If Sunny and Weekend, then Prob(Beach Crowded) = 90%
If Sunny and Not Weekend, then Prob(Beach Crowded) = 40%
If Not Sunny and Weekend, then Prob(Beach Crowded) = 20%
If Not Sunny and Not Weekend, then Prob(Beach Crowded) = 0%
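To show how such a conditional probability table gets used, here is a minimal Python sketch that encodes the table above and computes the marginal probability that the beach is crowded. The table entries come from this slide; the priors for Sunny and Weekend are made-up illustrative numbers, not values from the slides.

```python
# CPT for "Beach Crowded" from the slide, keyed by (sunny, weekend).
P_CROWDED = {
    (True,  True):  0.90,   # Sunny and Weekend
    (True,  False): 0.40,   # Sunny and Not Weekend
    (False, True):  0.20,   # Not Sunny and Weekend
    (False, False): 0.00,   # Not Sunny and Not Weekend
}

# Assumed priors for the parent nodes (illustrative only).
P_SUNNY = 0.30
P_WEEKEND = 2.0 / 7.0

# P(Beach Crowded) = sum over parent settings of
#   P(sunny) * P(weekend) * P(crowded | sunny, weekend)
p_crowded = 0.0
for sunny in (True, False):
    for weekend in (True, False):
        p_s = P_SUNNY if sunny else 1.0 - P_SUNNY
        p_w = P_WEEKEND if weekend else 1.0 - P_WEEKEND
        p_crowded += p_s * p_w * P_CROWDED[(sunny, weekend)]

print(f"P(Beach Crowded) = {p_crowded:.3f}")   # about 0.20 with these priors
```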

Slide 6: Real-Sized Bayes Nets

How do you build them? From experts and/or from data!
How do you use them?
- Predict values that are expensive or impossible to measure.
- Decide which possible problems to investigate first.
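As a sketch of the "predict values that are expensive or impossible to measure" use, the same beach fragment can answer a diagnostic query: given that the beach is observed to be crowded, how likely is it that the day is sunny? The conditional probabilities are from the previous slide; the priors are the same illustrative assumptions as above.

```python
# Diagnostic query P(Sunny | Beach Crowded) by Bayes' rule,
# marginalizing out the unobserved Weekend variable.
P_CROWDED = {(True, True): 0.90, (True, False): 0.40,
             (False, True): 0.20, (False, False): 0.00}
P_SUNNY, P_WEEKEND = 0.30, 2.0 / 7.0   # assumed priors, for illustration

def p_crowded_and(sunny: bool) -> float:
    """P(Beach Crowded, Sunny = sunny), summing over Weekend."""
    p_s = P_SUNNY if sunny else 1.0 - P_SUNNY
    return sum(
        p_s * (P_WEEKEND if wkd else 1.0 - P_WEEKEND) * P_CROWDED[(sunny, wkd)]
        for wkd in (True, False)
    )

posterior = p_crowded_and(True) / (p_crowded_and(True) + p_crowded_and(False))
print(f"P(Sunny | Beach Crowded) = {posterior:.3f}")   # about 0.80 here
```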

Slide 7: Building Bayes Nets

Bayes nets are sometimes built manually, consulting domain experts for both the structure and the probabilities. More often the structure is supplied by experts, but the probabilities are learned from data. And in some cases both the structure and the probabilities are learned from data.
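For the common case where the structure is fixed but the probabilities are learned from data, each CPT entry can be estimated by simple counting (maximum likelihood). A minimal sketch: the structure is the beach fragment from slide 5, and the records below are made up purely for illustration.

```python
from collections import Counter

# Hypothetical observations of (sunny, weekend, crowded) -- illustration only.
data = [
    (True, True, True), (True, True, True), (True, True, False),
    (True, False, True), (True, False, False),
    (False, True, False), (False, False, False),
]

totals = Counter()    # how often each (sunny, weekend) setting occurs
crowded = Counter()   # how often the beach was crowded in that setting
for s, w, c in data:
    totals[(s, w)] += 1
    if c:
        crowded[(s, w)] += 1

# Maximum-likelihood estimate of P(crowded | sunny, weekend).
for parents in sorted(totals):
    print(parents, round(crowded[parents] / totals[parents], 2))
```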

Slide 8: Example: Pathfinder

- Pathfinder system (Heckerman, Probabilistic Similarity Networks, MIT Press, Cambridge, MA).
- Diagnostic system for lymph-node diseases: 60 diseases, 100 symptoms and test results, 14,000 probabilities.
- An expert was consulted to build the net: 8 hours to determine the variables, 35 hours for the net topology, 40 hours for the probability table values.
- Apparently, the experts found it quite easy to invent the causal links and probabilities.
- Pathfinder is now outperforming the world experts in diagnosis.
- Being extended to several dozen other medical domains.

Slide 9: Other Bayes Net Examples

- Further medical examples (Peter Spirtes, Richard Scheines, CMU)
- Manufacturing system diagnosis (Wray Buntine, NASA Ames)
- Computer systems diagnosis (Microsoft)
- Network systems diagnosis
- Helpdesk (support) troubleshooting (Heckerman, Microsoft)
- Information retrieval (Tom Mitchell, CMU)
- Customer modeling
- Student retention (Clark Glymour, CMU)
- Nomad robot (Fabio Cozman, CMU)