
Introduction to probability theory and graphical models
Translational Neuroimaging Seminar on Bayesian Inference, Spring 2013
Jakob Heinzle, Translational Neuromodeling Unit (TNU), Institute for Biomedical Engineering (IBT), University and ETH Zürich

Literature and References
Literature: Bishop (Chapters 1.2, 1.3, 8.1, 8.2); MacKay (Chapter 2); Barber (Chapters 1, 2, 3, 4). Many images in this lecture are taken from the above references.

Probability distribution (Bishop, Fig. 1.11)

Probability theory: Basic rules (notation according to Bishop)
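The rule equations on this slide are images in the original deck; as a hedged reconstruction in Bishop's notation, the two basic rules read:

```latex
% Sum rule: marginalize the joint over Y
p(X) = \sum_{Y} p(X, Y)

% Product rule: factor the joint into conditional and marginal
p(X, Y) = p(Y \mid X)\, p(X)
```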

Conditional and marginal probability

Conditional and marginal probability (Bishop, Fig. 1.11)

Independent variables. Question for later: what does this mean for Bayes?
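The defining equation is an image in the deck; a sketch of the independence condition referred to here:

```latex
% X and Y are independent iff the joint factorizes:
p(X, Y) = p(X)\, p(Y)
% equivalently, conditioning on X changes nothing:
p(Y \mid X) = p(Y)
```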

Probability theory: Bayes’ theorem
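The theorem itself appears as an image on the slide; reconstructed from the sum and product rules:

```latex
% Bayes' theorem, with the evidence from the sum rule
p(Y \mid X) = \frac{p(X \mid Y)\, p(Y)}{p(X)},
\qquad
p(X) = \sum_{Y} p(X \mid Y)\, p(Y)
```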

Rephrasing and naming of Bayes’ rule (MacKay). D: data, θ: parameters, H: hypothesis we put into the model.
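With MacKay's naming, the rule from the previous slide reads (a hedged reconstruction, since the slide's equation is an image):

```latex
% posterior = likelihood x prior / evidence
p(\theta \mid D, H)
  = \frac{p(D \mid \theta, H)\, p(\theta \mid H)}{p(D \mid H)}
```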

Example (Bishop, Fig. 1.9): Box B is blue (b) or red (r); fruit F is apple (a) or orange (o); p(B=r) = 0.4, p(B=b) = 0.6. What is the probability of having a red box if one has drawn an orange?
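The slide gives only the priors over boxes; the fruit counts below (red box: 2 apples and 6 oranges, blue box: 3 apples and 1 orange) are assumed from Bishop's Fig. 1.9, so this is a sketch rather than a transcription of the slide's answer:

```python
# Bayes' rule for the box-and-fruit example.
# Assumed box contents (from Bishop, Fig. 1.9, not stated on the slide):
# red: 2 apples + 6 oranges, blue: 3 apples + 1 orange.
p_box = {"r": 0.4, "b": 0.6}          # prior p(B), given on the slide
p_fruit_given_box = {                  # likelihood p(F | B), assumed
    "r": {"a": 2 / 8, "o": 6 / 8},
    "b": {"a": 3 / 4, "o": 1 / 4},
}

def posterior_box(fruit):
    """Return p(B | F=fruit) via Bayes' rule."""
    evidence = sum(p_fruit_given_box[box][fruit] * p_box[box] for box in p_box)
    return {box: p_fruit_given_box[box][fruit] * p_box[box] / evidence
            for box in p_box}

post = posterior_box("o")
print(post["r"])  # p(B=r | F=o) = 2/3
```

Under these assumed counts the answer is 2/3: drawing an orange raises the probability of the red box from 0.4 to about 0.67.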

Probability density
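A density assigns probabilities to intervals rather than points; the defining relations (reconstructed, since the slide's equations are images) are:

```latex
p\big(x \in (a, b)\big) = \int_{a}^{b} p(x)\, \mathrm{d}x,
\qquad
p(x) \ge 0,
\qquad
\int_{-\infty}^{\infty} p(x)\, \mathrm{d}x = 1
```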

PDF and CDF (Bishop, Fig. 1.12)

Cumulative distribution. Short example: how to use the cumulative distribution to transform a uniform distribution.
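The short example mentioned here can be sketched in code: if F is a CDF and U is uniform on (0, 1), then F⁻¹(U) is distributed according to F (inverse-transform sampling). The exponential distribution below is an assumed illustrative choice, not taken from the slide:

```python
# Inverse-transform sampling: F(x) = 1 - exp(-lam * x) for the
# exponential distribution, so F^{-1}(u) = -ln(1 - u) / lam.
import math
import random

def sample_exponential(lam, rng):
    u = rng.random()                  # uniform on [0, 1)
    return -math.log(1.0 - u) / lam   # push through the inverse CDF

rng = random.Random(0)                # fixed seed for reproducibility
samples = [sample_exponential(2.0, rng) for _ in range(100_000)]
print(sum(samples) / len(samples))    # should be close to 1/lam = 0.5
```

The same recipe works for any distribution whose CDF can be inverted, which is why the uniform distribution is the usual starting point for random number generation.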

Marginal densities
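A sketch of the marginalization rule for densities, the continuous analogue of the discrete sum rule:

```latex
p(x) = \int p(x, y)\, \mathrm{d}y
```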

Two views on probability (MacKay, Chapter 2)
Probability can …
– … describe the frequency of outcomes in random experiments → classical interpretation.
– … describe the degree of belief about a particular event → Bayesian viewpoint or subjective interpretation of probability.

Expectation of a function
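A minimal sketch of how an expectation E[f] = ∫ f(x) p(x) dx can be estimated by sampling; the choice f(x) = x² under a uniform density on (0, 1), with E[f] = 1/3, is an assumed illustration, not from the slide:

```python
# Monte Carlo estimate of an expectation:
# E[f] ≈ (1/N) * sum of f(x_n) with x_n drawn from p(x).
import random

def mc_expectation(f, sampler, n):
    return sum(f(sampler()) for _ in range(n)) / n

rng = random.Random(1)                 # fixed seed for reproducibility
est = mc_expectation(lambda x: x * x,  # f(x) = x^2
                     rng.random,       # x ~ Uniform(0, 1)
                     200_000)
print(est)  # close to the exact value 1/3
```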

Graphical models (Bishop, Chap. 8)
1. They provide a simple way to visualize the structure of a probabilistic model and can be used to design and motivate new models.
2. Insights into the properties of the model, including conditional independence properties, can be obtained by inspection of the graph.
3. Complex computations, required to perform inference and learning in sophisticated models, can be expressed in terms of graphical manipulations, in which underlying mathematical expressions are carried along implicitly.

Graphical models overview: directed graphs and undirected graphs. Names: nodes (vertices), edges (links), paths, cycles, loops, neighbours. For a summary of definitions see Barber, Chapter 2.

Graphical models overview (Barber, Introduction)

Graphical models (Bishop, Fig. 8.1)

Graphical models: parents and children. Node a is a parent of node b; node b is a child of node a. (Bishop, Fig. 8.1)

Belief networks = Bayesian belief networks = Bayesian networks. In general, every probability distribution can be expressed as a directed acyclic graph (DAG). Important: no directed cycles! (Bishop, Fig. 8.2)
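The factorization behind this slide, reconstructed from Bishop (the general rule is Eq. 8.5; the seven-node instance is the graph of Fig. 8.2):

```latex
% General factorization over a DAG: each node conditioned on its parents
p(\mathbf{x}) = \prod_{k=1}^{K} p(x_k \mid \mathrm{pa}_k)

% For the seven-node graph of Bishop, Fig. 8.2:
p(x_1, \ldots, x_7) = p(x_1)\, p(x_2)\, p(x_3)\,
  p(x_4 \mid x_1, x_2, x_3)\, p(x_5 \mid x_1, x_3)\,
  p(x_6 \mid x_4)\, p(x_7 \mid x_4, x_5)
```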

Conditional independence
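The definition referred to here, reconstructed in Bishop's notation:

```latex
% a is conditionally independent of b given c:
a \perp\!\!\!\perp b \mid c
\quad\Longleftrightarrow\quad
p(a, b \mid c) = p(a \mid c)\, p(b \mid c)
```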

Conditional independence – tail-to-tail path. Is a independent of b? Unconditioned: no! Conditioned on c: yes! (Bishop, Chapter 8.2)

Conditional independence – head-to-tail path. Is a independent of b? Unconditioned: no! Conditioned on c: yes! (Bishop, Chapter 8.2)

Conditional independence – head-to-head path. Is a independent of b? Unconditioned: yes! Conditioned on c: no! (Bishop, Chapter 8.2)
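The head-to-head case can be checked numerically. The toy conditional probability table below is an assumption chosen to make the "explaining away" effect visible; only the qualitative yes/no pattern comes from the slide:

```python
# Head-to-head (collider) node c with parents a and b:
# a and b are marginally independent, but become dependent once c is observed.
from itertools import product

p_a = {0: 0.5, 1: 0.5}
p_b = {0: 0.5, 1: 0.5}
# c = 1 is likely if either parent is "on" (assumed noisy-OR-like table)
p_c_given_ab = {(0, 0): 0.1, (0, 1): 0.9, (1, 0): 0.9, (1, 1): 0.99}

# Full joint p(a, b, c) = p(a) p(b) p(c | a, b)
joint = {(a, b, c): p_a[a] * p_b[b] *
         (p_c_given_ab[(a, b)] if c == 1 else 1 - p_c_given_ab[(a, b)])
         for a, b, c in product([0, 1], repeat=3)}

def prob(pred):
    """Sum joint probability over all states satisfying pred(a, b, c)."""
    return sum(p for (a, b, c), p in joint.items() if pred(a, b, c))

# Marginally: p(a=1 | b=1) equals p(a=1)  ->  independent
p_a1 = prob(lambda a, b, c: a == 1)
p_a1_given_b1 = (prob(lambda a, b, c: a == 1 and b == 1)
                 / prob(lambda a, b, c: b == 1))

# Given c = 1: observing b = 1 "explains away" a  ->  dependent
p_a1_given_c1 = (prob(lambda a, b, c: a == 1 and c == 1)
                 / prob(lambda a, b, c: c == 1))
p_a1_given_b1c1 = (prob(lambda a, b, c: a == 1 and b == 1 and c == 1)
                   / prob(lambda a, b, c: b == 1 and c == 1))

print(p_a1, p_a1_given_b1)             # equal: marginal independence
print(p_a1_given_c1, p_a1_given_b1c1)  # differ: dependence given c
```

Observing c = 1 raises the probability that a = 1; additionally learning b = 1 lowers it again, because b already accounts for the observed effect.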

Conditional independence – notation (Bishop, Chapter 8.2)

Conditional independence – three basic structures (Bishop, Chapter 8.2.2)

More conventions in graphical notations (Bishop, Chapter 8): regression model = short form = parameters explicit.
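The three equivalent drawings on this slide depict one joint distribution; a hedged reconstruction of Bishop's regression-model factorization (Eqs. 8.6–8.7), first in short form and then with inputs and parameters made explicit:

```latex
% Short form: weights w and targets t_n
p(\mathbf{t}, \mathbf{w}) = p(\mathbf{w}) \prod_{n=1}^{N} p(t_n \mid \mathbf{w})

% Parameters explicit: inputs x_n, hyperparameter alpha, noise variance sigma^2
p(\mathbf{t}, \mathbf{w} \mid \mathbf{x}, \alpha, \sigma^2)
  = p(\mathbf{w} \mid \alpha) \prod_{n=1}^{N} p(t_n \mid \mathbf{w}, x_n, \sigma^2)
```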

More conventions in graphical notations (Bishop, Chapter 8). Trained on data t_n → complete model used for prediction.

Summary – things to remember
Probabilities and how to compute with them → product rule, Bayes’ rule, sum rule.
Probability densities → PDF, CDF.
Conditional and marginal distributions.
Basic concepts of graphical models → directed vs. undirected, nodes and edges, parents and children.
Conditional independence in graphs and how to check it.