Bayesian Network By DengKe Dong

Key Points Today
- Intro to Graphical Model
- Conditional Independence
- Intro to Bayesian Network
- Reasoning BN: D-Separation
- Inference In Bayesian Network
- Belief Propagation
- Learning in Bayesian Network

Intro to Graphical Model
- Two types of graphical models: directed graphs (aka Bayesian networks) and undirected graphs (aka Markov random fields).
- The graph structure plus its associated parameters define a joint probability distribution over the set of variables/nodes.
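For a directed graphical model over variables $x_1, \dots, x_n$, this joint distribution is the product of one conditional per node given its parents, which is the form the later slides use:

$$p(x_1, \dots, x_n) \;=\; \prod_{i=1}^{n} p\!\left(x_i \mid pa(x_i)\right)$$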

Key Points Today
- Intro to Graphical Model
- Conditional Independence
- Intro to Bayesian Network
- Reasoning BN: D-Separation
- Inference In Bayesian Network
- Belief Propagation
- Learning in Bayesian Network

Conditional Independence
- Definition: X is conditionally independent of Y given Z if the probability distribution governing X is independent of the value of Y, given the value of Z. We denote this as (X ⊥ Y | Z), i.e., P(X | Y, Z) = P(X | Z). For example, Thunder is conditionally independent of Rain given Lightning.

Conditional Independence
- Condition on its parents: a conditional probability distribution (CPD) is associated with each node N, defined as P(N | Parents(N)), where the function Parents(N) returns the set of N's immediate parents.
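As a concrete illustration (all numbers hypothetical), a binary node W with parents L and R carries one Bernoulli parameter per configuration of its parents:

$$P(W{=}1 \mid L{=}0, R{=}0) = 0.05, \qquad P(W{=}1 \mid L{=}0, R{=}1) = 0.7,$$
$$P(W{=}1 \mid L{=}1, R{=}0) = 0.8, \qquad P(W{=}1 \mid L{=}1, R{=}1) = 0.95.$$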

Key Points Today
- Intro to Graphical Model
- Conditional Independence
- Intro to Bayesian Network
- Reasoning BN: D-Separation
- Inference In Bayesian Network
- Belief Propagation
- Learning in Bayesian Network

Bayesian Network Definition
- Definition: a directed acyclic graph defining a joint probability distribution over a set of variables, where each node denotes a random variable and each directed edge denotes a direct dependence of the child node on its parent.
- Example: the S, L, R, T, W network used on the following slides.

Bayesian Network Definition
- Conditional independencies in a Bayesian network: each node is conditionally independent of its non-descendants, given only its immediate parents. So the joint distribution over all variables in the network is defined in terms of these CPDs, plus the graph.

Bayesian Network Definition
- Example. The chain rule for probability gives:
P(S,L,R,T,W) = P(S) P(L|S) P(R|S,L) P(T|S,L,R) P(W|S,L,R,T)
With a CPD for each node Xi described as P(Xi | Pa(Xi)), the network's conditional independencies shrink every factor to depend only on that node's parents:
P(S,L,R,T,W) = P(S) P(L|S) P(R|S) P(T|L) P(W|L,R)
So, in a Bayes net, the joint distribution collapses to a product of small, local CPDs (see the sketch below).
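A minimal runnable sketch of this factorization. The CPT numbers below are hypothetical, chosen purely for illustration; the slide does not specify the actual tables.

```python
# Hypothetical CPTs for the network S -> L, S -> R, L -> T, (L, R) -> W.
P_S = 0.4                                     # P(S = 1)
P_L_given_S = {1: 0.7, 0: 0.2}                # P(L = 1 | S)
P_R_given_S = {1: 0.6, 0: 0.1}                # P(R = 1 | S)
P_T_given_L = {1: 0.9, 0: 0.3}                # P(T = 1 | L)
P_W_given_LR = {(1, 1): 0.95, (1, 0): 0.8,    # P(W = 1 | L, R)
                (0, 1): 0.7,  (0, 0): 0.05}

def bern(p_one, value):
    """P(V = value) for a binary variable V with P(V = 1) = p_one."""
    return p_one if value == 1 else 1.0 - p_one

def joint(s, l, r, t, w):
    """P(S,L,R,T,W) as the product of the five local CPDs."""
    return (bern(P_S, s)
            * bern(P_L_given_S[s], l)
            * bern(P_R_given_S[s], r)
            * bern(P_T_given_L[l], t)
            * bern(P_W_given_LR[(l, r)], w))

print(joint(1, 1, 0, 1, 1))   # probability of one full assignment
```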

Bayesian Network Definition
- Construction: choose an ordering over the variables, e.g. X1, X2, …, Xn. For i = 1 to n: add Xi to the network, and select its parents Pa(Xi) as a minimal subset of {X1, …, Xi-1} such that P(Xi | Pa(Xi)) = P(Xi | X1, …, Xi-1).
- Notice this choice of parents assures the chain-rule factorization P(X1, …, Xn) = ∏i P(Xi | X1, …, Xi-1) = ∏i P(Xi | Pa(Xi)).

Key Points Today
- Intro to Graphical Model
- Conditional Independence
- Intro to Bayesian Network
- Reasoning BN: D-Separation
- Inference In Bayesian Network
- Belief Propagation
- Learning in Bayesian Network

Reasoning BN: D-Separation
- Conditional Independence, Revisited. We said: each node is conditionally independent of its non-descendants, given its immediate parents.
- Does this rule give us all of the conditional independence relations implied by the Bayes network? No. E.g., X1 and X4 are conditionally independent given {X2, X3}, but X1 and X4 are not conditionally independent given X3 alone. To capture relations like this, we need D-separation.

Reasoning BN: D-Separation
- Three examples to understand D-Separation: head-to-tail, tail-to-tail, and head-to-head connections (summarized below).
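The standard behavior of the three canonical three-node structures:

- Head-to-tail (chain) $X \to Z \to Y$: observing Z blocks the path, so $X \perp Y \mid Z$, although X and Y are dependent marginally.
- Tail-to-tail (common cause) $X \leftarrow Z \to Y$: observing Z blocks the path, so again $X \perp Y \mid Z$.
- Head-to-head (common effect, v-structure) $X \to Z \leftarrow Y$: the opposite. X and Y are marginally independent, but observing Z (or any of its descendants) unblocks the path and makes them dependent ("explaining away").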

Reasoning BN: D-Separation

Suppose we have three sets of random variables: X, Y, and Z. X and Y are conditionally independent given Z if and only if X and Y are D-separated by Z.
X and Y are D-separated by Z (and therefore conditionally independent given Z) iff every path from any variable in X to any variable in Y is blocked. A path from variable A to variable B is blocked if it includes a node such that either:
- the arrows on the path meet head-to-tail or tail-to-tail at the node, and this node is in Z, or
- the arrows meet head-to-head at the node, and neither the node nor any of its descendants is in Z.
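D-separation can also be tested mechanically via the classic reduction to graph separation: X and Y are d-separated by Z iff they are disconnected in the moralized ancestral graph once Z is removed. Below is a minimal self-contained sketch; the diamond network and node names at the bottom are an assumption matching the X1…X4 example two slides back.

```python
from collections import deque

def d_separated(parents, X, Y, Z):
    """True iff X and Y are d-separated by Z in the DAG given by `parents`.

    parents: dict mapping every node to the list of its parents.
    X, Y, Z: sets of node names.
    """
    # 1. Keep only X, Y, Z and their ancestors.
    relevant, stack = set(), list(X | Y | Z)
    while stack:
        n = stack.pop()
        if n not in relevant:
            relevant.add(n)
            stack.extend(parents[n])

    # 2. Moralize: link each node to its parents, and marry co-parents.
    adj = {n: set() for n in relevant}
    for child in relevant:
        ps = parents[child]
        for p in ps:
            adj[p].add(child)
            adj[child].add(p)
        for i in range(len(ps)):
            for j in range(i + 1, len(ps)):
                adj[ps[i]].add(ps[j])
                adj[ps[j]].add(ps[i])

    # 3. Delete Z; X and Y are d-separated iff no path remains (BFS).
    frontier = deque(X - Z)
    seen = set(frontier)
    while frontier:
        n = frontier.popleft()
        if n in Y:
            return False            # an active path exists
        for m in adj[n] - Z - seen:
            seen.add(m)
            frontier.append(m)
    return True

# Assumed diamond network: X1 -> X2 -> X4 and X1 -> X3 -> X4.
parents = {"X1": [], "X2": ["X1"], "X3": ["X1"], "X4": ["X2", "X3"]}
print(d_separated(parents, {"X1"}, {"X4"}, {"X2", "X3"}))  # True
print(d_separated(parents, {"X1"}, {"X4"}, {"X3"}))        # False
```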

Key Points Today
- Intro to Graphical Model
- Conditional Independence
- Intro to Bayesian Network
- Reasoning BN: D-Separation
- Inference In Bayesian Network
- Belief Propagation
- Learning in Bayesian Network

Inference In Bayesian Network
In general, exact inference is intractable (NP-complete). For certain cases it is tractable:
- Assigning a probability to a fully observed set of variables
- Or if just one variable is unobserved
- Or for singly connected graphs (i.e., no undirected loops): variable elimination, belief propagation
For multiply connected graphs:
- Junction tree
Sometimes we use Monte Carlo methods:
- Generate many samples according to the Bayes net distribution, then count up the results (see the sampling sketch below)
Or variational methods for tractable approximate solutions.
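A minimal forward-sampling sketch of the "generate samples, then count" idea, reusing the hypothetical CPT numbers from the factorization example above (all made up purely for illustration):

```python
import random

# Hypothetical CPTs for S -> L, S -> R, L -> T, (L, R) -> W.
P_S = 0.4
P_L_given_S = {1: 0.7, 0: 0.2}
P_R_given_S = {1: 0.6, 0: 0.1}
P_T_given_L = {1: 0.9, 0: 0.3}
P_W_given_LR = {(1, 1): 0.95, (1, 0): 0.8, (0, 1): 0.7, (0, 0): 0.05}

def sample_once():
    """Sample each node given its already-sampled parents (topological order)."""
    s = int(random.random() < P_S)
    l = int(random.random() < P_L_given_S[s])
    r = int(random.random() < P_R_given_S[s])
    t = int(random.random() < P_T_given_L[l])
    w = int(random.random() < P_W_given_LR[(l, r)])
    return s, l, r, t, w

# Estimate the marginal P(W = 1) by simple counting.
samples = [sample_once() for _ in range(100_000)]
print(sum(w for *_, w in samples) / len(samples))
```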

Inference In Bayesian Network
- Probability of a joint assignment: easy. Suppose we are interested in a fully observed joint assignment. What is P(f, a, s, h, n)?
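If the network is the one the variable names suggest (Flu and Allergy as parents of Sinus, with Headache and Nose as children of Sinus; this structure is an assumption, since the slide's figure is missing), the joint assignment is just five CPT lookups multiplied together:

$$P(f, a, s, h, n) \;=\; P(f)\,P(a)\,P(s \mid f, a)\,P(h \mid s)\,P(n \mid s)$$

Each factor is read directly off the corresponding CPT, so evaluating a fully observed assignment costs only one multiplication per node.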

Inference In Bayesian Network
- Probability of marginals: not so easy. How do we calculate P(N = n)?
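Computing a marginal means summing the joint over every configuration of the unobserved variables:

$$P(N = n) \;=\; \sum_{f}\sum_{a}\sum_{s}\sum_{h} P(f, a, s, h, n)$$

Done naively this is exponential in the number of unobserved variables ($2^4$ terms here for binary variables); variable elimination and belief propagation reduce the cost by pushing the sums inside the product of CPDs.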

Inference In Bayesian Network

Inference On a Chain
- Converting Directed to Undirected Graphs
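A sketch of the standard conversion for a chain: each conditional is absorbed into a pairwise potential on the corresponding undirected edge,

$$p(x) = p(x_1)\,p(x_2 \mid x_1)\cdots p(x_N \mid x_{N-1}) \;=\; \frac{1}{Z}\,\psi_{1,2}(x_1, x_2)\,\psi_{2,3}(x_2, x_3)\cdots\psi_{N-1,N}(x_{N-1}, x_N)$$

with $\psi_{1,2}(x_1, x_2) = p(x_1)\,p(x_2 \mid x_1)$, $\psi_{n-1,n}(x_{n-1}, x_n) = p(x_n \mid x_{n-1})$, and $Z = 1$. For general graphs the conversion also requires moralization: marrying the parents of every head-to-head node before dropping edge directions.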

Inference On a Chain Compute the marginals
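A sketch of the computation (following the standard treatment, e.g. Bishop, PRML §8.4.1, which these slides appear to track): the marginal at node n factorizes into a forward and a backward message,

$$p(x_n) \;=\; \frac{1}{Z}\,\mu_\alpha(x_n)\,\mu_\beta(x_n)$$

$$\mu_\alpha(x_n) = \sum_{x_{n-1}} \psi_{n-1,n}(x_{n-1}, x_n)\,\mu_\alpha(x_{n-1}), \qquad \mu_\beta(x_n) = \sum_{x_{n+1}} \psi_{n,n+1}(x_n, x_{n+1})\,\mu_\beta(x_{n+1})$$

with $\mu_\alpha(x_1) = 1$ and $\mu_\beta(x_N) = 1$. Each message costs $O(K^2)$ for $K$ states per node, so computing all marginals on an $N$-node chain costs $O(N K^2)$, versus $O(K^N)$ for brute-force summation.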


Key Points Today
- Intro to Graphical Model
- Conditional Independence
- Intro to Bayesian Network
- Reasoning BN: D-Separation
- Inference In Bayesian Network
- Belief Propagation
- Learning in Bayesian Network

Belief Propagation
- Belief propagation algorithms have been developed separately for Bayesian networks, MRFs, and factor graphs, and the variants defined on these different models are mathematically equivalent. For brevity, we present the standard belief propagation algorithm in terms of MRFs.

Belief Propagation
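The two update rules referenced as Equation 1 and Equation 2 on the next slide presumably take the standard sum-product form for a pairwise MRF with unary potentials $\phi_i$ and pairwise potentials $\psi_{ij}$ (an assumption, since the slide's formulas were images):

$$m_{i \to j}(x_j) \;=\; \sum_{x_i} \phi_i(x_i)\,\psi_{ij}(x_i, x_j) \prod_{k \in N(i) \setminus \{j\}} m_{k \to i}(x_i) \qquad (1)$$

$$b_i(x_i) \;\propto\; \phi_i(x_i) \prod_{k \in N(i)} m_{k \to i}(x_i) \qquad (2)$$

where $N(i)$ denotes the neighbors of node $i$, $m_{i \to j}$ is the message node $i$ sends to node $j$, and $b_i$ is the belief (estimated marginal) at node $i$.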

- In practical computation, beliefs are computed starting from the nodes at the edge of the graph, and a message is updated only once all of the information it depends on is available. Using Equation 1 and Equation 2, the belief of each node is computed in turn.
- For acyclic MRFs, each message typically needs to be computed only once, which greatly improves computational efficiency.
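A minimal runnable sketch of these updates on a small tree-shaped pairwise MRF; the three-node chain, potentials, and numbers are all hypothetical:

```python
import numpy as np

K = 2                                      # states per node
edges = [(0, 1), (1, 2)]                   # a 3-node chain (a tree, so BP is exact)
phi = {0: np.array([1.0, 3.0]),            # unary potentials; node 0 prefers state 1
       1: np.ones(K),
       2: np.ones(K)}
psi = {e: np.array([[2.0, 1.0],            # pairwise potentials favoring
                    [1.0, 2.0]]) for e in edges}  # agreement of neighbors

neighbors = {i: set() for i in phi}
for a, b in edges:
    neighbors[a].add(b)
    neighbors[b].add(a)

def pairwise(i, j):
    """psi on edge {i, j}, oriented so rows index x_i and columns index x_j."""
    return psi[(i, j)] if (i, j) in psi else psi[(j, i)].T

# Messages m[(i, j)](x_j), initialized to ones; on a tree a few sweeps converge.
m = {(i, j): np.ones(K) for i in neighbors for j in neighbors[i]}
for _ in range(5):
    for (i, j) in list(m):
        prod = phi[i].copy()
        for k in neighbors[i] - {j}:
            prod *= m[(k, i)]
        msg = pairwise(i, j).T @ prod      # Equation (1): sum out x_i
        m[(i, j)] = msg / msg.sum()        # normalize for numerical stability

# Equation (2): beliefs = unary potential times all incoming messages.
for i in sorted(neighbors):
    b = phi[i].copy()
    for k in neighbors[i]:
        b *= m[(k, i)]
    print(i, b / b.sum())
```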

Key Points Today
- Intro to Graphical Model
- Conditional Independence
- Intro to Bayesian Network
- Reasoning BN: D-Separation
- Inference In Bayesian Network
- Belief Propagation
- Learning in Bayesian Network

Learning in Bayesian Network
- What you should know

Learning in Bayesian Network
Learning problems are usually organized along two axes: whether the graph structure is known, and whether the training data are fully observed. With known structure and fully observed data, learning reduces to estimating the CPTs by counting; with known structure but hidden variables, we use the EM algorithm; with unknown structure, a structure search is needed.

Learning CPTs from Fully Observed Data

MLE estimate of the CPT parameters θ from fully observed data
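The standard maximum-likelihood estimate of a CPT entry is the relative frequency in the training data:

$$\hat{\theta}_{x \mid u} \;=\; \hat{P}(X = x \mid Pa(X) = u) \;=\; \frac{\#D\{X = x,\; Pa(X) = u\}}{\#D\{Pa(X) = u\}}$$

where $\#D\{\cdot\}$ counts the training examples satisfying the condition. In practice one usually smooths these counts (a MAP estimate with a Dirichlet/Beta prior, i.e., adding imaginary counts) to avoid zero probabilities for rare parent configurations.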

Learning in Bayesian Network

EM Algorithm
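A sketch of the standard EM iteration for CPT learning with hidden variables, starting from current parameters $\theta^{(t)}$ (the slide's own derivation was on figure slides):

$$\text{E-step:}\qquad \widehat{\#}\{x, u\} \;=\; \sum_{d \in D} P\!\left(X = x,\; Pa(X) = u \mid d,\; \theta^{(t)}\right)$$

$$\text{M-step:}\qquad \theta^{(t+1)}_{x \mid u} \;=\; \frac{\widehat{\#}\{x, u\}}{\sum_{x'} \widehat{\#}\{x', u\}}$$

That is, run inference with the current parameters to compute expected counts over the unobserved variables, then re-estimate each CPT as if those expected counts had been observed. Each iteration is guaranteed not to decrease the data likelihood.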

Using Unlabeled Data to Help Train a Naïve Bayes Classifier
This is EM specialized to naïve Bayes: for the unlabeled examples, the class label is treated as a hidden variable, so the E-step probabilistically labels them and the M-step re-estimates the classifier's parameters from both the labeled and the probabilistically labeled data.

Summary
- Intro to Graphical Model
- Conditional Independence
- Intro to Bayesian Network
- Reasoning BN: D-Separation
- Inference In Bayesian Network
- Belief Propagation
- Learning in Bayesian Network

Thanks