Likelihood Computation
Given a Bayesian network and evidence e, compute P(e): sum over all possible values of the unobserved variables.
Example network: Bet1 and Die are parents of Win1; evidence e = { Win1 = true }.
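The sum-over-unobserved computation can be sketched in Python for the toy Die/Bet1/Win1 network. The fair die, uniform bet, and win-on-matching-parity rule are illustrative assumptions, not given on the slides:

```python
from itertools import product

# Hypothetical CPDs for the toy network Die -> Win1 <- Bet1.
P_die = {d: 1 / 6 for d in range(1, 7)}   # fair six-sided die (assumed)
P_bet = {"odd": 0.5, "even": 0.5}         # uniform bet (assumed)

def p_win(win, die, bet):
    """P(Win1 = win | Die, Bet1): win iff the bet matches the die's parity (assumed rule)."""
    matches = (die % 2 == 1) == (bet == "odd")
    return 1.0 if win == matches else 0.0

# P(e) with e = { Win1 = true }: sum over all values of the unobserved Die and Bet1.
likelihood = sum(P_die[d] * P_bet[b] * p_win(True, d, b)
                 for d, b in product(P_die, P_bet))
print(likelihood)  # 0.5
```

Under these CPDs the bet matches the die's parity half the time, so the likelihood comes out to 0.5.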

The Basic Concept
With Val(Bet1) = { odd, even } and e = { Win1 = true }:
P(e, Die=1) = P(e, Die=3) = P(e, Die=5)  and  P(e, Die=2) = P(e, Die=4) = P(e, Die=6)
The exact value of Die need not be known to compute the exact likelihood: group values, and calculate once for each group.
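The grouping idea can be sketched directly: since every odd die value contributes the same amount to P(e), and likewise every even value, one representative per group suffices. The CPDs below are illustrative assumptions (fair die, uniform bet, win exactly when the bet matches the die's parity):

```python
from itertools import product

P_die = {d: 1 / 6 for d in range(1, 7)}   # fair die (assumed)
P_bet = {"odd": 0.5, "even": 0.5}         # uniform bet (assumed)

def p_win(win, die, bet):
    matches = (die % 2 == 1) == (bet == "odd")
    return 1.0 if win == matches else 0.0

def joint(die, bet):
    """P(e, Die=die, Bet1=bet) with e = { Win1 = true }."""
    return P_die[die] * P_bet[bet] * p_win(True, die, bet)

# Full sum over all 6 die values ...
full = sum(joint(d, b) for d, b in product(range(1, 7), P_bet))

# ... equals one evaluation per group {1,3,5}, {2,4,6}, weighted by group size.
grouped = sum(len(group) * joint(rep, b)
              for group, rep in [({1, 3, 5}, 1), ({2, 4, 6}, 2)]
              for b in P_bet)
print(full, grouped)  # both 0.5
```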

Value Abstraction
An abstraction Val(Die) → Val(Die^a) is a partition of the variable's domain, e.g. { {1,4}, {2,5,6}, {3} }.

Safe Value Abstraction
An abstraction is safe w.r.t. evidence e if it preserves the likelihood information.
Here Val(Die) → Val(Die^a) = { {1,3,5}, {2,4,6} } is safe for e = { Win1 = true }.

Safe Value Abstraction (continued)
Add a second round, Bet2 and Win2, with Val(Bet2) = { 1-2, 3-6 }:
- For e = { Win1 = true }: { {1,3,5}, {2,4,6} } is a safe abstraction for Val(Die).
- For e = { Win1 = true, Win2 = true }: the abstraction must be refined to { {1}, {2}, {3,5}, {4,6} }, combining the partition from Win1 ({1,3,5} / {2,4,6}) with the one from Win2 ({1,2} / {3-6}).

Cautious Value Abstraction
The maximal abstraction is a tight refinement of the partitions induced by the evidence:
- Win1 = true with Val(Bet1) = { odd, even } induces { {1,3,5}, {2,4,6} }
- Win2 = true with Val(Bet2) = { 1-2, 3-6 } induces { {1,2}, {3-6} }
- Tight refinement: { {1}, {2}, {3,5}, {4,6} }

Abstracting a Bayesian Net
An abstraction of X_i implies a partition of Pa_i's values, so abstract each variable after its children are abstracted, using a tight refinement of all the partitions implied by its children.
Initialization: abstract the observed variables.
For each variable, children first:
1. Calculate the maximal abstraction.
2. Propagate it to the parents.
Output: the abstracted network G^a.
The algorithm is linear in the number of variables and in the size of the network representation.
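The "tight refinement of all partitions implied by the children" can be sketched as a common-refinement operation on partitions; this is a generic utility, not code from the slides:

```python
def tight_refinement(*partitions):
    """Common refinement: two values stay in the same block iff every
    input partition keeps them together."""
    # Map each value to the tuple of block indices it falls into, one per partition.
    signature = {}
    for part in partitions:
        for i, block in enumerate(part):
            for value in block:
                signature.setdefault(value, []).append(i)
    # Values sharing a signature form one block of the refinement.
    blocks = {}
    for value, sig in signature.items():
        blocks.setdefault(tuple(sig), set()).add(value)
    return sorted(blocks.values(), key=min)

# Partitions of Val(Die) induced by Win1 (bet parity) and Win2 (Val(Bet2) = {1-2, 3-6}):
by_win1 = [{1, 3, 5}, {2, 4, 6}]
by_win2 = [{1, 2}, {3, 4, 5, 6}]
print(tight_refinement(by_win1, by_win2))  # [{1}, {2}, {3, 5}, {4, 6}]
```

Each variable's maximal abstraction is then the tight refinement of the partitions its children propagate upward.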

The Application: Genetic Linkage Analysis
The goal: find the location of a disease (target) gene on a chromosome relative to some other, known locations.
[Figure: map of known loci on human chromosome 16]

Recombination Fraction
For two loci A and B with recombination fraction θ:
- No crossover between A and B: P = 1 − θ
- Crossover between A and B: P = θ

Linkage Analysis
The data: pedigrees.

The Probabilistic Model
One locus: converting the pedigree to a Bayesian net.
- Orange: genotype nodes
- Blue: phenotype nodes
- Red: selector nodes (these represent linkage)

The Probabilistic Model
More than one locus: the networks for locus #1 and locus #2 are connected through their selector nodes s1 and s2.
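The link between selector nodes at adjacent loci can be sketched as a conditional probability table parameterised by the recombination fraction θ: the selector at locus #2 copies the grandparental origin chosen at locus #1 with probability 1 − θ and switches with probability θ. This is a schematic of the standard linkage model; the value of θ is an arbitrary illustration:

```python
THETA = 0.1  # recombination fraction between locus #1 and locus #2 (illustrative value)

def p_selector2(s2, s1, theta=THETA):
    """P(s2 | s1): same grandparental origin (no crossover) with probability
    1 - theta, a switch of origin (crossover) with probability theta."""
    return 1 - theta if s2 == s1 else theta

# Each row of the selector CPT sums to 1.
for s1 in ("paternal", "maternal"):
    total = sum(p_selector2(s2, s1) for s2 in ("paternal", "maternal"))
    print(s1, total)  # 1.0
```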

Experimental Evaluation
- 90 pedigrees (5-200 individuals) from 10 studies
- A total of 280 linkage analysis problems, with a varied number of loci
[Scatter plots: network size and clique-tree size, abstracted vs. original, by number of loci]

Abstracting Multiple Variables
Abstractions compose across variables: with Die^a = { {1,3,5}, {2,4,6} } and Bet1^a = { odd, even }, the twelve parent configurations of Win1, (1,o) through (6,e), collapse into the abstract combinations of Die^a and Bet1^a, each of which maps to a single outcome (win or loss).

Clique-Tree Elimination
Example clique tree: C1 = {X,V,U}, C2 = {X,W}, C3 = {X,V,Z}, C4 = {V,Y}.

Message-Specific Abstraction
In the clique tree C1 = {X,V,U}, C2 = {X,W}, C3 = {X,V,Z}, C4 = {V,Y}: given safe abstractions for the clique factor f3 and the incoming messages m1→3 and m2→3, construct a safe abstraction for the outgoing message m3→4.
- Refinement corresponds to multiplication.
- Projection corresponds to summation.
An abstraction is safe for a message m if it preserves m's values.
Use dynamic programming to efficiently compute a safe abstraction for the whole tree.
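The message step itself (multiply the clique factor by the incoming messages, then sum out everything not shared with the receiving clique) can be sketched generically. Factors are dicts over assignment tuples; the scopes follow the example tree ({X,V,Z} for C3, sepsets {X,V}, {X}, and {V}), and the numeric tables are made up for illustration:

```python
from itertools import product

def message(factor_scope, factor, incoming, keep):
    """Compute an outgoing clique-tree message: multiply the clique factor by the
    incoming messages (refinement <-> multiplication), then sum out every variable
    not in `keep` (projection <-> summation)."""
    out = {}
    for assignment, val in factor.items():
        asg = dict(zip(factor_scope, assignment))
        for scope, msg in incoming:
            val *= msg[tuple(asg[v] for v in scope)]
        key = tuple(asg[v] for v in keep)
        out[key] = out.get(key, 0.0) + val
    return out

# Toy binary factor on C3 = {X, V, Z} and incoming messages (values made up).
f3 = {(x, v, z): 0.1 + 0.1 * (x + v + z) for x, v, z in product((0, 1), repeat=3)}
m13 = {(x, v): 0.5 for x, v in product((0, 1), repeat=2)}  # m_1->3 over sepset {X, V}
m23 = {(0,): 0.6, (1,): 0.4}                               # m_2->3 over sepset {X}

m34 = message(("X", "V", "Z"), f3, [(("X", "V"), m13), (("X",), m23)], keep=("V",))
print(m34)  # m_3->4 over the sepset {V}
```

A message-specific abstraction would group the assignment tuples on which these products agree, so each group is computed once.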

Experimental Evaluation
How much more can we save?
[Table: clique-tree sizes and size ratios, abstracted network vs. abstracted clique tree, by number of loci]

Total Reduction
[Scatter plot: clique-size ratio (original/abstracted) vs. problem size (#individuals × #genotypes)]

Summary
- Safe abstraction w.r.t. specific evidence
- An algorithm to reduce problem complexity:
  - Linear in the network representation
  - Independent of the inference procedure
  - Motivated by VITESSE [ ]
- Further reductions when the inference procedure is known
- Caveats:
  - As costly as inference itself, though the cost is amortized when reused, e.g. for parameter estimation
  - Representation of the abstractions