Hub Node Detection in Directed Acyclic Graphs
Kim, Byoung-Hee

I will talk about evolving DNA-encoded genetic programs in a test tube. We evaluate the potential of this approach by solving a medical diagnosis problem on a simulated DNA computer. Each individual genetic program represents a decision list of variable length, and the whole population takes part in making probabilistic decisions.
Directions of Future Work on HBN
Algorithm
- Extension from binary-tree type structure to n-ary tree
- Fast and effective learning by hub node detection
- Incorporate features of latent variable models

Theoretical Analysis
- Information compression – efficiency, measurement

Application & Empirical Analysis
- Biological networks – gene regulatory networks, metabolic networks
- Text
- Social networks
Preliminary Experiment for Detecting Hub Nodes
Setup
- 100-node scale-free / modular Bayesian networks; all nodes have binary states
- 5,000 samples (s1, ..., s5000) drawn over the variables X1, X2, ..., X100
- Estimate the structure and the hub nodes from the samples using mutual information / conditional MI (a sketch of this step follows the list)

Conjectures: d-separation vs. conditional mutual information
- If d(x, y) > d(x, z), then I(x; y) < I(x; z) asymptotically (mutual information)
- The difference between I(X; Y) and I(X; Y | Z) increases as the degree of Z increases
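A minimal sketch of this estimation step, written in Python with illustrative names (the original code was in MATLAB/R, per the Updates slide), assuming the samples have already been drawn from the network:

```python
import numpy as np

def entropy(*cols):
    """Plug-in entropy (in nats) of the joint distribution of the
    given discrete sample columns."""
    _, counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def mi(x, y):
    """Plug-in mutual information I(X; Y) = H(X) + H(Y) - H(X, Y)."""
    return entropy(x) + entropy(y) - entropy(x, y)

def cmi(x, y, z):
    """Plug-in conditional MI
    I(X; Y | Z) = H(X, Z) + H(Y, Z) - H(X, Y, Z) - H(Z)."""
    return entropy(x, z) + entropy(y, z) - entropy(x, y, z) - entropy(z)

# Stand-in for samples s1..s5000 over binary nodes X1..X100; random here,
# where the real experiment samples from the scale-free / modular BN.
rng = np.random.default_rng(0)
data = rng.integers(0, 2, size=(5000, 100))
print(mi(data[:, 0], data[:, 1]))               # I(X1; X2)
print(cmi(data[:, 0], data[:, 1], data[:, 2]))  # I(X1; X2 | X3)
```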
Shortest path vs normalized mutual information
Data: 5,000 samples from each of two 100-node networks (1 scale-free and 1 modular)
Question: does d(x, y) > d(x, z) imply I(x; y) < I(x; z) asymptotically?
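A hedged sketch of how this comparison could be run. The graph and samples below are random stand-ins for the real scale-free/modular networks, and normalizing MI by sqrt(H(X) H(Y)) is one common choice; the slides do not say which normalization was used.

```python
import itertools
from collections import defaultdict
import networkx as nx
import numpy as np

def entropy(*cols):
    _, counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def normalized_mi(x, y):
    """Plug-in MI normalized by sqrt(H(X) H(Y)); one common convention."""
    hx, hy = entropy(x), entropy(y)
    return (hx + hy - entropy(x, y)) / np.sqrt(hx * hy) if hx > 0 and hy > 0 else 0.0

rng = np.random.default_rng(0)
data = rng.integers(0, 2, size=(5000, 100))   # stand-in for the 5,000 BN samples
G = nx.barabasi_albert_graph(100, 2, seed=0)  # stand-in scale-free skeleton
dist = dict(nx.all_pairs_shortest_path_length(G))

# One (d(x, y), normalized MI) point per node pair; plotted as a scatter,
# the conjecture predicts a downward trend in d.
points = [(dist[i][j], normalized_mi(data[:, i], data[:, j]))
          for i, j in itertools.combinations(range(100), 2)]

# Average normalized MI at each graph distance.
bucket = defaultdict(list)
for d, nmi in points:
    bucket[d].append(nmi)
for d in sorted(bucket):
    print(d, np.mean(bucket[d]))
```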
Example 1: some short paths between X65 and X86
[Graph figure: short paths between X65 and X86 through intermediate nodes (72, 49, 84, 7, 2, 31, 28, 1, 26, 17), annotated with I(X65; X86) and candidate values of I(X65; X86 | Z) for nodes Z on the paths: 0.0693, 0.0759, 0.0821, 0.0829, 0.083, 0.0859.]
Example 2: some short paths between X7 and X31
[Graph figure: short paths between X7 and X31 through intermediate nodes (65, 86, 72, 49, 84, 2, 28, 1, 26, 17), annotated with I(X7; X31) and candidate values of I(X7; X31 | Z): 0.0122, 0.0177, 0.0183, 0.0224, 0.023, 0.0545.]
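The per-node conditional values in Examples 1 and 2 could be produced along these lines: take each interior node Z on a shortest path between the pair and compute the plug-in I(X; Y | Z). This sketch is self-contained with stand-in data, so its printed numbers will not match the figures.

```python
import networkx as nx
import numpy as np

def entropy(*cols):
    _, counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def cmi(x, y, z):
    # I(X; Y | Z) = H(X, Z) + H(Y, Z) - H(X, Y, Z) - H(Z)
    return entropy(x, z) + entropy(y, z) - entropy(x, y, z) - entropy(z)

rng = np.random.default_rng(0)
data = rng.integers(0, 2, size=(5000, 100))   # stand-in for the BN samples
G = nx.barabasi_albert_graph(100, 2, seed=0)  # stand-in network skeleton
x, y = 6, 30                                  # X7 and X31 (0-based indices)
for path in nx.all_shortest_paths(G, x, y):
    for z in path[1:-1]:                      # interior nodes only
        print(f"Z = X{z + 1}: I(X; Y | Z) = "
              f"{cmi(data[:, x], data[:, y], data[:, z]):.4f}")
```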
Experiments to Verify the Conjectures
- Conjecture: if d(x, y) > d(x, z), then I(x; y) < I(x; z) asymptotically
- Method: shortest path vs. normalized mutual information
- Data: 5,000 samples from each of two 100-node networks (1 scale-free and 1 modular)
Mutual Information

In probability theory and information theory, the mutual information (or transinformation) of two random variables is a quantity that measures the mutual dependence of the two variables.
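For discrete variables, the standard definitions, consistent with the plug-in estimates used in the experiments above, can be written as:

```latex
I(X;Y) = \sum_{x}\sum_{y} p(x,y)\,\log\frac{p(x,y)}{p(x)\,p(y)},
\qquad
I(X;Y \mid Z) = \sum_{z} p(z)\, I(X;Y \mid Z = z).
```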
Updates

- The code for the conditional MI has a bug.
- Could not find any built-in function for calculating MI/cMI in MATLAB or R.
- The 'empirical' (plug-in) MI/cMI calculation is biased; the bias needs to be reduced (one standard correction is sketched below).
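One standard way to reduce the plug-in bias is the Miller-Madow correction, which adds (K - 1) / (2N) to each plug-in entropy term, where K is the number of occupied histogram cells and N is the sample size. Applying it here is our suggestion, not something the slides specify:

```python
import numpy as np

def entropy_mm(*cols):
    """Miller-Madow corrected entropy: the plug-in estimate plus
    (K - 1) / (2N), where K is the number of occupied cells in the
    (joint) histogram and N is the sample size."""
    _, counts = np.unique(np.column_stack(cols), axis=0, return_counts=True)
    n = counts.sum()
    p = counts / n
    return -np.sum(p * np.log(p)) + (len(counts) - 1) / (2 * n)

def mi_mm(x, y):
    """Bias-reduced mutual information via corrected entropies."""
    return entropy_mm(x) + entropy_mm(y) - entropy_mm(x, y)

def cmi_mm(x, y, z):
    """Bias-reduced conditional MI I(X; Y | Z)."""
    return (entropy_mm(x, z) + entropy_mm(y, z)
            - entropy_mm(x, y, z) - entropy_mm(z))
```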