1
Regulation Analysis using Restricted Boltzmann Machines
Network Modeling Seminar, 10/1/2013 Patrick Michl
2
Agenda Biological Problem Analysing the regulation of metabolism Modeling Implementation & Results
3
Biological Problem Analysing the regulation of metabolism
Signal Regulation Metabolism A linear metabolic pathway of enzymes (E) …
4
Biological Problem Analysing the regulation of metabolism
Signal Regulation Metabolism … is regulated by transcription factors (TF) …
5
Biological Problem Analysing the regulation of metabolism
Signal Regulation Metabolism … which respond to signals (S)
6
Biological Problem Analysing the regulation of metabolism
Upregulated linear pathways …
7
Biological Problem Analysing the regulation of metabolism
… can appear in different patterns
8
Biological Problem Analysing the regulation of metabolism
? ? E Which transcription factors and signals cause these patterns …
9
Biological Problem Analysing the regulation of metabolism
? ? ? ? E … and how do they interact? (topological structure)
10
Agenda Biological Problem Analysing the regulation of metabolism Network Modeling Restricted Boltzmann Machines (RBM) Validation & Implementation
11
Network Modeling Restricted Boltzmann Machines (RBM)
TF E Let's start with a pathway of interest …
12
Network Modeling Restricted Boltzmann Machines (RBM)
TF E … and lists of candidate TFs and signaling molecules (SigMols)
13
Network Modeling Restricted Boltzmann Machines (RBM)
TF E How to model the topological structure?
14
Network Modeling Restricted Boltzmann Machines (RBM)
Graphical Models Graphical Models can preserve topological structures …
15
Network Modeling Restricted Boltzmann Machines (RBM)
Graphical Models Directed Graph Undirected Graph … … but there are many types of graphical models
16
Network Modeling Restricted Boltzmann Machines (RBM)
Graphical Models Directed Graph Bayesian Networks Undirected Graph … The most common type is the Bayesian Network (BN) …
17
Network Modeling Restricted Boltzmann Machines (RBM)
Bayesian Networks [table: joint probabilities P[a,b,c] for assignments of a, b, c: 0.1, 0.9, 0.5, …] a b ? c Bayesian Networks use joint probabilities …
18
Network Modeling Restricted Boltzmann Machines (RBM)
Bayesian Networks [table: joint probabilities P[a,b,c] for assignments of a, b, c: 0.1, 0.9, 0.5, …] a b c … to represent conditional dependencies in an acyclic graph …
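As a sketch of how such a joint distribution factorizes, here is a toy three-node network a → b, a → c (the CPT values are made up for illustration; the slide's own table values are incomplete):

```python
# Hypothetical conditional probability tables for a toy BN a -> b, a -> c.
P_a = {1: 0.5, 0: 0.5}                       # prior on a
P_b_given_a = {1: {1: 0.9, 0: 0.1},          # P(b | a)
               0: {1: 0.1, 0: 0.9}}
P_c_given_a = {1: {1: 0.8, 0: 0.2},          # P(c | a)
               0: {1: 0.3, 0: 0.7}}

def joint(a, b, c):
    """Joint probability via the chain-rule factorization P(a)P(b|a)P(c|a)."""
    return P_a[a] * P_b_given_a[a][b] * P_c_given_a[a][c]

# The eight joint probabilities must sum to one.
total = sum(joint(a, b, c) for a in (0, 1) for b in (0, 1) for c in (0, 1))
```

The factorization is what lets a BN store a full joint distribution compactly, one conditional table per node.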
19
Network Modeling Restricted Boltzmann Machines (RBM)
Bayesian Networks b a d c … but the regulation mechanism of a cell can be more complicated
20
Network Modeling Restricted Boltzmann Machines (RBM)
Graphical Models Directed Graph Bayesian Networks Undirected Graph Markov Random Fields … Another type of graphical models are Markov Random Fields (MRF)…
21
Network Modeling Restricted Boltzmann Machines (RBM)
Markov Random Fields Motivation (Ising model): a set of magnetic dipoles (spins) is arranged in a graph (lattice) where neighbors are coupled with a given strength … which emerged with the Ising model from statistical physics …
22
Network Modeling Restricted Boltzmann Machines (RBM)
Markov Random Fields Motivation (Ising model): a set of magnetic dipoles (spins) is arranged in a graph (lattice) where neighbors are coupled with a given strength … which uses local energies to calculate new states …
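A minimal sketch of this local-energy update (the heat-bath rule is one standard choice for Ising dynamics; all names here are illustrative, not from the talk):

```python
import math
import random

def local_energy(spins, neighbors, i, J=1.0):
    """Local energy of spin i: -J * s_i * (sum of neighboring spins)."""
    return -J * spins[i] * sum(spins[j] for j in neighbors[i])

def glauber_step(spins, neighbors, i, T=1.0):
    """Flip spin i with the heat-bath probability computed from the
    energy change of the flip -- only local information is needed."""
    dE = -2.0 * local_energy(spins, neighbors, i)  # energy change if i flips
    if random.random() < 1.0 / (1.0 + math.exp(dE / T)):
        spins[i] *= -1
    return spins
```

The key point for the slides that follow: each unit's new state depends only on its neighbors through a local energy, which is exactly the mechanism RBMs inherit.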
23
Network Modeling Restricted Boltzmann Machines (RBM)
Markov Random Fields Drawback: by allowing cyclic dependencies the computational costs explode b a d c … the drawback is high computational cost …
24
Network Modeling Restricted Boltzmann Machines (RBM)
Graphical Models Directed Graph Bayesian Networks Undirected Graph Markov Random Fields Restricted Boltzmann Machines (RBM) … … which can be avoided by using Restricted Boltzmann Machines
25
Network Modeling Restricted Boltzmann Machines (RBM)
Neuron-like units RBMs are artificial neural networks …
26
Network Modeling Restricted Boltzmann Machines (RBM)
v1 v2 v3 v4 h1 h2 h3 … with two layers: visible units (v) and hidden units (h)
27
Network Modeling Restricted Boltzmann Machines (RBM)
v1 v2 v3 v4 h1 h2 h3 Visible units are connected only to hidden units (bipartite: no connections within a layer)
28
Network Modeling Restricted Boltzmann Machines (RBM)
V ≔ set of visible units; x_v ≔ value of unit v, x_v ∈ ℝ ∀ v ∈ V. In our model the visible units have continuous values …
29
Network Modeling Restricted Boltzmann Machines (RBM)
V ≔ set of visible units; x_v ≔ value of unit v, x_v ∈ ℝ ∀ v ∈ V. H ≔ set of hidden units; x_h ≔ value of unit h, x_h ∈ {0, 1} ∀ h ∈ H … and the hidden units binary values
30
Network Modeling Restricted Boltzmann Machines (RBM)
x_v ~ N(b_v + Σ_h w_vh x_h, σ_v) ∀ v ∈ V, with σ_v ≔ std. dev. of unit v, b_v ≔ bias of unit v, w_vh ≔ weight of edge (v, h). Visible units are modeled with Gaussians to encode data …
31
Network Modeling Restricted Boltzmann Machines (RBM)
x_v ~ N(b_v + Σ_h w_vh x_h, σ_v) ∀ v ∈ V, with σ_v ≔ std. dev. of unit v, b_v ≔ bias of unit v, w_vh ≔ weight of edge (v, h). x_h ~ sigmoid(b_h + Σ_v w_vh x_v / σ_v) ∀ h ∈ H, with b_h ≔ bias of unit h … and hidden units with sigmoids to encode dependencies
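The two sampling rules above can be sketched in NumPy (a minimal sketch: function names and array shapes are my own choices, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_hidden(x_v, W, b_h, sigma_v):
    """Binary hidden units: h ~ Bernoulli(sigmoid(b_h + (x_v / sigma_v) @ W))."""
    p = 1.0 / (1.0 + np.exp(-(b_h + (x_v / sigma_v) @ W)))
    return (rng.random(p.shape) < p).astype(float), p

def sample_visible(x_h, W, b_v, sigma_v):
    """Continuous visible units: v ~ N(b_v + W @ x_h, sigma_v)."""
    mean = b_v + W @ x_h
    return mean + sigma_v * rng.standard_normal(mean.shape), mean
```

Here `W` is the (visible x hidden) weight matrix; dividing the visible values by `sigma_v` matches the scaled interaction term in the formulas above.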
32
Network Modeling Restricted Boltzmann Machines (RBM)
Learning in Restricted Boltzmann Machines Task: Find dependencies in data ↔ Find configuration of parameters with maximum likelihood (to data) The challenge is to find the configuration of the parameters …
33
Network Modeling Restricted Boltzmann Machines (RBM)
Learning in Restricted Boltzmann Machines Task: Find dependencies in data ↔ Find configuration of parameters with maximum likelihood (to data). In RBMs parameter configurations have probabilities that can be defined by local energies. Local Energy: E_v ≔ −Σ_h w_vh (x_v / σ_v) x_h + (x_v − b_v)² / (2σ_v²), E_h ≔ −b_h x_h. Like in the Ising model, the unit states correspond to local energies …
34
Network Modeling Restricted Boltzmann Machines (RBM)
Learning in Restricted Boltzmann Machines Task: Find dependencies in data ↔ Find configuration of parameters with maximum likelihood (to data) ↔ Minimize global energy (to data). Global Energy: E ≔ Σ_v E_v + Σ_h E_h = −Σ_v Σ_h w_vh (x_v / σ_v) x_h + Σ_v (x_v − b_v)² / (2σ_v²) − Σ_h b_h x_h. The local energies sum to a global energy, which is our objective function
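The global energy can be written as a small function (a sketch following the Gaussian-Bernoulli energy as reconstructed above; all names are illustrative):

```python
import numpy as np

def global_energy(x_v, x_h, W, b_v, b_h, sigma_v):
    """E = sum_v (x_v - b_v)^2 / (2 sigma_v^2)     (Gaussian visible term)
         - sum_h b_h x_h                           (hidden bias term)
         - sum_{v,h} w_vh (x_v / sigma_v) x_h      (interaction term)"""
    visible = np.sum((x_v - b_v) ** 2 / (2.0 * sigma_v ** 2))
    hidden = -np.dot(b_h, x_h)
    interaction = -(x_v / sigma_v) @ W @ x_h
    return visible + hidden + interaction
```

Low energy corresponds to high probability, so fitting the data means shifting probability mass (lowering energy) toward observed configurations.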
35
Network Modeling Restricted Boltzmann Machines (RBM)
Learning in Restricted Boltzmann Machines Task: Find dependencies in data ↔ Find configuration of parameters with maximum likelihood (to data) ↔ Minimize global energy (to data) ↔ Perform stochastic gradient descent on 𝜎 𝑣 , 𝑏 𝑣 , 𝑏 ℎ , 𝑤 𝑣ℎ (to data) The optimization can be done using stochastic gradient descent …
36
Network Modeling Restricted Boltzmann Machines (RBM)
Learning in Restricted Boltzmann Machines Task: Find dependencies in data ↔ Find configuration of parameters with maximum likelihood (to data) ↔ Minimize global energy (to data) ↔ Perform stochastic gradient descent on σ_v, b_v, b_h, w_vh (to data). Gradient Descent on RBMs: the bipartite graph structure allows contrastive divergence learning using Gibbs sampling … which has an efficient learning algorithm
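One contrastive-divergence (CD-1) update can be sketched as follows. The talk does not specify its exact variant, so this follows the common mean-field reconstruction for Gaussian visible units; all names are illustrative:

```python
import numpy as np

rng = np.random.default_rng(1)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def cd1_step(X, W, b_v, b_h, sigma_v, lr=0.01):
    """One CD-1 update on a data batch X (rows = samples).
    Positive phase uses the data, negative phase one Gibbs reconstruction."""
    # positive phase: hidden probabilities driven by the data
    p_h0 = sigmoid(b_h + (X / sigma_v) @ W)
    h0 = (rng.random(p_h0.shape) < p_h0).astype(float)
    # negative phase: reconstruct visibles (Gaussian means), then hidden again
    X1 = b_v + h0 @ W.T
    p_h1 = sigmoid(b_h + (X1 / sigma_v) @ W)
    # updates: difference between data-driven and model-driven statistics
    n = X.shape[0]
    W += lr * ((X / sigma_v).T @ p_h0 - (X1 / sigma_v).T @ p_h1) / n
    b_v += lr * np.mean(X - X1, axis=0)
    b_h += lr * np.mean(p_h0 - p_h1, axis=0)
    return W, b_v, b_h
```

Because the graph is bipartite, each Gibbs half-step samples a whole layer at once, which is what makes this cheap compared to a general MRF.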
37
Network Modeling Restricted Boltzmann Machines (RBM)
TF E How to model our initial structure as an RBM?
38
Network Modeling Restricted Boltzmann Machines (RBM)
TF E We define S and E as the visible layer …
39
Network Modeling Restricted Boltzmann Machines (RBM)
TF We define S and E as the visible layer …
40
Network Modeling Restricted Boltzmann Machines (RBM)
TF … and TF as the hidden layer
41
Agenda Biological Problem Analysing the regulation of metabolism Network Modeling Restricted Boltzmann Machines (RBM) Implementation & Results python::metapath
42
Results Validation of the results Information about the true regulation Information about the descriptive power of the data
43
Results Validation of the results Information about the true regulation Information about the descriptive power of the data Without this information, validation can only be done using simulated data!
44
Results Simulation 1 First of all we need to understand how the model handles dependencies and noise. To demonstrate this we create very simple data with a simple structure
45
Simulation 1 S TF E What can we expect from this model?
46
Simulation 1 S E TF … as RBM we get 8 visible and 2 hidden units, fully connected
47
Simulation 1 Data S E 1,0,0,1 1,0,0,0 1,1,0,0 1,0,1,0 1,0,1,1 0,0,0,0 0,1,0,0 0,0,1,0 0,0,0,1 Let's feed the machine with samples …
48
Simulation 1 Weight matrix TF1 TF2 S1 0.3 0.8 S2 0.5 0.6 S3 1.0 0.1 S4 E1 0.0 E2 E3 E4 0.2 … to get the calculated parameters (especially the weight matrix)
49
Simulation 1 Weight matrix S TF1 TF2 S1 0.3 0.8 S2 0.5 0.6 S3 1.0 0.1 S4 E1 0.0 E2 E3 E4 0.2 TF E The weights are visualized by the intensity of the edges
50
Simulation 1 Learning samples S S E 1,0,0,1 1,0,0,0 1,1,0,0 1,0,1,0 1,0,1,1 0,0,0,0 0,1,0,0 0,0,1,0 0,0,0,1 TF E Now we can compare the results with the samples
51
Simulation 1 Learning samples S S E 1,0,0,1 1,0,0,0 1,1,0,0 1,0,1,0 1,0,1,1 0,0,0,0 0,1,0,0 0,0,1,0 0,0,0,1 TF E There's a strong dependency between S3 and E1
52
Simulation 1 Learning samples S S E 1,0,0,1 1,0,0,0 1,1,0,0 1,0,1,0 1,0,1,1 0,0,0,0 0,1,0,0 0,0,1,0 0,0,0,1 TF E S1, S2 and S4 have almost no effect on the metabolism …
53
Simulation 1 S TF E … so we can drop them and keep S3, TF1 for our regulation model
54
Simulation 1 Weight matrix TF1 TF2 S1 0.3 0.8 S2 0.5 0.6 S3 1.0 0.1 S4 E1 0.0 E2 E3 E4 0.2 We can also take a look at the causal mechanism …
55
Simulation 1 Weight matrix TF1 TF2 S1 0.3 0.8 S2 0.5 0.6 S3 1.0 0.1 S4 E1 0.0 E2 E3 E4 0.2 The edge (S3, TF1) dominates TF1 …
56
Simulation 1 [causal graph: s3 → TF1 → e1, e2, e3, e4] Also E1 seems to have an effect on S3 (weaker than that of S3 on E1)
57
Results Comparing to Bayesian Networks For this purpose we simulate data in three steps Of course we want to compare the method with Bayesian Networks
58
Results Comparing to Bayesian Networks Step 1 Choose number of genes (E+S) and create random bimodally distributed data
59
Results Comparing to Bayesian Networks Step 1 Choose number of genes (E+S) and create random bimodally distributed data Step 2 Manipulate data in a fixed order
60
Results Comparing to Bayesian Networks Step 1 Choose number of genes (E+S) and create random bimodally distributed data Step 2 Manipulate data in a fixed order Step 3 Add noise to manipulated data and normalize data
61
Results Comparing to Bayesian Networks Idea: 'melt down' the bimodal distribution from very sharp to very noisy. Try to find the original causal structure with BN and RBM. Measure accuracy by counting the right and wrong dependencies
62
Results Simulation 2 Step 1: Number of visible nodes 8 (4 E, 4 S). Create a graded series of datasets from sharp to noisy bimodal distribution: σ₁ = 0.0, σ₂ = 0.3, σ₃ = 0.9, σ₄ = 1.2, σ₅ = 1.5. Step 2 + 3: Data manipulation + noise: e₁ = 0.5·s₁s₂ + N(μ=0, σ), e₂ = 0.5·s₂s₃ + N(μ=0, σ), e₃ = 0.5·s₃s₄ + N(μ=0, σ), e₄ = 0.5·s₄s₁ + N(μ=0, σ)
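The three simulation steps can be sketched as follows (assuming the reading e_i = 0.5·s_i·s_{i+1} of the garbled slide formulas; the function name and sample count are illustrative):

```python
import numpy as np

rng = np.random.default_rng(2)

def simulate(n_samples=1000, sigma=0.3):
    """Three-step simulation sketch: bimodal signals, fixed manipulation
    e_i = 0.5 * s_i * s_{i+1} (cyclic), then noise and normalization."""
    # step 1: random bimodal signal data for 4 signals (two modes at 0 and 1)
    S = rng.choice([0.0, 1.0], size=(n_samples, 4)) + sigma * rng.standard_normal((n_samples, 4))
    # step 2: manipulate in a fixed order (cyclic pairing of signals)
    E = 0.5 * S * np.roll(S, -1, axis=1)
    # step 3: add noise, then normalize each column to zero mean, unit variance
    E += sigma * rng.standard_normal(E.shape)
    data = np.hstack([S, E])
    return (data - data.mean(axis=0)) / data.std(axis=0)
```

Sweeping `sigma` over the five values above reproduces the 'melt down' from sharp to noisy distributions.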
63
Results Simulation 2 RBM Model (𝜎=0.0)
64
Results Simulation 2 Causal Mechanism (𝜎=0.0)
65
Results Simulation 2 Comparison BN / RBM
66
Conclusion RBMs are more stable against noise than BNs. It can be assumed that RBMs have high predictive power regarding the regulation mechanisms of cells. The drawback is high computational cost. Since RBMs are becoming more popular (face recognition, voice recognition, image transformation), many improvements addressing the computational costs have been made.
67
Acknowledgement eilsLABS PD Dr. Rainer König Prof. Dr. Roland Eils Network Modeling Group