Ensuring Security in Critical Infrastructures with Probabilistic Relational Modeling, Part 2
Summer School "Hot Topics in Cyber Security", Rome
Ralf Möller, joint work with Tanya Braun and Marcel Gehrke
Universität zu Lübeck, Institute of Information Systems
Agenda
Part 1: Propositional probabilistic models
Part 2: Relational probabilistic models (for cyber security)
- How to increase modeling power
- Formal inference problems for LLC (and HOC)
- How to achieve reasoning scalability
- How to support learning (to some extent)
Defining Probability Distributions
Given: states s ∈ {s1, s2, …, sn} with distribution p = (p1, p2, …, pn).
Maximum Entropy Principle: without further information, select p such that the entropy is maximized subject to k constraints (expected values of features f_i):
maximize −∑_s p(s) log p(s)
subject to ∑_s p(s) f_i(s) = D_i for i ∈ {1..k} (and ∑_s p(s) = 1).
Maximum Entropy Principle
Maximize the Lagrange functional to determine p:
L = −∑_s p(s) log p(s) + ∑_{i=1}^{k} λ_i (∑_s p(s) f_i(s) − D_i) + μ (∑_s p(s) − 1)
Setting the partial derivatives of L w.r.t. p(s) to zero yields
p(s) = (1/Z) exp(∑_{i=1}^{k} λ_i f_i(s)),
where Z is for normalization (Boltzmann-Gibbs distribution); the f_i are features, the λ_i their weights.
"Global" modeling: additions/changes to constraints/rules influence the whole joint probability distribution. States are mutually exclusive.
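The closed form above is easy to sanity-check numerically. A minimal sketch; the state space, feature functions, and weights are invented purely for illustration:

```python
import math

# Hypothetical state space over two binary variables and two feature
# functions with weights (the Lagrange multipliers lambda_i).
states = [(a, b) for a in (0, 1) for b in (0, 1)]
features = [lambda s: float(s[0]), lambda s: float(s[0] == s[1])]
weights = [0.5, 1.2]

def gibbs(states, features, weights):
    # p(s) = exp(sum_i lambda_i * f_i(s)) / Z  (Boltzmann-Gibbs form)
    unnorm = [math.exp(sum(w * f(s) for w, f in zip(weights, features)))
              for s in states]
    Z = sum(unnorm)
    return [u / Z for u in unnorm]

p = gibbs(states, features, weights)
```

Changing any weight rescales the probability of every state through Z, which is exactly the "global" modeling effect noted above.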
Propositional Models: Worlds
Characterise a world by random variables, e.g., Epid with possible values (range) r(Epid) = {true, false}. One world = specific values for all variables; each world has a probability, yielding a joint probability distribution P_F over all possible worlds.
Vocabulary: Epid, Nat.flood, Nat.fire, Man.war, Man.virus, Travel.eve, Sick.eve, Treat.eve.m1, Treat.eve.m2. With 9 binary variables there are 2^9 = 512 possible worlds.
Propositional Models: Factors
The full joint P_F is a product of factors (Hammersley-Clifford theorem):
P_F = (1/Z) ∏_{i=1}^{n} f_i, with Z = ∑_{v ∈ r(rv(F))} ∏_{i=1}^{n} f_i(π_{rv(f_i)}(v)),
where rv(F) and rv(f) denote the random variables in model F and factor f, and r(X) the range of X. Multiplication is a join over shared arguments.
Sparse encoding! The factor graph with factors f11, f12, f13, f14, f21, f31, f32 over Nat.flood, Nat.fire, Man.war, Man.virus, Epid, Travel.eve, Sick.eve, Treat.eve.m1, Treat.eve.m2 needs only 7 · 2^3 = 56 entries.
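The product-of-factors definition can be spelled out on a toy model. The two factors and their potentials below are invented solely to mirror the formula:

```python
from itertools import product

# Two binary variables A, B; two factors with made-up potentials.
fA = {(0,): 1.0, (1,): 3.0}                                 # f1(A)
fAB = {(0, 0): 2.0, (0, 1): 1.0, (1, 0): 1.0, (1, 1): 4.0}  # f2(A, B)

# P_F(a, b) = (1/Z) * f1(a) * f2(a, b); Z sums the product over all
# assignments, exactly the normalisation constant on the slide.
unnorm = {(a, b): fA[(a,)] * fAB[(a, b)]
          for a, b in product((0, 1), repeat=2)}
Z = sum(unnorm.values())
P = {v: w / Z for v, w in unnorm.items()}
```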
Undirected Propositional Models
Factors are potentials, e.g., f21(Travel.eve, Sick.eve, Epid). A potential table denotes the compatibility of value combinations, assigning each row a weight such as 5, 4, 6, 2, or 9.
Query Answering (QA): Queries
Marginal distribution: P(Sick.eve), P(Travel.eve, Treat.eve.m1)
Conditional distribution: P(Sick.eve | Epid), P(Epid | Sick.eve = true)
Most probable assignment: argmax_{v ∈ r(R)} ∑_{v′ ∈ r(rv(F)∖R)} P_F(v, v′) for R ⊆ rv(F)
MPE: R = rv(F); MAP: e.g., R = {Travel.eve, Man.virus}
Note: argmax and sum are not commutative.
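That argmax and sum do not commute (so a MAP answer is not simply the projection of the MPE) can be seen on a 2×2 toy table; the numbers are invented:

```python
import numpy as np

# Unnormalised joint over two binary variables A (rows) and B (columns).
P = np.array([[0.35, 0.05],
              [0.20, 0.25]])

# MAP for A: sum out B first, then take the argmax.
map_a = int(np.argmax(P.sum(axis=1)))
# MPE: argmax over the full joint, then project onto A.
mpe_a = int(np.unravel_index(np.argmax(P), P.shape)[0])
# map_a and mpe_a disagree here: summing and maximising do not commute.
```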
QA: Variable Elimination (VE)
Zhang and Poole (1994), Dechter (1999)
Eliminate all variables not appearing in the query by summing them out. E.g., for the marginal P(Travel.eve):
P(Travel.eve) ∝ ∑_{e ∈ r(Epid)} ∑_{s ∈ r(Sick.eve)} ∑_{m1 ∈ r(Treat.eve.m1)} ∑_{m2 ∈ r(Treat.eve.m2)} ∑_{o ∈ r(Nat.flood)} ∑_{i ∈ r(Nat.fire)} ∑_{w ∈ r(Man.war)} ∑_{v ∈ r(Man.virus)} P_F
i.e., sum out all variables but Travel.eve.
QA: Variable Elimination (VE)
Zhang and Poole (1994)
P(Travel.eve) ∝ ∑_{e} ∑_{s} f21(Travel.eve, e, s) · [∑_{m1} f31(e, s, m1)] · [∑_{m2} f32(e, s, m2)] · ∑_{o} ∑_{i} ∑_{w} f11(o, w, e) f12(i, w, e) [∑_{v} f13(o, v, e) f14(i, v, e)]
Eliminating m1 and m2 is identical in structure. Summing out v over the product f13(o, v, e) f14(i, v, e) yields f1′(o, i, e); summing out o, i, and w then yields f1′(e).
QA: Variable Elimination (VE)
Zhang and Poole (1994)
P(Travel.eve) ∝ ∑_{e ∈ r(Epid)} f1′(e) ∑_{s ∈ r(Sick.eve)} f21(Travel.eve, e, s) · f3′(e, s) · f3′′(e, s)
= ∑_{e ∈ r(Epid)} f1′(e) ∑_{s ∈ r(Sick.eve)} f′(Travel.eve, e, s)
where f′ is the product of f21, f3′, and f3′′; its table over (Travel.eve, Epid, Sick.eve) holds the values 50, 10, 34, 16, 54, 36, 12, 69.
QA: Variable Elimination (VE)
Zhang and Poole (1994)
Summing out Sick.eve from f′(Travel.eve, Epid, Sick.eve) adds up the entries that agree on Travel.eve and Epid: the pairs (50, 10), (34, 16), (54, 36), (12, 69) yield f′′(Travel.eve, Epid) with entries 60, 50, 90, 81.
QA: Variable Elimination (VE)
Zhang and Poole (1994)
P(Travel.eve) ∝ ∑_{e ∈ r(Epid)} f1′(e) f′′(Travel.eve, e) = ∑_{e ∈ r(Epid)} f′′′(Travel.eve, e)
With f1′(Epid) = (false: 1.5, true: 2), multiplying into f′′ (entries 60, 50, 90, 81) gives f′′′ with entries 90, 100, 135, 162.
QA: Variable Elimination (VE)
Zhang and Poole (1994)
Summing out Epid from f′′′ (entries 90, 100, 135, 162) yields the unnormalised result f(Travel.eve) = (false: 190, true: 297).
QA: Variable Elimination (VE)
Zhang and Poole (1994)
Normalising f(Travel.eve) = (false: 190, true: 297) gives the final answer f_n(Travel.eve) = (false: 0.39, true: 0.61), i.e., P(Travel.eve) = f_n(Travel.eve).
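The elimination steps above can be replayed numerically with the tables from the example (assuming the axis order (Travel.eve, Epid, Sick.eve) for f′):

```python
import numpy as np

# f'(Travel.eve, Epid, Sick.eve) and f1'(Epid) from the worked example.
f_prime = np.array([[[50., 10.], [34., 16.]],
                    [[54., 36.], [12., 69.]]])
f1_prime = np.array([1.5, 2.0])

f_pp = f_prime.sum(axis=2)        # sum out Sick.eve -> f''(Travel, Epid)
f_ppp = f_pp * f1_prime           # multiply in f1'(Epid) -> f'''
unnorm = f_ppp.sum(axis=1)        # sum out Epid -> f(Travel.eve)
p_travel = unnorm / unnorm.sum()  # normalise -> f_n(Travel.eve)
```

Running this reproduces (190, 297) before normalisation and approximately (0.39, 0.61) afterwards, matching the tables on the slides.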
QA: Variable Elimination (VE)
Zhang and Poole (1994)
For a conditional such as P(Travel.eve | Epid = true): absorb Epid = true in all factors, then sum out the remaining non-query variables.
QA: Evidence Absorption
Zhang and Poole (1994)
Absorb Epid = true in all factors, e.g., in f21: zero out (or drop) all rows with Epid = false, possibly eliminating the variable Epid from the factor so that the reduced table ranges over Travel.eve and Sick.eve only. Then eliminate the remaining variables as before.
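Absorption itself is just slicing the factor's table along the evidence axis. A sketch with a hypothetical potential table, assuming axis order (Travel.eve, Epid, Sick.eve):

```python
import numpy as np

# Hypothetical potentials for f21 over (Travel.eve, Epid, Sick.eve).
f21 = np.array([[[5., 4.], [6., 2.]],
                [[9., 1.], [3., 7.]]])

# Absorb Epid = true (index 1): keep only the matching slice; the
# variable Epid drops out, leaving a factor over (Travel.eve, Sick.eve).
f21_absorbed = f21[:, 1, :]
```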
Problem: Models Explode
For a single person (eve): 7 · 2^3 = 56 entries in 7 factors over 9 variables.
Problem: Models Explode
Adding alice and bob duplicates all person-specific factors: 13 · 2^3 = 104 entries in 13 factors over 17 variables, and the model keeps growing with every further individual.
Solution: Lifting
Poole (2003)
Parameterised random variables (PRVs) carry logical variables, e.g., X with possible values (domain) D(X) = {alice, eve, bob}. PRVs are the inputs to factors: Epid, Nat(D), Man(W), Travel(X), Sick(X), Treat(X, M).
Solution: Lifting
Poole (2003)
Factors with PRVs are parfactors, forming a (graphical) model G, e.g., g2(Travel(X), Epid, Sick(X)). Sparse encoding! Only 3 · 2^3 = 24 entries in 3 parfactors over 6 PRVs, independent of the domain sizes.
Solution: Lifting
Poole (2003)
Grounding substitutes domain constants for the logical variables, e.g., gr(g2) = {f21, f22, f23}: one factor per person (eve, alice, bob), all sharing the same potential table as g2.
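Grounding a parfactor is mechanical: substitute every combination of domain constants for the logical variables, with all groundings sharing one potential table. A sketch; the table stub is hypothetical:

```python
from itertools import product

# Domain of the logical variable X and a shared potential table for g2
# (contents abbreviated; every grounding references the same table).
domain_X = ["alice", "eve", "bob"]
g2_table = {("false", "false", "false"): 5.0}

def ground(logvar_domains, table):
    # One propositional factor per tuple of constants.
    return {consts: table for consts in product(*logvar_domains)}

factors = ground([domain_X], g2_table)  # gr(g2): f21, f22, f23
```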
Solution: Lifting
Poole (2003)
The joint probability distribution P_G is defined by grounding:
P_G = (1/Z) ∏_{f ∈ gr(G)} f, with Z = ∑_{v ∈ r(rv(gr(G)))} ∏_{f ∈ gr(G)} f(π_{rv(f)}(v))
What did we win? We can use the lifted representation during computations and AVOID GROUNDING.
Recap: Dependency Networks
All devices can be treated identically
Application to Cyber-Security
QA: Queries
Avoid groundings!
Marginal distribution: P(Sick(eve)), P(Travel(eve), Treat(eve, m1))
Conditional distribution: P(Sick(eve) | Epid), P(Epid | Sick(eve) = true)
Most probable assignment: argmax_{v ∈ r(R)} ∑_{v′ ∈ r(rv(gr(G))∖R)} P_G(v, v′) for R ⊆ rv(gr(G))
MPE: R = rv(gr(G)); MAP: e.g., R = {Travel(eve), Man(virus)}
Note: argmax and sum are not commutative.
QA: Lifted VE (LVE)
Poole (2003), de Salvo Braz et al. (2005, 2006), Milch et al. (2008), Taghipour et al. (2013, 2013a)
Avoid groundings! Eliminate all variables not appearing in the query via lifted summing out: sum out a representative instance and exponentiate the result for the isomorphic instances. Preconditions exist! Count conversion is not part of this tutorial.
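The core LVE trick, summing out one representative instance and exponentiating for the isomorphic ones, can be checked against ground VE on a toy shared factor (potential values invented):

```python
import numpy as np

# Shared factor f(Epid, Sick(x)), identical for each of n interchangeable x.
f = np.array([[5., 4.],
              [6., 2.]])
n = 3  # |D(X)|

# Ground VE: sum out each Sick(x) separately, then multiply the n results.
ground_result = np.prod([f.sum(axis=1)] * n, axis=0)
# Lifted VE: sum out one representative instance and exponentiate.
lifted_result = f.sum(axis=1) ** n
```

The lifted computation touches the factor once regardless of n, which is where the scalability gain comes from.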
QA: LVE in Detail
Poole (2003), de Salvo Braz et al. (2005, 2006), Milch et al. (2008), Taghipour et al. (2013a, 2013b)
E.g., for the marginal P(Travel(eve)): split the model w.r.t. Travel(eve), separating eve (with Travel(eve), Sick(eve), Treat(eve, M)) from the remaining instances of X; eliminate all non-query variables; normalise.
Problem: Many Queries
A set of queries, e.g., P(Travel(eve)), P(Sick(bob)), P(Treat(eve, m1)), P(Epid), P(Nat(flood)), P(Man(virus)), as well as combinations of variables, all under evidence such as Sick(X′) = true for X′ ∈ {alice, eve}. (L)VE restarts from scratch for each query.
Our Research
Incremental algorithms and data structures to speed up query answering for multiple queries (message passing in a so-called junction tree); queries with variables; lifted MPE. LJT: Lifted Junction Tree Algorithm.
- Tanya Braun and Ralf Möller. Lifted Junction Tree Algorithm. In Proceedings of KI 2016: Advances in Artificial Intelligence, pages 30–42, 2016.
- Tanya Braun and Ralf Möller. Preventing Groundings and Handling Evidence in the Lifted Junction Tree Algorithm. In Proceedings of KI 2017: Advances in Artificial Intelligence, pages 85–98, 2017.
- Tanya Braun and Ralf Möller. Counting and Conjunctive Queries in the Lifted Junction Tree Algorithm. In Postproceedings of the 5th International Workshop on Graph Structures for Knowledge Representation and Reasoning, 2017.
- Tanya Braun and Ralf Möller. Adaptive Inference on Probabilistic Relational Models. In Proceedings of the 31st Australasian Joint Conference on Artificial Intelligence, 2018.
- Tanya Braun and Ralf Möller. Parameterised Queries and Lifted Query Answering. In IJCAI-18 Proceedings of the 27th International Joint Conference on Artificial Intelligence, 2018.
- Tanya Braun and Ralf Möller. Lifted Most Probable Explanation. In Proceedings of the International Conference on Conceptual Structures, 2018.
- Tanya Braun and Ralf Möller. Fusing First-order Knowledge Compilation and the Lifted Junction Tree Algorithm. In Proceedings of KI 2018: Advances in Artificial Intelligence, 2018.
- Tanya Braun. Rescued from a Sea of Queries: Exact Inference in Probabilistic Relational Models. Dissertation, Universität zu Lübeck, 2019, submitted.
QA: Most Probable Assignment
Dawid (1992), Dechter (1999), de Salvo Braz et al. (2006), Apsel and Brafman (2012), Braun and Möller (2018)
Assigning all variables: most probable explanation (MPE), argmax_{rv(G)} P_G. (L)VE uses maxing out instead of summing out; (L)JT calculates messages by maxing out. Isomorphic instances receive identical most probable assignments. As before: construction, evidence entering, message passing; only one message pass (from periphery to centre) is needed, after which the remaining variables are maxed out at the centre.
QA: Most Probable Assignment
Dechter (1999), Sharma et al. (2018), Braun and Möller (2018)
Assigning a subset of variables: maximum a posteriori (MAP), e.g., argmax_{Travel(eve), Epid} ∑ P_G, summing over the ranges of Nat(D), Man(W), Sick(X), Treat(X, P), and Travel(X) for X ≠ eve. MAP is the more general case of MPE, but ∑ and max are not commutative! As before: construction, evidence entering, message passing (with summing out). To answer a MAP query: obtain the submodel (as before), sum out the non-query variables, then max out the query variables. Assignments to complete parclusters are safe.
Propositional: Dynamic Model
A temporal pattern is instantiated and unrolled over time, and inference runs on the unrolled model. The pattern links Epid_{t−1} to Epid_t (factor f_E) and repeats the within-slice factors (f2, f3 over Travel.eve, Sick.eve, Treat.eve.m1, Treat.eve.m2, Epid) for each time step.
Dynamic Model: Inference Problems
Marginal distribution query P(A_π | E_{0:t}) w.r.t. the model:
- Hindsight: π < t (was there an epidemic t − π days ago?)
- Filtering: π = t (is there currently an epidemic?)
- Prediction: π > t (will there be an epidemic in π − t days?)
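The filtering case follows the standard forward recursion over the unrolled model: predict with the transition factor, absorb the observation, renormalise. A sketch; the transition and observation numbers are invented:

```python
import numpy as np

T = np.array([[0.9, 0.1],   # P(Epid_t | Epid_{t-1}); rows: previous state
              [0.2, 0.8]])
O = np.array([[0.7, 0.3],   # P(obs | Epid_t); columns: observation value
              [0.1, 0.9]])

def filter_forward(obs_seq, prior=np.array([0.5, 0.5])):
    # P(Epid_t | E_{0:t}): one predict/absorb/renormalise step per slice.
    belief = prior
    for o in obs_seq:
        belief = belief @ T          # predict one step ahead
        belief = belief * O[:, o]    # absorb the observation at t
        belief = belief / belief.sum()
    return belief

belief = filter_forward([1, 1, 0])
```

Prediction runs further `belief @ T` steps without observations; hindsight additionally requires a backward pass.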
Propositional: Dynamic Model
Problems with unrolling: a huge model (unrolled for T time steps), redundant temporal information and calculations, and redundant calculations when answering multiple queries.
Lifted: Dynamic Model
The same queries apply to the lifted dynamic model: marginal distribution query P(A_π | E_{0:t}) for hindsight (π < t: was there an epidemic t − π days ago?), filtering (π = t: is there currently an epidemic?), and prediction (π > t: will there be an epidemic in π − t days?), now over the parfactors g_E, g2, g3 with PRVs Travel(X)_t, Sick(X)_t, Treat(X, M)_t, and Epid_t.
Our Research: Lifted Dynamic Probabilistic Relational Models
LDJT: Lifted Dynamic Junction Tree Algorithm.
- Gehrke et al. (2018): Marcel Gehrke, Tanya Braun, and Ralf Möller. Lifted Dynamic Junction Tree Algorithm. In Proceedings of the International Conference on Conceptual Structures, 2018.
- Gehrke et al. (2018a): Marcel Gehrke, Tanya Braun, and Ralf Möller. Answering Hindsight Queries with Lifted Dynamic Junction Trees. In 8th International Workshop on Statistical Relational AI at the 27th International Joint Conference on Artificial Intelligence, 2018.
Cyber-Security Once Again
Recap: HMMs for Cyber-Security
(Figure: a hidden-state sequence over services such as DNS, load balancers (lbs), db, and webserver, with the corresponding observation sequence, e.g., as seen through a proxy.)
Application of PRMs
Determine what has happened given the available evidence, by reduction to formal inference problems: query answering and MPE. Determine higher-order event patterns by matching the MPE against higher-order patterns via longest common subsequence, with specific equivalence relations.
Something Wrong with These Models?
The distribution is only defined once constants are given: we need the domains of the logical variables (used in parameterised random variables), and transfer to different domains is not directly obvious.
And What We Might Do About It
Define a distribution over domain sizes and compute expected query answers.
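One way to read "distribution over domain sizes, expected query answers": average the (domain-size-dependent) query answer over a prior P(n). Both the prior and the answer function below are stand-ins:

```python
# Hypothetical prior over the unknown domain size n = |D(X)|.
size_dist = {2: 0.3, 5: 0.5, 10: 0.2}

def answer_given_size(n):
    # Stand-in for a lifted query answer as a function of n,
    # e.g. P(Epid) growing with the number of individuals.
    return 1 - 0.9 ** n

expected_answer = sum(p * answer_given_size(n)
                      for n, p in size_dist.items())
```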
Insights
Probabilistic Relational Models (PRMs) belong to Statistical Relational AI, aka Stochastic Relational AI (StaRAI). We have made substantial progress toward solving practical problems in cyber security: a solid theoretical framework and new algorithms for practical scalability. We hope to have convinced you that StaRAI is worth studying.
Overall References
- Apsel and Brafman (2012): Udi Apsel and Ronen I. Brafman. Exploiting Uniform Assignments in First-Order MPE. In Proceedings of the 28th Conference on Uncertainty in Artificial Intelligence, 2012.
- Dawid (1992): Alexander Philip Dawid. Applications of a General Propagation Algorithm for Probabilistic Expert Systems. Statistics and Computing, 2(1):25–36, 1992.
- Dechter (1999): Rina Dechter. Bucket Elimination: A Unifying Framework for Probabilistic Inference. In Learning and Inference in Graphical Models, pages 75–104. MIT Press, 1999.
- De Salvo Braz et al. (2005): Rodrigo de Salvo Braz, Eyal Amir, and Dan Roth. Lifted First-order Probabilistic Inference. In IJCAI-05 Proceedings of the 19th International Joint Conference on Artificial Intelligence, 2005.
- De Salvo Braz et al. (2006): Rodrigo de Salvo Braz, Eyal Amir, and Dan Roth. MPE and Partial Inversion in Lifted Probabilistic Variable Elimination. In AAAI-06 Proceedings of the 21st Conference on Artificial Intelligence, 2006.
I would like to thank the IFIS team for their work and contributions! The focus was on specific topics, and the references in this tutorial are necessarily incomplete. Apologies to anyone whose work I did not cite.
Overall References
- Milch et al. (2008): Brian Milch, Luke S. Zettelmoyer, Kristian Kersting, Michael Haimes, and Leslie Pack Kaelbling. Lifted Probabilistic Inference with Counting Formulas. In AAAI-08 Proceedings of the 23rd Conference on Artificial Intelligence, 2008.
- Poole (2003): David Poole. First-order Probabilistic Inference. In Proceedings of the 18th International Joint Conference on Artificial Intelligence, 2003.
- Sharma et al. (2018): Vishal Sharma, Noman Ahmed Sheikh, Happy Mittal, Vibhav Gogate, and Parag Singla. Lifted Marginal MAP Inference. In Proceedings of the 34th Conference on Uncertainty in Artificial Intelligence, 2018.
- Taghipour et al. (2013): Nima Taghipour, Jesse Davis, and Hendrik Blockeel. A Generalized Counting for Lifted Variable Elimination. In Proceedings of the International Conference on Inductive Logic Programming, pages 107–122, 2013.
- Taghipour et al. (2013a): Nima Taghipour, Daan Fierens, Jesse Davis, and Hendrik Blockeel. Lifted Variable Elimination: Decoupling the Operators from the Constraint Language. Journal of Artificial Intelligence Research, 47(1):393–439, 2013.
- Taghipour et al. (2013b): Nima Taghipour, Jesse Davis, and Hendrik Blockeel. First-order Decomposition Trees. In Advances in Neural Information Processing Systems 26, pages 1052–1060. Curran Associates, Inc., 2013.
- Zhang and Poole (1994): Nevin L. Zhang and David Poole. A Simple Approach to Bayesian Network Computations. In Proceedings of the 10th Canadian Conference on Artificial Intelligence, pages 171–178, 1994.