Random Regression: Example. Target query: P(gender(sam) = F)? Sam is friends with Bob and Anna. Oliver Schulte, Hassan Khosravi, Tianxiang Gao and Yuke Zhu, School of Computing Science, Simon Fraser University, Vancouver, Canada.


Bayes Net Inference: The Cyclicity Problem
In the presence of recursive dependencies (autocorrelations), a ground first-order Bayes net template may contain cycles (Schulte et al. 2012; Domingos and Richardson 2007).

Overview
How can Bayes net relational inference be defined when the ground network contains cycles?
1. Define Markov blanket probabilities: the probability of a target node value given its Markov blanket.
2. Random regression: compute the expected probability over a random instantiation of the Markov blanket.
3. Closed-form result: random regression is equivalent to a log-linear regression model that uses Markov blanket feature frequencies rather than counts.
4. Random regression works well with Bayes net parameters, so parameter estimation is very fast.

Relational Random Regression for Bayes Nets
Closed-Form Proposition: the random regression value is obtained by multiplying the probability associated with each Markov blanket state, raised to the frequency of that state.
Example:
P(g(sam) = F | mb)
= α P(cd(sam) = T | g(sam) = F) × [P(g(sam) = F | g(anna) = F, Fr(sam, anna) = T) × P(g(sam) = F | g(bob) = M, Fr(sam, bob) = T)]^(1/2)
= α · 70% × [60% × 40%]^(1/2)
≈ α · 0.343 = α · exp(ln 0.7 + ½(ln 0.6 + ln 0.4))

Relational regression in graphical models:
- Bayes net → dependency net, with the geometric mean as combining rule: a log-linear model with feature frequencies, i.e. random regression.
- Bayes net → Markov net, with standard Markov network regression: a log-linear model with feature counts.
  Example: P(g(sam) = F | mb) = α · 70% × 60% × 40% = α · 0.168.

Example Bayes net parameters (template over variables X, Y):
P(g(X) = F | g(Y) = F, F(X,Y) = T) = .6
P(g(X) = M | g(Y) = M, F(X,Y) = T) = .6
...
P(cd(X) = T | g(X) = F) = .7
P(cd(X) = T | g(X) = M) = .3

People table:
Name  Gender  Coffee Drinker
Anna  F       T
Sam   ?       F
Bob   M       F

[Figure: regression graph over coffee_dr(sam), Friend(sam, Y), gender(sam), gender(Y), contrasting the Bayes net → dependency net route (geometric mean; log-linear model with frequencies; random regression) with the Bayes net → Markov net route (product; log-linear model with counts).]

Evaluation
Use the Learn-and-Join Algorithm for Bayes net structure learning (Khosravi et al. 2010; Schulte and Khosravi 2012). Methods compared:
- MBN: convert the Bayes net to a Markov net; use Alchemy to learn weights in a log-linear model with Markov blanket feature counts.
- CP+Count: use log-conditional probabilities as weights in a log-linear model with Markov blanket feature counts.
- CP+Frequency: use log-conditional probabilities as weights in a log-linear model with Markov blanket feature frequencies (= random regression).
Metrics, averaged over the benchmark databases: learning time (smaller is better) and conditional log-likelihood (bigger is better).

Conclusion
- Random regression is a principled way to define relational Bayes net inference even with ground cycles.
- It has a closed-form evaluation: a log-linear model with feature frequencies.
- Bayes net conditional probabilities are fast to compute, interpretable, and local.
- Using feature frequencies rather than counts addresses the balancing problem: in the count model, features with more groundings carry exponentially more weight.

References
1. H. Khosravi, O. Schulte, T. Man, X. Xu, B. Bina. Structure learning for Markov logic networks with many descriptive attributes. AAAI, 2010, pp. 487–.
2. O. Schulte and H. Khosravi. Learning graphical models for relational data via lattice search. Machine Learning, 88(3), 2012.
3. O. Schulte, H. Khosravi, T. Man. Learning directed relational models with recursive dependencies. Machine Learning, 2012, forthcoming.
4. P. Domingos, M. Richardson. Markov logic: a unifying framework for statistical relational learning. In Statistical Relational Learning, 2007.
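The closed-form proposition can be checked numerically. Below is a minimal Python sketch (the function name and data structures are illustrative, not from the authors' implementation) that reproduces the two worked examples: the frequency (random regression) score 70% × (60% × 40%)^(1/2) ≈ 0.343 and the count score 70% × 60% × 40% = 0.168.

```python
import math

# Bayes net parameters from the poster's example.
P_GENDER_GIVEN_FRIEND = {          # P(g(X) = v | g(Y) = w, F(X,Y) = T)
    ("F", "F"): 0.6, ("F", "M"): 0.4,
    ("M", "M"): 0.6, ("M", "F"): 0.4,
}
P_COFFEE_GIVEN_GENDER = {"F": 0.7, "M": 0.3}   # P(cd(X) = T | g(X) = v)

def mb_score(gender, friend_genders, use_frequencies=True):
    """Unnormalized Markov blanket score for gender(sam) = gender,
    given cd(sam) = T and the genders of sam's friends.

    use_frequencies=True  -> random regression: the friend family enters
                             via its geometric mean (feature frequencies).
    use_frequencies=False -> Markov-net-style model (feature counts):
                             the friend family enters via its raw product.
    """
    score = P_COFFEE_GIVEN_GENDER[gender]        # child factor, one grounding
    family = [P_GENDER_GIVEN_FRIEND[(gender, g)] for g in friend_genders]
    product = math.prod(family)
    if use_frequencies:
        return score * product ** (1.0 / len(family))   # geometric mean
    return score * product                              # plain product

# Sam is friends with Anna (F) and Bob (M).
print(round(mb_score("F", ["F", "M"]), 3))                         # 0.343
print(round(mb_score("F", ["F", "M"], use_frequencies=False), 3))  # 0.168
```

Normalizing over the gender values (dividing by the sum of the scores for F and M) yields the conditional probability; α in the poster denotes this normalization constant.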

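The balancing problem noted in the conclusion can be illustrated with a toy calculation: under the count model, a feature family's influence scales exponentially with its number of groundings, while under the frequency model it is independent of the family size. A small sketch, using an illustrative per-grounding probability of 0.6 (this number is an assumption, not from the poster):

```python
p = 0.6  # probability contributed by each grounding of one feature (illustrative)

for n in (1, 10, 100):                # number of groundings, e.g. number of friends
    count_score = p ** n              # log-linear with counts: exponential in n
    freq_score = (p ** n) ** (1 / n)  # log-linear with frequencies: geometric mean
    print(n, count_score, freq_score)

# The count score collapses toward 0 as n grows (its log weight is n * ln p),
# while the frequency score stays at p = 0.6 for every family size.
```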