First-Order Probabilistic Inference Rodrigo de Salvo Braz SRI International

2 A remark Feel free to ask clarification questions!

3 Slides online: just search for “Rodrigo de Salvo Braz” and check the presentations page.

4 Outline. A goal: First-order Probabilistic Inference (An ultimate goal, Propositionalization); Lifted Inference (Inversion Elimination, Counting Elimination); DBLOG (BLOG background, DBLOG, Retro-instantiation, Improving BLOG inference); Future Directions and Conclusion.

5 Outline. A goal: First-order Probabilistic Inference (An ultimate goal, Propositionalization); Lifted Inference (Inversion Elimination, Counting Elimination); DBLOG (BLOG background, DBLOG, Retro-instantiation, Improving BLOG inference); Future Directions and Conclusion.

6 AI Problems [diagram]. Each AI problem (commonsense reasoning, Natural Language Processing, planning, vision) pairs its own knowledge (domain knowledge, language knowledge, domain & planning knowledge, objects & optics knowledge) with an inference-and-learning component; abstracting that component gives a single Inference and Learning module shared across all of them.

7 Inference and Learning: what capabilities should it have? It should represent and use: predicates (relations) and quantification over objects, e.g. ∀X male(X) ∨ female(X), ∀X male(X) ⇒ sick(X), sick(p121); varying degrees of uncertainty, e.g. ∀X { male(X) ⇒ sick(X) }, since such a rule may hold often but not absolutely (essential for language, vision, pattern recognition, commonsense reasoning, etc.); modal knowledge, utilities, and more.

8 Abstracting Inference and Learning. Commonsense reasoning uses domain knowledge: male(X) ∨ female(X), { male(X) ⇒ sick(X) }, sick(o121), ... Natural Language Processing uses language knowledge: { sentence(S) ∧ verb(S,”had”) ∧ object(S,”fever”) ∧ subject(S,X) ⇒ fever(X) }, ... Both share the same inference and learning module.

9 Abstracting Inference and Learning. Commonsense reasoning WITH Natural Language Processing: domain knowledge male(X) ∨ female(X), { male(X) ⇒ sick(X) }, sick(o121), ... together with language knowledge { sentence(S) ∧ verb(S,”had”) ∧ object(S,”fever”) ∧ subject(S,X) ⇒ fever(X) } and facts such as verb(s13,”had”), object(s13,”fever”), subject(s13,o121), ... all feed one Inference and Learning module; the joint solution is made simpler.

10 Standard solutions fall short. Logic has objects and properties, but its statements are absolute. Graphical models and machine learning have varying degrees of uncertainty, but are propositional; treating objects and quantification is awkward.

11 Standard solutions fall short: graphical models and objects. Propositionalizing male(X) ∨ female(X), { male(X) ⇒ sick(X) }, { friend(X,Y) ∧ male(X) ⇒ male(Y) }, sick(o121), sick(o135) yields a network over sick_o121, male_o121, female_o121, friend_o121_o135, male_o135, female_o135, sick_o135. The transformation (propositionalization) is not part of the graphical model framework; graphical models have only random variables, not objects; different data creates a distinct, formally unrelated model.

12 Outline. A goal: First-order Probabilistic Inference (An ultimate goal, Propositionalization); Lifted Inference (Inversion Elimination, Counting Elimination); DBLOG (BLOG background, DBLOG, Retro-instantiation, Improving BLOG inference); Future Directions and Conclusion.

13 First-Order Probabilistic Inference. Many languages have been proposed: Knowledge-based model construction (Breese, 1992); Probabilistic Logic Programming (Ng & Subrahmanian, 1992); Probabilistic Logic Programming (Ngo & Haddawy, 1995); Probabilistic Relational Models (Friedman et al., 1999); Relational Dependency Networks (Neville & Jensen, 2004); Bayesian Logic (BLOG) (Milch & Russell, 2004); Bayesian Logic Programs (Kersting & De Raedt, 2001); DAPER (Heckerman, Meek & Koller, 2007); Markov Logic Networks (Richardson & Domingos, 2004). “Smart” propositionalization is the main inference method.

14 Propositionalization. Bayesian Logic Programs use Prolog-like statements (‘|’ instead of ‘:-’) such as medicine(Hospital) | in(Patient,Hospital), sick(Patient). and sick(Patient) | in(Patient,Hospital), exposed(Hospital). associated with CPTs and combination rules.

15 Propositionalization. medicine(Hospital) | in(Patient,Hospital), sick(Patient). (plus its CPT) is grounded into a BN from an and-or tree, with nodes such as medicine(hosp13), in(john,hosp13), sick(john), in(peter,hosp13), sick(peter), ...

16 Propositionalization: Multi-Entity Bayesian Networks (Laskey, 2004). First-order fragments: in(Patient,Hospital) and sick(Patient) are parents of medicine(Hospital); in(Patient,Hospital) and exposed(Hospital) are parents of sick(Patient).

17 Propositionalization: Multi-Entity Bayesian Networks (Laskey, 2004). The first-order fragment with parents in(Patient,Hospital), sick(Patient) and child medicine(Hospital) is instantiated into in(john,hosp13), sick(john) → medicine(hosp13); in(peter,hosp13), sick(peter) → medicine(hosp13); ...

18 Propositionalization: Multi-Entity Bayesian Networks (Laskey, 2004). The instantiated fragments in(john,hosp13), sick(john) → medicine(hosp13) and in(peter,hosp13), sick(peter) → medicine(hosp13), ... are then combined into a single grounded network.

19 Outline. A goal: First-order Probabilistic Inference (An ultimate goal, Propositionalization); Lifted Inference (Inversion Elimination, Counting Elimination); DBLOG (BLOG background, DBLOG, Retro-instantiation, Improving BLOG inference); Future Directions and Conclusion.

20 Lifted Inference. Rules: medicine(Hospital) ⇐ in(Patient,Hospital) ∧ sick(Patient); { sick(Patient) ⇐ in(Patient,Hospital) ∧ exposed(Hospital) }. The lifted network keeps nodes such as medicine(hosp13), exposed(hosp13), in(Patient, hosp13) and sick(Patient), where Patient is an unbound, general variable. Faster; more compact; more intuitive; higher level, so more information/structure is available for optimization.

21 Bayesian Networks (directed). Nodes epidemic, sick_john, sick_mary, sick_bob with CPTs P(epidemic), P(sick_john | epidemic), P(sick_mary | epidemic), P(sick_bob | epidemic). P(sick_john, sick_mary, sick_bob, epidemic) = P(sick_john | epidemic) * P(sick_mary | epidemic) * P(sick_bob | epidemic) * P(epidemic)

22 Factor Networks (undirected). Nodes epidemic_france, epidemic_belgium, epidemic_uk, epidemic_germany connected by factors (potential functions) φ1, φ2, φ3, φ4. P(epi_france, epi_belgium, epi_uk, epi_germany) ∝ φ1(epi_france, epi_belgium) * φ2(epi_france, epi_uk) * φ3(epi_france, epi_germany) * φ4(epi_belgium, epi_germany)
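To make the factor-network semantics concrete, here is a minimal Python sketch (my own illustration, not from the talk; the potential values are invented): the unnormalized joint is just the product of the pairwise potentials, and the normalizer Z sums that product over all assignments.

from itertools import product

# One shared pairwise potential table; the values below are made up for illustration.
phi = {(a, b): (2.0 if a == b else 1.0) for a in (0, 1) for b in (0, 1)}
countries = ["france", "belgium", "uk", "germany"]
edges = [("france", "belgium"), ("france", "uk"), ("france", "germany"), ("belgium", "germany")]

def unnormalized(assign):
    # assign maps each country to 0/1 (no epidemic / epidemic)
    val = 1.0
    for a, b in edges:
        val *= phi[(assign[a], assign[b])]
    return val

# Normalizing constant Z, then one joint probability.
Z = sum(unnormalized(dict(zip(countries, vals)))
        for vals in product((0, 1), repeat=len(countries)))
print(unnormalized({"france": 1, "belgium": 1, "uk": 0, "germany": 1}) / Z)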

23 Bayesian Nets as Factor Networks. Each CPT is treated as a factor: P(sick_john, sick_mary, sick_bob, epidemic) ∝ P(sick_john | epidemic) * P(sick_mary | epidemic) * P(sick_bob | epidemic) * P(epidemic)

24 Inference: Marginalization. P(sick_john) ∝ Σ_epidemic Σ_sick_mary Σ_sick_bob P(sick_john | epidemic) * P(sick_mary | epidemic) * P(sick_bob | epidemic) * P(epidemic)

25 Inference: Variable Elimination (VE). Push the sums in: P(sick_john) ∝ Σ_epidemic P(sick_john | epidemic) * P(epidemic) * Σ_sick_mary P(sick_mary | epidemic) * Σ_sick_bob P(sick_bob | epidemic)

26 Inference: Variable Elimination (VE). Summing out sick_bob yields a new factor φ1(epidemic): P(sick_john) ∝ Σ_epidemic P(sick_john | epidemic) * P(epidemic) * Σ_sick_mary P(sick_mary | epidemic) * φ1(epidemic)

27 Inference: Variable Elimination (VE). P(sick_john) ∝ Σ_epidemic P(sick_john | epidemic) * P(epidemic) * φ1(epidemic) * Σ_sick_mary P(sick_mary | epidemic)

28 Inference: Variable Elimination (VE). Summing out sick_mary yields φ2(epidemic): P(sick_john) ∝ Σ_epidemic P(sick_john | epidemic) * P(epidemic) * φ1(epidemic) * φ2(epidemic)

29 Inference: Variable Elimination (VE). Summing out epidemic yields φ3(sick_john): P(sick_john) ∝ φ3(sick_john)
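The elimination steps above can be spelled out in a short Python sketch; this is my own illustration of VE on the epidemic example, with invented probability values, not code from the talk.

from itertools import product

def sum_out(factors, var):
    # Multiply all factors mentioning `var`, sum `var` out, keep the rest untouched.
    touching = [f for f in factors if var in f["vars"]]
    rest = [f for f in factors if var not in f["vars"]]
    new_vars = sorted({v for f in touching for v in f["vars"]} - {var})
    table = {}
    for assign in product((False, True), repeat=len(new_vars) + 1):
        full = dict(zip(new_vars + [var], assign))
        val = 1.0
        for f in touching:
            val *= f["table"][tuple(full[v] for v in f["vars"])]
        table[assign[:-1]] = table.get(assign[:-1], 0.0) + val
    return rest + [{"vars": new_vars, "table": table}]

def cpt(parent, child, p_true_given):
    # P(child | parent) written as a factor over (parent, child).
    return {"vars": [parent, child],
            "table": {(pa, ch): (p_true_given[pa] if ch else 1 - p_true_given[pa])
                      for pa in (False, True) for ch in (False, True)}}

factors = [{"vars": ["epidemic"], "table": {(False,): 0.9, (True,): 0.1}},  # P(epidemic)
           cpt("epidemic", "sick_john", {False: 0.05, True: 0.7}),
           cpt("epidemic", "sick_mary", {False: 0.05, True: 0.7}),
           cpt("epidemic", "sick_bob", {False: 0.05, True: 0.7})]

for var in ["sick_mary", "sick_bob", "epidemic"]:   # elimination order, as in the slides
    factors = sum_out(factors, var)

# A single factor over sick_john remains; normalize it to get P(sick_john).
unnorm = {s: factors[0]["table"][(s,)] for s in (False, True)}
z = sum(unnorm.values())
print({s: p / z for s, p in unnorm.items()})        # P(sick_john = True) = 0.115 here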

30 A Factor Network [diagram] over random variables such as epidemic(measles), epidemic(flu), sick(mary,measles), sick(mary,flu), hospital(mary), sick(bob,measles), sick(bob,flu), hospital(bob), ...

31 First-order Representation. Atoms such as epidemic(D), sick(P,D), hospital(P) each represent a set of random variables; logical variables parameterize random variables; parfactors are parameterized factors; constraints restrict logical variables, e.g. P ≠ john.
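As a rough illustration of the data structure (my own sketch, not the authors' implementation; all names and domains below are hypothetical), a parfactor can be stored as logical variables with domains, inequality constraints, parameterized atoms, and one shared potential, and its semantics is the set of ground factors it enumerates.

from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class Atom:
    predicate: str      # e.g. "sick"
    args: tuple         # logical variables or constants, e.g. ("P", "D")

@dataclass
class Parfactor:
    logvars: dict       # logical variable -> domain, e.g. {"P": people, "D": diseases}
    constraints: list   # inequality constraints, e.g. [("P", "john")] meaning P != john
    atoms: list         # parameterized atoms sharing this potential
    potential: object   # callable from one value per atom to a non-negative number

    def groundings(self):
        # Enumerate the ground factors this parfactor stands for (its semantics).
        names = list(self.logvars)
        for values in product(*(self.logvars[v] for v in names)):
            sub = dict(zip(names, values))
            if any(sub.get(a, a) == sub.get(b, b) for a, b in self.constraints):
                continue
            yield [Atom(at.predicate, tuple(sub.get(x, x) for x in at.args))
                   for at in self.atoms]

people, diseases = ["john", "mary", "bob"], ["measles", "flu"]
pf = Parfactor({"P": people, "D": diseases}, [("P", "john")],
               [Atom("epidemic", ("D",)), Atom("sick", ("P", "D"))],
               potential=lambda e, s: 0.7 if (e and s) else 1.0)
print(len(list(pf.groundings())))   # 2 remaining people x 2 diseases = 4 ground factors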

32 Semantics. The parfactors stand for their groundings over the factor network of slide 30; for example, the parfactor on epidemic(D), sick(P,D) yields ground factors such as φ(sick(mary,measles), epidemic(measles)).

33 Outline. A goal: First-order Probabilistic Inference (An ultimate goal, Propositionalization); Lifted Inference (Inversion Elimination, Counting Elimination); DBLOG (BLOG background, DBLOG, Retro-instantiation, Improving BLOG inference); Future Directions and Conclusion.

34 Inversion Elimination (IE). On the parfactor over epidemic(D), sick(P, D) with constraint D ≠ measles, IE computes Σ_sick(P,D) φ(epidemic(D), sick(P,D)) directly at the first-order level. This is equivalent to grounding (..., sick(john, flu), epidemic(flu), sick(mary, rubella), epidemic(rubella), ...), running VE on the ground network, and abstracting the result back to a parfactor on epidemic(D), D ≠ measles.
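The equivalence in this diagram can be checked numerically. The Python sketch below is my own, with an invented potential table and a tiny two-person, two-disease domain: eliminating every ground sick(P, D) by brute force gives the same result as summing sick out of the potential once and raising the result to the number of people.

from itertools import product

people, diseases = ["john", "mary"], ["flu", "rubella"]
phi = {(e, s): (2.0 if e and s else 1.0) for e in (0, 1) for s in (0, 1)}  # made-up potential

def brute_force(epi):
    # epi: dict disease -> 0/1; sum over all joint assignments to the sick(P, D) variables.
    sick_vars = list(product(people, diseases))
    total = 0.0
    for assign in product((0, 1), repeat=len(sick_vars)):
        sick = dict(zip(sick_vars, assign))
        val = 1.0
        for p, d in sick_vars:
            val *= phi[(epi[d], sick[(p, d)])]
        total += val
    return total

def lifted(epi):
    # Inversion: each sick(P, D) occurs in exactly one ground factor, so sum it out once
    # and exponentiate by the number of people sharing the same disease D.
    phi_prime = {e: phi[(e, 0)] + phi[(e, 1)] for e in (0, 1)}
    val = 1.0
    for d in diseases:
        val *= phi_prime[epi[d]] ** len(people)
    return val

epi = {"flu": 1, "rubella": 0}
assert abs(brute_force(epi) - lifted(epi)) < 1e-9
print(brute_force(epi), lifted(epi))    # both 36.0 with these numbers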

35 Inversion Elimination (IE): proposed by Poole (2003); lacked formalization; did not identify the cases in which it is incorrect. We formalized it and identified correctness conditions (with Dan Roth and Eyal Amir).

36 Inversion Elimination - Limitations. IE requires the eliminated RVs to occur in separate instances of the parfactor. For the parfactor on epidemic(D), sick(mary, D) with D ≠ measles, each of sick(mary, flu), sick(mary, rubella), ... appears in exactly one ground factor, so Inversion Elimination is correct.

37 Inversion Elimination - Limitations. IE requires the eliminated RVs to occur in separate instances of the parfactor. For the parfactor on epidemic(D1), epidemic(D2), month with D1 ≠ D2, each of epidemic(measles), epidemic(flu), epidemic(rubella), ... appears in many ground factors, so IE is not correct here.

38 Outline. A goal: First-order Probabilistic Inference (An ultimate goal, Propositionalization); Lifted Inference (Inversion Elimination, Counting Elimination); DBLOG (BLOG background, DBLOG, Retro-instantiation, Improving BLOG inference); Future Directions and Conclusion.

39 Counting Elimination (e.g. for the parfactor on epidemic(D1), epidemic(D2), month with D1 ≠ D2). We need to consider joint assignments, and there is an exponential number of those; but the potential actually depends only on the histogram of values in the assignment (e.g. an assignment such as 00011 counts the same as 11000), so a polynomial number of assignment classes suffices instead.
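A small sketch of the counting idea (again my own illustration with invented numbers): instead of summing over all 2^n joint assignments to epidemic(D1), ..., epidemic(Dn), sum over the n+1 possible histograms, weighting each by a binomial coefficient.

from itertools import product
from math import comb

diseases = ["measles", "flu", "rubella", "mumps"]
phi = {(a, b): (1.5 if a == b else 0.5) for a in (0, 1) for b in (0, 1)}   # made-up potential

def brute_force():
    # Product over all ordered pairs (D1, D2), D1 != D2, summed over all assignments.
    total = 0.0
    for assign in product((0, 1), repeat=len(diseases)):
        val = 1.0
        for i, a in enumerate(assign):
            for j, b in enumerate(assign):
                if i != j:
                    val *= phi[(a, b)]
        total += val
    return total

def counting():
    # The product depends only on k, the number of true epidemic(D) values.
    n, total = len(diseases), 0.0
    for k in range(n + 1):
        val = (phi[(1, 1)] ** (k * (k - 1)) * phi[(0, 0)] ** ((n - k) * (n - k - 1))
               * phi[(1, 0)] ** (k * (n - k)) * phi[(0, 1)] ** (k * (n - k)))
        total += comb(n, k) * val
    return total

assert abs(brute_force() - counting()) < 1e-9
print(brute_force(), counting())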

40 A Simple Experiment

41 Outline. A goal: First-order Probabilistic Inference (An ultimate goal, Propositionalization); Lifted Inference (Inversion Elimination, Counting Elimination); DBLOG (BLOG background, DBLOG, Retro-instantiation, Improving BLOG inference); Future Directions and Conclusion.

42 Outline. A goal: First-order Probabilistic Inference (An ultimate goal, Propositionalization); Lifted Inference (Inversion Elimination, Counting Elimination); DBLOG (BLOG background, DBLOG, Retro-instantiation, Improving BLOG inference); Future Directions and Conclusion.

43 BLOG (Bayesian LOGic): Milch & Russell (2004); a probabilistic logic language; current inference is propositional sampling; special characteristics: open universe, expressive language.

44 BLOG (Bayesian LOGic), illustrating the open universe and the expressive language:
type Hospital;
#Hospital ~ Uniform(1,10);
random Boolean Large(Hospital hosp) ~ Bernoulli(0.6);
random NaturalNum Region(Hospital hosp) ~ TabularCPD[[0.3, 0.4, 0.2, 0.1]]();
type Patient;
#Patient(HospitalOf = Hospital hosp)
  if Large(hosp) then ~ Poisson(1500);
  else if Region(hosp) = 2 then = 300;
  else ~ Poisson(500);
query Average( {#Patient(b) for Hospital b} );

45 Inference in BLOG:
random Boolean Exposed(Hospital hosp) ~ Bernoulli(0.7);
random Boolean Sick(Patient patient)
  if Exposed(HospitalOf(patient)) then
    if Male(patient) then ~ Bernoulli(0.1);
    else ~ Bernoulli(0.4);
  else = False;
random Boolean Male(Patient patient) ~ Bernoulli(0.5);
random Boolean Medicine(Hospital hosp)
  = exists Patient patient HospitalOf(patient) = hosp & Sick(patient);
guaranteed Hospital hosp13;
query Medicine(hosp13);

46 Inference in BLOG - Sampling [diagram]. Medicine(hosp13) depends on #Patient(hosp13) (an open-universe variable), Sick(patient1), ..., Sick(patient73), and Exposed(hosp13); in this sample Exposed(hosp13) is sampled false, so Medicine(hosp13) is false without instantiating the rest.

47 Inference in BLOG - Sampling [diagram]. In another sample Exposed(hosp13) is sampled true, Male(patient1) is sampled, Sick(patient1) comes out true, and therefore Medicine(hosp13) is true; #Patient(hosp13) remains an open-universe variable.

48 Outline. A goal: First-order Probabilistic Inference (An ultimate goal, Propositionalization); Lifted Inference (Inversion Elimination, Counting Elimination); DBLOG (BLOG background, DBLOG, Retro-instantiation, Improving BLOG inference); Future Directions and Conclusion.

49 Temporal models in BLOG. The language is expressive enough to directly write temporal models:
type Aircraft;
#Aircraft ~ Poisson[3];
random Real Position(Aircraft a, NaturalNum t)
  if t = 0 then ~ UniformReal[-10, 10]()
  else ~ Gaussian(Position(a, Pred(t)), 2);
type Blip;
// num of blips from aircraft a is 0 or 1
#Blip(Source = Aircraft a, Time = NaturalNum t) ~ TabularCPD[[0.2, 0.8]]();
// num false alarms has Poisson distribution.
#Blip(Time = NaturalNum t) ~ Poisson[2];
random Real BlipPosition(Blip b, NaturalNum t)
  ~ Gaussian(Position(Source(b), Pred(t)), 2);

50 Temporal models in BLOG. The inference algorithm does not exploit the temporal structure: the Markov property (the state depends only on the previous state), and the fact that evidence and queries arrive at successive time steps. Hence Dynamic BLOG (DBLOG), analogous to Dynamic Bayesian Networks (DBNs).

51 DBLOG Syntax. The only change in the syntax is a special Timestep type, with the same semantics as NaturalNum:
type Aircraft;
#Aircraft ~ Poisson[3];
random Real Position(Aircraft a, Timestep t)
  if t = @0 then ~ UniformReal[-10, 10]()
  else ~ Gaussian(Position(a, Prev(t)), 2);
type Blip;
// num of blips from aircraft a is 0 or 1
#Blip(Source = Aircraft a, Time = Timestep t) ~ TabularCPD[[0.2, 0.8]]();
// num false alarms has Poisson distribution.
#Blip(Time = Timestep t) ~ Poisson[2];
random Real BlipPosition(Blip b, Timestep t)
  ~ Gaussian(Position(Source(b), Prev(t)), 2);

52 DBLOG Particle Filtering: similar idea to DBN particle filtering. State samples for t - 1 are propagated, weighted by the evidence at t (likelihood weighting), and resampled to give state samples for t.

53 Weighting by evidence in DBNs [diagram]: the state at t - 1 transitions to the state at t, which generates the evidence at t.

54 Weighting evidence in DBLOG [diagram]: the state now includes #Aircraft and the blip B1, with evidence such as obs {Blip b : Source(b) = a1} = {B1}; and an observed position of 3.2 for B1; random variables are instantiated lazily.
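To make the filtering loop concrete, here is a much-simplified Python sketch in the spirit of the scheme above: a single aircraft with Gaussian motion and observation noise, likelihood weighting by the blip observation, and resampling at each step. There is no open universe or lazy instantiation here, and all constants are invented.

import math
import random

def gaussian_pdf(x, mean, var):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def step(particles, blip_obs, trans_var=2.0, obs_var=2.0):
    # 1. propagate each particle through the transition model
    moved = [random.gauss(p, math.sqrt(trans_var)) for p in particles]
    # 2. weight each particle by the likelihood of the observed blip position
    weights = [gaussian_pdf(blip_obs, p, obs_var) for p in moved]
    # 3. resample in proportion to the weights
    return random.choices(moved, weights=weights, k=len(particles))

particles = [random.uniform(-10, 10) for _ in range(1000)]   # prior at t = 0
for obs in [3.2, 2.1, 2.8]:                                  # blip positions, one per step
    particles = step(particles, obs)
print(sum(particles) / len(particles))                       # estimated posterior mean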

55 Outline. A goal: First-order Probabilistic Inference (An ultimate goal, Propositionalization); Lifted Inference (Inversion Elimination, Counting Elimination); DBLOG (BLOG background, DBLOG, Retro-instantiation, Improving BLOG inference); Future Directions and Conclusion.

56 Retro-instantiation in DBLOG [diagram]: later evidence obs {Blip b : Source(b) = a2} = {B8}; with an observed position of 2.1 for B8 can force the instantiation of variables from earlier time steps (such as #Aircraft and the earlier blip B1).

57 Coping with retro-instantiation: analyse the possible queries and evidence (provided by the user) and pre-instantiate the random variables that will be necessary later; use mixing time to decide when to stop sampling back.

58 Outline. A goal: First-order Probabilistic Inference (An ultimate goal, Propositionalization); Lifted Inference (Inversion Elimination, Counting Elimination); DBLOG (BLOG background, DBLOG, Retro-instantiation, Improving BLOG inference); Future Directions and Conclusion.

59 Improving BLOG inference: rejection sampling. Sample from the prior and count the samples that match the evidence.

60 Improving BLOG inference: likelihood weighting. Count samples using the likelihood of the evidence as a weight.

61 Improving BLOG inference. Evidence: {Happy(p) : Person p} = {true, true, false}; sample: Happy(john) = true, Happy(mary) = false, Happy(peter) = false. The evidence is a deterministic function of the sampled values, so its likelihood is always 0 or 1, and likelihood weighting is no better than rejection sampling.

62 Improving BLOG inference. Evidence: {Salary(p) : Person p} = {90,000, 75,000, 30,000}; sample: Salary(john) = 100,000, Salary(mary) = 80,000, Salary(peter) = 500,000. The more unlikely the evidence, the worse it gets.

63 Improving BLOG inference. Evidence: {BlipPosition(b) : Blip b} = {1.2, 0.5, 5.1}; sample: BlipPosition(b1) = 2.1, BlipPosition(b2) = 3.3, BlipPosition(b3) = 3.5. This doesn't work at all for continuous evidence.

64 Structure-dependent automatic importance sampling. Evidence: {BlipPosition(b) : Blip b} = {1.2, 0.5, 5.1}. Given sampled positions Position(Source(b1)) = 1.5, Position(Source(b2)) = 3.9, Position(Source(b3)) = 4.0, the evidence values are assigned directly: BlipPosition(b1) = 1.2, BlipPosition(b2) = 5.1, BlipPosition(b3) = 0.5. The high-level language allows the algorithm to automatically apply better sampling schemes.
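A rough sketch of this importance-sampling idea (my own simplification, not the talk's implementation): rather than sampling BlipPosition and rejecting, fix the observed blip values as evidence, sample the latent aircraft positions and a blip-to-aircraft association, and weight each sample by the Gaussian density of the observed values given those positions. The variance and priors below are invented.

import math
import random
from itertools import permutations

observed = [1.2, 0.5, 5.1]              # evidence: the observed blip positions

def gaussian_pdf(x, mean, var=2.0):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def one_sample():
    positions = [random.uniform(-10, 10) for _ in observed]          # prior on Position(a)
    assoc = random.choice(list(permutations(range(len(observed)))))  # uniform blip-aircraft association
    weight = 1.0
    for blip_value, a in zip(observed, assoc):
        weight *= gaussian_pdf(blip_value, positions[a])             # importance weight
    return positions, weight

samples = [one_sample() for _ in range(10000)]
total = sum(w for _, w in samples)
# Weighted posterior mean of the first aircraft's position, for illustration.
print(sum(pos[0] * w for pos, w in samples) / total)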

65 Outline. A goal: First-order Probabilistic Inference (An ultimate goal, Propositionalization); Lifted Inference (Inversion Elimination, Counting Elimination); DBLOG (BLOG background, DBLOG, Retro-instantiation, Improving BLOG inference); Future Directions and Conclusion.

66 For the future: lifted inference works on symmetric, indistinguishable objects only, so adapt approximations for dealing with merely similar objects; parameterized queries: P(sick(X, measles)) = ?; function symbols: φ(diabetes(X), diabetes(father(X))); equality (paramodulation); learning.

67 Conclusion. Expressive, first-order probabilistic representations in inference too! Lifted inference is equivalent to grounded inference: much faster, yet yielding the same exact answer. DBLOG brings techniques for reasoning about temporal processes to first-order probabilistic reasoning.
