Concurrent Reasoning with Inference Graphs
Daniel R. Schlegel and Stuart C. Shapiro
Department of Computer Science and Engineering
University at Buffalo, The State University of New York
Buffalo, New York, USA
@buffalo.edu
Problem Statement
– Rise of multi-core computers.
– Lack of concurrent natural deduction systems.
A motivation: application to natural language understanding for terrorist plot detection.
What are Inference Graphs?
Graphs for natural deduction:
– Four types of inference: forward, backward, bi-directional, and focused.
– Retain derived formulas for later re-use.
– Propagate disbelief.
– Built upon propositional graphs.
They take advantage of multi-core computers:
– Concurrency and scheduling.
– Near-linear speedup.
Propositional Graphs
A directed acyclic graph in which every well-formed expression is a node:
– Individual constants
– Functional terms
– Atomic formulas
– Non-atomic formulas ("rules")
Each node has an identifier, either a symbol or wfti[!]. No two nodes have the same identifier.
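To make the identifier rules concrete, here is a minimal sketch of a node store that assigns wfti identifiers and guarantees no two nodes share one. The function name build_node and the dictionary layout are illustrative assumptions, not the system's actual code.

```python
# Minimal sketch of a propositional-graph node store (illustrative names).
wft_counter = 0   # counter used to generate wft identifiers
nodes = {}        # identifier -> node; guarantees identifiers are unique

def build_node(expression):
    """Return the node for a well-formed expression, creating it at most once."""
    global wft_counter
    # Reuse an existing node if this expression was built before.
    for node in nodes.values():
        if node["expr"] == expression:
            return node
    if isinstance(expression, str):
        ident = expression           # individual constants keep their symbol
    else:
        wft_counter += 1
        ident = f"wft{wft_counter}"  # non-atomic terms get wft identifiers
    nodes[ident] = {"id": ident, "expr": expression, "asserted": False}
    return nodes[ident]

# "If a and b are true, then c is true" becomes a single rule node:
rule = build_node(("and-ant", ("a", "b"), "cq", "c"))
rule["asserted"] = True    # the "!" suffix (wft1!) marks an asserted node
```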
Propositional Graphs
Example: "If a and b are true, then c is true." [Graph: wft1! has and-ant arcs to a and b, and a cq arc to c.]
Inference Graphs
Extend propositional graphs by adding channels for information flow (messages):
– i-channels report truth of an antecedent to a rule node.
– u-channels report truth of a consequent from a rule node.
Channels contain valves, which hold messages back or allow them through.
[Graph: i-channels run from a and b to wft1!; a u-channel runs from wft1! to c.]
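As a rough sketch, a channel can be modeled as a message queue guarded by a valve; the class and method names below are illustrative assumptions, not the system's API.

```python
from collections import deque

class Channel:
    """A channel carries messages from an originator to a destination;
    its valve either holds messages back or lets them through."""
    def __init__(self, originator, destination, kind):
        self.originator = originator    # e.g., antecedent node a
        self.destination = destination  # e.g., rule node wft1
        self.kind = kind                # "i-channel" or "u-channel"
        self.valve_open = False
        self.waiting = deque()          # messages held back by the closed valve

    def send(self, message, deliver):
        """Pass the message through, or hold it at the valve."""
        if self.valve_open:
            deliver(self.destination, message)
        else:
            self.waiting.append(message)

    def open_valve(self, deliver):
        """Open the valve and release everything it was holding back."""
        self.valve_open = True
        while self.waiting:
            deliver(self.destination, self.waiting.popleft())
```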
Messages
Five kinds:
– I-INFER: "I've been inferred"
– U-INFER: "You've been inferred"
– BACKWARD-INFER: "Open valves so I might be inferred"
– CANCEL-INFER: "Stop trying to infer me (close valves)"
– UNASSERT: "I'm no longer believed"
Priorities
Messages have priorities:
– UNASSERT is top priority.
– CANCEL-INFER is next.
– I-INFER/U-INFER messages have higher priority the closer they are to a final result.
– BACKWARD-INFER is lowest.
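This ordering can be written as a priority function. The numeric values below are illustrative assumptions; only their relative order comes from the slides.

```python
from enum import Enum

class Kind(Enum):
    I_INFER = "I've been inferred"
    U_INFER = "You've been inferred"
    BACKWARD_INFER = "Open valves so I might be inferred"
    CANCEL_INFER = "Stop trying to infer me (close valves)"
    UNASSERT = "I'm no longer believed"

def priority(kind, distance_to_result=0):
    """Higher number = handled sooner. I-/U-INFER messages nearer the
    final result outrank deeper ones."""
    if kind is Kind.UNASSERT:
        return 4.0                       # top priority
    if kind is Kind.CANCEL_INFER:
        return 3.0
    if kind in (Kind.I_INFER, Kind.U_INFER):
        return 2.0 + 1.0 / (2 + distance_to_result)
    return 1.0                           # BACKWARD-INFER is lowest
```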
Rule Node Inference
1. Message arrives at node. Assume we have a KB with a ^ b -> c, and b. Then a is asserted with forward inference, and an i-infer message "a : true" is sent from a to wft1.
Rule Node Inference
2. The message is translated to Rule Use Information (RUI): 1 positive antecedent (a), 0 negative antecedents, 2 total antecedents. Rule Use Information is stored in rule nodes to be combined later with others that arrive.
Rule Node Inference
3. Combine RUIs with any existing ones. The RUI for a (1 positive antecedent, 0 negative, 2 total) is combined with the one already in wft1 for b (1 positive antecedent, 0 negative, 2 total), giving 2 positive antecedents (a, b), 0 negative antecedents, 2 total antecedents.
Rule Node Inference
4. Determine whether the rule can fire. We have two positive antecedents, and we need two: the rule can fire.
Rule Node Inference
5. Send out new messages. A u-infer message "c : true" is sent to c, which will assert itself.
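Steps 1-5 can be condensed into a short sketch: an RUI records which antecedents are known and with what polarity, combining unions those records, and the firing test compares counts. The names and representation here are mine, not the system's.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class RUI:
    """Rule Use Information for one rule node."""
    pos: frozenset   # antecedents known to be true
    neg: frozenset   # antecedents known to be false
    total: int       # number of antecedents the rule has

def combine(r1, r2):
    """Step 3: merge an incoming RUI with one already stored at the node."""
    return RUI(r1.pos | r2.pos, r1.neg | r2.neg, r1.total)

def can_fire(rui):
    """Step 4: an and-entailment fires once every antecedent is positive."""
    return len(rui.pos) == rui.total

# Steps 1-2: "a : true" arrives at wft1 and becomes an RUI; one for b exists.
rui_a = RUI(frozenset({"a"}), frozenset(), total=2)
rui_b = RUI(frozenset({"b"}), frozenset(), total=2)

merged = combine(rui_a, rui_b)           # 2 positive antecedents (a, b) of 2
if can_fire(merged):
    print("u-infer: c : true")           # step 5: c asserts itself
```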
Cycles
Graphs may contain cycles. No rule node will infer on the same message more than once:
– RUIs with no new information are ignored.
– Already-open valves can't be opened again.
[Graph: wft1! and wft2! link a and b in a cycle; each rule's consequent is the other's antecedent.]
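The "no new information" test amounts to set containment against the RUIs already stored at the node; this sketch reuses the RUI record from above.

```python
def adds_new_info(new_rui, stored_ruis):
    """A rule node ignores an RUI whose antecedent information is already
    covered by some stored RUI, so a cycle cannot re-trigger inference."""
    for old in stored_ruis:
        if new_rui.pos <= old.pos and new_rui.neg <= old.neg:
            return False    # nothing new; inference on this message stops
    return True
```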
Concurrency and Scheduling
Inference segment: the area between two valves. When messages reach a valve:
– A task is created with the same priority as the message. A task is the application of the segment's function to the message.
– The task is added to a prioritized queue.
Tasks have minimal shared state, easing concurrency.
Concurrency and Scheduling
A task only operates within a single segment. Together with the message priorities, this ensures that:
1. tasks relaying newly derived information through segments "later" in the derivation are executed before "earlier" ones, and
2. once a node is known to be true or false, all tasks attempting to derive it are canceled, as long as their results are not needed elsewhere.
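A serial sketch of the prioritized task queue follows; the real system executes tasks concurrently on multiple processors, and the names here are illustrative.

```python
import heapq
import itertools

class TaskQueue:
    """Each task applies one inference segment's function to one message,
    at the priority the message carried."""
    def __init__(self):
        self._heap = []
        self._tie = itertools.count()   # FIFO order among equal priorities

    def add(self, priority, segment_fn, message):
        # heapq is a min-heap, so negate: higher priority pops first.
        heapq.heappush(self._heap,
                       (-priority, next(self._tie), segment_fn, message))

    def run(self):
        while self._heap:
            _, _, segment_fn, message = heapq.heappop(self._heap)
            segment_fn(message)   # a task only touches its own segment
```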
Example
Backchain on cq. Assume every node requires a single one of its incoming nodes to be true for it to be true (simplified for easy viewing). Two processors will be used.
[Animation: eighteen slides step the graph through the backchaining example. Legend: Backward Inference (open valve), Inferring, Inferred, Cancelled.]
Evaluation
Concurrency:
– Near-linear performance improvement with the number of processors.
– Performance robust to changes in graph depth and branching factor.
Scheduling heuristics:
– Backward inference with or-entailment shows a 10x improvement over LIFO queues, and 20-40x over FIFO queues.
Acknowledgements
This work has been supported by a Multidisciplinary University Research Initiative (MURI) grant (Number W911NF-09-1-0392) for Unified Research on Network-based Hard/Soft Information Fusion, issued by the US Army Research Office (ARO) under the program management of Dr. John Lavery.