Conflict-directed A*
Brian C. Williams, J/6.834J, October 21, 2002
Copyright Brian C. Williams; figures courtesy of JPL

Outline
- Diagnosis in a Nutshell
- Diagnosis as an Optimal CSP
- Using Conflicts to Find the Best: Conflict-directed A*
- Belief Update of Dynamic Systems
- Propositional Inference in Real Time

Diagnosis identifies consistent modes
Given a model Phi and observations OBS:
- Component models, e.g. Adder(i): mode G(i) implies Out(i) = In1(i) + In2(i); mode U(i) (unknown) is unconstrained.
- Candidate: an assignment of a mode to every component.
- Diagnosis D: a candidate consistent with the model Phi and the observations OBS.
[Figure: polybox circuit with multipliers M1, M2, M3 and adders A1, A2]
Example diagnosis: G(A1) ∧ U(A2) ∧ G(M1) ∧ G(M2) ∧ G(M3)

Encoding Diagnoses and Inconsistent Candidates
- Conflict: a set of component modes M that is inconsistent with the model and observations; ¬M is an implicate of Model ∧ Obs.
- Kernel diagnosis: a minimal set of component modes K that eliminates all symptoms; K is a prime implicant of the negations of the conflicts.
- Conflicts map to kernels by minimal set covering.
[Figure: polybox circuit fragments illustrating a conflict and a kernel]

Candidate Probabilities
Assume failures are independent. P(x = v | c) is estimated using the model Phi:
- If the previous observations, c, and Phi entail x = v, then P(x = v | c) = 1.
- If the previous observations, c, and Phi entail x ≠ v, then P(x = v | c) = 0.
- If Phi is consistent with all values of x, then P(x = v | c) is based on priors, e.g. a uniform prior of 1/m for m possible values of x.
Posteriors follow by Bayes' rule; P(obs) acts as the normalization term.
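Under the failure-independence assumption, a candidate's prior is just the product of its component-mode priors. A minimal sketch; the component names and prior values below are illustrative, not from the lecture:

```python
# Prior of a candidate (a full mode assignment) under failure independence.
# Each component is Good (G) with high probability and Unknown (U) with the
# complementary failure probability.  Priors here are made up for the demo.
MODE_PRIORS = {
    "M1": {"G": 0.95, "U": 0.05},
    "M2": {"G": 0.92, "U": 0.08},
    "A1": {"G": 0.99, "U": 0.01},
}

def candidate_prior(candidate):
    """candidate maps each component to a mode, e.g. {'M1': 'G', ...}."""
    p = 1.0
    for component, mode in candidate.items():
        p *= MODE_PRIORS[component][mode]
    return p

# The all-Good candidate dominates; single faults come next.
all_good = candidate_prior({"M1": "G", "M2": "G", "A1": "G"})
single_fault = candidate_prior({"M1": "G", "M2": "U", "A1": "G"})
assert all_good > single_fault > 0
```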

A large fraction of the diagnoses contain unknown modes (U). But given failure models, these diagnoses represent a small fraction of the probability density space: most of the density space may be represented by enumerating the few most likely diagnoses.

Outline
- Diagnosis in a Nutshell
- Diagnosis as an Optimal CSP
- Using Conflicts to Find the Best: Conflict-directed A*
- Belief Update of Dynamic Systems
- Propositional Inference in Real Time

Diagnosis identifies the most likely consistent modes
Given:
- System variables X with domain D_X
- Mode variables Y with domain D_Y
- System model Phi(X,Y): D_X × D_Y → {True, False}
- Observations Obs(X): D_X → {True, False}
Compute the leading arguments of:
  arg max over Y in D_Y of P(Y | Obs)
  s.t. there exists X in D_X such that Phi(X,Y) ∧ Obs(X) is consistent (i.e., True)

Generalize to Optimal CSP
Constraint Satisfaction Problem:
- Variables X with domain D_X
- Constraint C(X): D_X → {True, False}
Find X in D_X s.t. C(X) is True.
Optimal CSP (OCSP):
- Decision variables Y with domain D_Y
- Utility function g(Y): D_Y → R
- A CSP over variables X and Y
Find the leading arg max over Y in D_Y of g(Y), s.t. there exists X in D_X such that C(X,Y) is consistent.

Outline
- Diagnosis in a Nutshell
- Diagnosis as an Optimal CSP
- Using Conflicts to Find the Best
  - Conflict-directed A*
  - Review of A*
  - Enumerating the best kernels and candidates
- Belief Update of Dynamic Systems
- Propositional Inference in Real Time

Exploring the Improbable
"When you have eliminated the impossible, whatever remains, however improbable, must be the truth." - Sherlock Holmes, The Sign of the Four

Enumerating Probable Candidates
Generate the leading candidate based on priors, then test it for consistency. If it is consistent, keep it and compute its posterior; if not, extract a conflict. Done when the posterior falls below a threshold.
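The flowchart above is a generate-and-test sieve; a schematic rendering, with the generator, consistency check, conflict extractor, and posterior passed in as functions (the interface is mine, not the lecture's):

```python
def enumerate_probable_candidates(next_best, consistent, extract_conflict,
                                  posterior, threshold):
    """Generate the leading candidate by prior probability, test it against
    the model and observations, keep the consistent ones, and learn a
    conflict from each inconsistent one.  Stop once a kept candidate's
    posterior falls below the threshold."""
    diagnoses, conflicts = [], []
    while True:
        candidate = next_best(conflicts)   # leading candidate based on priors
        if candidate is None:
            return diagnoses               # candidate space exhausted
        if not consistent(candidate):
            conflicts.append(extract_conflict(candidate))
            continue
        diagnoses.append(candidate)
        if posterior(candidate) < threshold:
            return diagnoses               # remaining candidates are unlikely
```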

Conflict-directed A*
Given the current conflicts:
- Generate the best-cost kernel of the conflicts.
- Extend the best kernel to the best candidate.
- Test the candidate for consistency.
- On failure: extract a new conflict.

A*
[Figure: state space ordered by increasing cost, divided into feasible and infeasible regions; A* tests states one at a time in increasing cost order]

Conflict-directed A*
[Figure: the same cost-ordered space; each discovered conflict (Conflict 1, 2, 3) rules out an infeasible region]
- Feasible regions are described by the implicants of the known conflicts (kernel assignments).
- We want the kernel assignment containing the best-cost candidate.

Conflict-directed A*
Function Conflict-directed-A*(OCSP) returns the leading minimal-cost solutions.
  Conflicts[OCSP] ← {}
  OCSP ← Initialize-Best-Kernels(OCSP)
  Solutions[OCSP] ← {}
  loop do
    decision-state ← Next-Best-State-Resolving-Conflicts(OCSP)
    if no decision-state returned or Terminate?(OCSP)
      then return Solutions[OCSP]
    if Consistent?(CSP[OCSP], decision-state)
      then add decision-state to Solutions[OCSP]
      else
        new-conflicts ← Extract-Conflicts(CSP[OCSP], decision-state)
        Conflicts[OCSP] ← Eliminate-Redundant-Conflicts(Conflicts[OCSP] ∪ new-conflicts)
  end
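A minimal Python transliteration of this loop, with the conflict-resolving enumerator, consistency test, conflict extractor, and termination test passed in as functions (the names are mine, not the lecture's):

```python
def conflict_directed_a_star(next_best_state, consistent, extract_conflicts,
                             terminate):
    """Enumerate leading solutions, learning a conflict from each failure."""
    conflicts, solutions = [], []
    while True:
        # Best-cost state that resolves every known conflict, or None.
        state = next_best_state(conflicts)
        if state is None or terminate(solutions):
            return solutions
        if consistent(state):
            solutions.append(state)
        else:
            # Learn why this state failed; future states must resolve it too.
            for c in extract_conflicts(state):
                if c not in conflicts:      # drop redundant conflicts
                    conflicts.append(c)
```

On a toy domain where states are integers, a conflict simply bans a state, and only states ≥ 2 are consistent, the loop skips the inconsistent best state and returns the next one.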

Example: Diagnosis
[Figure: polybox circuit with observed inputs and outputs]
Assume independent failures:
- P(G(mi)) >> P(U(mi))
- P(single fault) >> P(double fault)
- P(U(M2)) > P(U(M1)) > P(U(M3)) > P(U(A1)) > P(U(A2))

First Iteration
- Conflicts / constituent diagnoses: none
- Best kernel: {}
- Best candidate: ??

Extend the empty kernel { } by selecting the most likely value for each unassigned mode:
M1=? ∧ M2=? ∧ M3=? ∧ A1=? ∧ A2=?  →  M1=G ∧ M2=G ∧ M3=G ∧ A1=G ∧ A2=G

Test: M1=G ∧ M2=G ∧ M3=G ∧ A1=G ∧ A2=G
[Figure: values propagated through the circuit until a discrepancy appears at an output]

Extract conflict and constituent diagnoses:
- Conflict: ¬(M1=G ∧ M2=G ∧ A1=G)
- Constituent diagnoses: M1=U ∨ M2=U ∨ A1=U

Second Iteration
- Conflicts / constituent diagnoses: M1=U ∨ M2=U ∨ A1=U
- Best kernel: M2=U
- Best candidate: M1=G ∧ M2=U ∧ M3=G ∧ A1=G ∧ A2=G

Test: M1=G ∧ M2=U ∧ M3=G ∧ A1=G ∧ A2=G
[Figure: values propagated through the circuit with M2 unknown]

Extract conflict and constituent diagnoses:
- Conflict: ¬(M1=G ∧ M3=G ∧ A1=G ∧ A2=G)
- Constituent diagnoses: M1=U ∨ M3=U ∨ A1=U ∨ A2=U

Third Iteration
- Conflicts / constituent diagnoses:
  - M1=U ∨ M2=U ∨ A1=U
  - M1=U ∨ M3=U ∨ A1=U ∨ A2=U
- Best kernel: M1=U
- Best candidate: M1=U ∧ M2=G ∧ M3=G ∧ A1=G ∧ A2=G

Test: M1=U ∧ M2=G ∧ M3=G ∧ A1=G ∧ A2=G
[Figure: values propagated through the circuit with M1 unknown]

Test: M1=U ∧ M2=G ∧ M3=G ∧ A1=G ∧ A2=G — Consistent! This candidate is the leading diagnosis.

Conflict-directed A*: Generating Best Kernels
Insight: kernels are found by minimal set covering of the constituent diagnoses:
- A1=U ∨ M1=U ∨ M2=U
- A1=U ∨ A2=U ∨ M1=U ∨ M3=U
Minimal set covering is an instance of breadth-first search. To find the best kernel, expand the tree in best-first order; continue expansion to find the best candidate.
[Figure: covering tree whose leaves are the kernels A1=U; M1=U; M2=U ∧ M3=U; M2=U ∧ A2=U]
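Kernel generation is minimal hitting-set computation over the constituent-diagnosis sets. A small smallest-first sketch (a generic enumeration, not the lecture's optimized best-first tree expansion):

```python
from itertools import chain, combinations

def minimal_hitting_sets(constituent_sets):
    """Each element of constituent_sets is the set of constituent kernels
    (mode assignments) that resolve one conflict.  A kernel diagnosis is a
    minimal set of assignments hitting every such set."""
    universe = sorted(set(chain.from_iterable(constituent_sets)))
    hitting = []
    for r in range(len(universe) + 1):            # smallest sets first
        for subset in combinations(universe, r):
            s = set(subset)
            if all(s & k for k in constituent_sets) and \
               not any(h <= s for h in hitting):  # keep only minimal sets
                hitting.append(s)
    return hitting

# The two conflicts from the example circuit:
c1 = {("M1", "U"), ("M2", "U"), ("A1", "U")}
c2 = {("M1", "U"), ("M3", "U"), ("A1", "U"), ("A2", "U")}
kernels = minimal_hitting_sets([c1, c2])
```

On this input the sketch recovers the four kernels from the figure: {M1=U}, {A1=U}, {M2=U, M3=U}, and {M2=U, A2=U}.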

Outline
- Diagnosis in a Nutshell
- Diagnosis as an Optimal CSP
- Using Conflicts to Find the Best
  - Conflict-directed A*
  - Review of A*
  - Enumerating the best kernels and candidates
- Belief Update of Dynamic Systems
- Propositional Inference in Real Time

A* Search: Preliminaries
Problem: a state-space search problem with
- an initial state
- Expand(node): children of a search node = adjacent states
- Goal-Test(node): true iff the search node is at a goal state
Search node: a node in the search tree, with State (the state the search is at) and Parent (its parent in the search tree).
Bookkeeping:
- Nodes: search nodes still to be expanded
- Expanded: search nodes already expanded
- Initialize: the search starts at the initial state, with no expanded nodes
- Remove-Best(f): removes the best-cost node from Nodes according to f
- Enqueue(new-node, f): adds a search node to those to be expanded
h: an admissible heuristic — an optimistic estimate of the cost to go.

A* Search (expand best first)
Function A*(problem, h) returns the best solution or failure. Problem is pre-initialized.
  f(x) ← g[problem](x) + h(x)
  loop do
    if Nodes[problem] is empty then return failure
    node ← Remove-Best(Nodes[problem], f)
    state ← State(node)
    remove any n from Nodes[problem] such that State(n) = state
    Expanded[problem] ← Expanded[problem] ∪ {state}
    new-nodes ← Expand(node, problem)
    for each new-node in new-nodes
      unless State(new-node) is in Expanded[problem]
        then Nodes[problem] ← Enqueue(Nodes[problem], new-node, f)
    if Goal-Test[problem] applied to State(node) succeeds then return node
  end
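The same algorithm in runnable Python, on a tiny weighted graph invented for the demo (the zero heuristic is trivially admissible):

```python
import heapq

def a_star(start, goal, neighbors, h):
    """Best-first search on f = g + h; h must never overestimate."""
    frontier = [(h(start), 0, start, [start])]   # (f, g, state, path)
    expanded = set()
    while frontier:
        f, g, state, path = heapq.heappop(frontier)
        if state == goal:
            return path, g
        if state in expanded:       # dynamic programming principle:
            continue                # the first expansion found the best path
        expanded.add(state)
        for nxt, cost in neighbors(state):
            if nxt not in expanded:
                heapq.heappush(frontier,
                               (g + cost + h(nxt), g + cost, nxt, [*path, nxt]))
    return None, float("inf")      # queue empty: failure

# Illustrative graph: S -> A -> B -> G is cheapest at total cost 4.
graph = {"S": [("A", 1), ("B", 4)], "A": [("B", 2), ("G", 5)],
         "B": [("G", 1)], "G": []}
path, cost = a_star("S", "G", lambda s: graph[s], lambda s: 0)
```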

Termination: the search ends when the queue Nodes[problem] is empty (failure), or when the best-cost node removed from the queue satisfies the goal test.

Dynamic programming principle: once a state has been expanded, the best path to it is known, so any remaining or newly generated node that revisits that state is discarded.

Outline
- Diagnosis in a Nutshell
- Diagnosis as an Optimal CSP
- Using Conflicts to Find the Best
  - Conflict-directed A*
  - Review of A*
  - Enumerating the best kernels and candidates
- Belief Update of Dynamic Systems
- Propositional Inference in Real Time

Next Best State Resolving Conflicts — an instance of A*
Function Next-Best-State-Resolving-Conflicts(OCSP) returns the best-cost state consistent with Conflicts[OCSP].
  f(x) ← G[problem](g[problem](x), h(x))
  loop do
    if Nodes[OCSP] is empty then return failure
    node ← Remove-Best(Nodes[OCSP], f)
    state ← State[node]
    add state to Visited[OCSP]
    new-nodes ← Expand-State-Resolving-Conflicts(node, OCSP)
    for each new-node in new-nodes
      unless exists n in Nodes[OCSP] such that State[new-node] = State[n]
             or State[new-node] is in Visited[OCSP]
        then Nodes[OCSP] ← Enqueue(Nodes[OCSP], new-node, f)
    if Goal-Test-State-Resolving-Conflicts[OCSP] applied to state succeeds then return node
  end

OCSP Admissible Estimate: Example
f = g + h for the partial assignment {M2=U}, where M1, M3, A1, A2 are unassigned. h selects the most likely value for each unassigned mode:
f = P(M2=U) × P(M1=G) × P(M3=G) × P(A1=G) × P(A2=G)

OCSP: Admissible Heuristic
Let g describe a multi-attribute utility function, and assume the preference for one attribute y_i is independent of another y_k. This is mutual preferential independence:
  for all u, v in D_Y, if g_i(u) ≥ g_i(v), then for all w, G(g_i(u), g_k(w)) ≥ G(g_i(v), g_k(w)).
An admissible h: given a partial assignment to variables X ⊆ Y, h selects the best value of each unassigned variable z in Z = Y − X:
  h(Y) = G({ max over v_ij in D_zi of g_zi(v_ij) | z_i in Z })
A candidate satisfying h(Y) always exists, so h never overestimates the cost (underestimates the utility) of the best completion.
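For diagnosis, where g is a product of mode probabilities, the heuristic simply takes the most likely value of every unassigned variable. A sketch with illustrative priors (the numbers are mine, not the lecture's):

```python
# f = g * h for a partial mode assignment: g multiplies the probabilities
# of the assigned modes; h optimistically assumes the best value for each
# unassigned mode.  Priors are illustrative.
PRIORS = {
    "M1": {"G": 0.95, "U": 0.05},
    "M2": {"G": 0.92, "U": 0.08},
    "A1": {"G": 0.99, "U": 0.01},
}

def g_cost(partial):
    p = 1.0
    for var, val in partial.items():
        p *= PRIORS[var][val]
    return p

def h_cost(partial):
    # Best value of each unassigned variable; admissible because no
    # completion of the partial assignment can be more likely.
    p = 1.0
    for var in PRIORS:
        if var not in partial:
            p *= max(PRIORS[var].values())
    return p

def f_cost(partial):
    return g_cost(partial) * h_cost(partial)

# f never underestimates the probability of the best completion:
assert f_cost({"M2": "U"}) >= g_cost({"M1": "G", "M2": "U", "A1": "G"})
```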

Termination and Expansion
Function Goal-Test-State-Resolving-Conflicts(node, problem) returns True iff node is a complete decision state.
  if forall K in Constituent-Kernels(Conflicts[problem]), State[node] contains a kernel in K
    then if all variables are assigned in State[node]
      then return True
  return False

Function Expand-State-Resolving-Conflicts(node, problem) returns the best nodes expanded from node.
  if forall K in Constituent-Kernels(Conflicts[problem]), State[node] contains a kernel in K
    then if all variables are assigned in State[node]
      then return {}
      else return Expand-Variable(node, problem)
    else return Expand-Conflict(node, problem)

Conflict-directed A*: Expanding Children
- Unresolved conflicts: select an unresolved conflict; each child adds one constituent kernel, e.g. { } expands to M2=U, M1=U, A1=U for the conflict ¬(M2=G ∧ M1=G ∧ A1=G).
- All conflicts resolved: select an unassigned variable y_i; each child adds an assignment from D_i, e.g. A2=G or A2=U.
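The child-generation rule can be sketched directly. The representations are mine: a state is a dict of mode assignments, and a conflict is a dict of assignments that must not all hold, whose constituent kernels are the alternative assignments to its variables:

```python
def expand_children(state, conflicts, domains):
    """Return the children of a search state per the CDA* expansion rule."""
    def resolves(state, conflict):
        # A state resolves a conflict if it assigns some conflict variable
        # a value different from the one the conflict requires.
        return any(v in state and state[v] != val
                   for v, val in conflict.items())

    unresolved = [c for c in conflicts if not resolves(state, c)]
    if unresolved:
        # Each child adds one constituent kernel of the first unresolved
        # conflict: an assignment ruling out one of its conjuncts.
        conflict = unresolved[0]
        return [dict(state, **{v: other})
                for v, val in conflict.items() if v not in state
                for other in domains[v] if other != val]
    # All conflicts resolved: each child assigns the next unassigned variable.
    for v in domains:
        if v not in state:
            return [dict(state, **{v: val}) for val in domains[v]]
    return []  # complete state: no children
```

With the conflict ¬(M1=G ∧ M2=G ∧ A1=G) and two-valued domains {G, U}, the empty state expands to the three constituent kernels, and a state that already resolves the conflict expands by assigning the next variable.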

Conflict-directed A*: Expand Best Child & Sibling
Constituent kernels M2=U ∨ M1=U ∨ A1=U expand the root { }; order the constituents by decreasing likelihood.
For any node N:
- The child of N containing the best-cost candidate is the child with the best estimated cost f = g + h (by mutual preferential independence).
- So only the best child needs to be expanded.

When a best child loses any candidate, expand the child's next-best sibling:
- If conflicts remain unresolved: expand the sibling as soon as the child resolves its next conflict.
- If all conflicts are resolved: expand the sibling as soon as the child is expanded to a full candidate.
[Figures 24 and 33, Williams & Ragno, 2002]

Performance: Constraint-based A* (no conflicts) vs. Conflict-directed A*
[Table: for problems of varying domain size, number of decision variables, number of clauses, and clause length — nodes expanded and queue size for constraint-based A*; nodes expanded, queue size, and conflicts used for conflict-directed A*; and the mean CD/CB ratios of nodes expanded and queue size, ranging from roughly 1% to 11%.]

Research in Model-based Diagnosis
Methods exist for:
- Focusing on likely diagnoses
- Active probing
- Diagnosing dynamic systems and monitoring behavior
- Repairing and compensating for failures
- Diagnosing hybrid discrete/continuous systems
- Performing distributed diagnosis