Intelligent Diagnosis Systems Meir Kalech

Course Outline 1. Intelligent diagnosis systems 2. Model-based diagnosis: basics and definitions 3. Resolution theorem prover 4. Diagnosis of multiple faults: general diagnosis engine GDE 5. Assumption-based truth maintenance system 6. Measurements for differentiating diagnoses

Course Outline cont. 7. Diagnosis with behavior modes 8. Self-Configuring Systems 9. Model-based diagnosis as a constraints satisfaction problem 10. Diagnosis of Discrete Event Systems 11. Diagnosis of distributed systems 12. Diagnosis of multi-agent systems

Today Outline 1. What is a diagnosis? 2. Expert systems 3. Model-based systems 4. Case Based Reasoning (CBR) 5. Inductive learning 6. Probabilistic reasoning

What is a diagnosis? A diagnosis system identifies the reason for a problem by examining observed symptoms. Requirement: knowledge of the diagnosed system, either given by experts or learned by AI techniques. Examples of diagnosis domains: disease diagnosis, identification of software and hardware problems, troubleshooting of electrical and mechanical systems, and fault detection and diagnosis in planning.

Example: The car doesn't start!!! Possible causes: 1. Ignition 2. Battery 3. Both

Example: What should I do? ¬battery → ¬headlights. Check headlights.

Components: Battery, Bulbs (headlights), Wiper motor, Ignition. Possible observations: Headlights work/don't, Engine starts/doesn't, Wipers work/don't.

Diagnosis objectives: How to infer a diagnosis from observations? How to model the relationship between symptoms and diagnoses? How to diagnose faults from the model?

Different approaches These objectives are treated differently by different approaches, for instance: Model-based diagnosis requires a precise model of the system (e.g. a car). Expert systems handle an incomplete model (e.g. medical diagnosis). Case-based reasoning represents the knowledge as a repository of cases.

Outline 1. What is a diagnosis? 2. Expert systems 3. Model-based systems 4. Case Based Reasoning (CBR) 5. Inductive learning 6. Probabilistic reasoning

Expert Systems  Knowledge is represented via rules  Rules express what happens when certain conditions are met: If…Then statements, implications A → B  Rules tell the expert system what to do in certain circumstances

Knowledge Representation  Recommendations Take set of inputs and output advice Example: If alarm AND smoke then tell people go outside  Directives Take set of inputs and output direct action Example:If smoke AND fire then go outside  Relations Take set of inputs and output information Example: If temperature below 32° then weather is cold

Rule-based Systems  Rule-based/Production system Use rules to provide recommendations/diagnoses Use rules to determine course of action Use rules to solve a particular problem  Consists of: Knowledge base – database of rules Database of facts Interpreter/ inference engine

Architecture

Elements of an Expert System  User interface – mechanism by which the user and the system communicate.  Explanation facility – explains the reasoning of the expert system to the user.  Working memory – global database of facts used by the rules.  Inference engine – makes inferences by deciding which rules are satisfied and prioritizing them.

Inference engine Conclusions derived using deduction Forward chaining – using deduction from set of antecedents Backward chaining – starts with conclusion and works towards logical set of antecedents

Forward Chaining  Data-driven reasoning Starts with data set and moves towards conclusion When all antecedents matched, rule is triggered and conclusion added to facts database  Conflict resolution Occurs when more than one conclusion deduced from facts

Conflict resolution: 1. Priority resolution  Each rule given a priority level  Rule with highest priority triggered  Example: IF patient has pain THEN prescribe painkillers (priority 10) If patient has chest pain THEN treat for heart disease (priority 100)

Conflict resolution: 2. Longest-matching strategy  Conclusion deduced from longest rule triggered  Example: If patient has pain THEN prescribe painkillers If patient has chest pain AND patient is over 60 AND patient has history of heart conditions THEN take to emergency room

Conflict resolution: 3. Most recent match  Rule that matches facts most recently added to database fired  Example: If patient has pain THEN prescribe aspirin (entered 10/17/1975) If patient has pain THEN prescribe acetaminophen (entered 11/1/2000)
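These three strategies can be sketched as simple selection keys over the set of triggered rules. A minimal illustration (not from the slides; the rule data below, including priorities and timestamps, are hypothetical):

```python
# Minimal sketch: picking one rule to fire among several triggered rules,
# using the three conflict-resolution strategies described above.
# A hypothetical triggered rule: (priority, number_of_antecedents, year_entered, conclusion)
triggered = [
    (10,  1, 1975, "prescribe painkillers"),
    (100, 1, 1980, "treat for heart disease"),
    (10,  3, 2000, "take to emergency room"),
]

by_priority = max(triggered, key=lambda r: r[0])   # 1. highest priority wins
by_length   = max(triggered, key=lambda r: r[1])   # 2. longest matching rule wins
by_recency  = max(triggered, key=lambda r: r[2])   # 3. most recently entered rule wins

print(by_priority[3], "|", by_length[3], "|", by_recency[3])
```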

Backward Chaining Goal-driven reasoning Starts with a conclusion/hypothesis and works backwards to show that the hypothesis can be reached from the rules and facts in the database Used in formulating plans

FC vs BC  Forward chaining appropriate when: A set of facts is available but the conclusion is unknown There are many possible conclusions  Backward chaining appropriate when: There are few possible conclusions but many possible facts Many of the possible facts are not relevant to the conclusion

FC vs BC Example  Rules: 1. A ∧ B → C 2. A → D 3. C ∧ D → E 4. B ∧ E ∧ F → G 5. A ∧ E → H 6. D ∧ E ∧ H → I  Facts: A, B, F  Goal = H
Forward Chaining:
Facts | Rules triggered | Rule fired
A,B,F | 1, 2 | 1
A,B,C,F | 2 | 2
A,B,C,D,F | 3 | 3
A,B,C,D,E,F | 4, 5 | 4
A,B,C,D,E,F,G | 5 | 5
A,B,C,D,E,F,G,H | Goal reached!
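The same trace can be reproduced with a small forward-chaining loop. A minimal sketch (not from the slides), with these rules and facts hard-coded and the simplest conflict-resolution strategy (fire the lowest-numbered triggered rule):

```python
# Minimal forward-chaining sketch for the rules and facts above.
# Each rule is (antecedents, conclusion).
rules = [
    ({"A", "B"}, "C"),       # 1. A and B -> C
    ({"A"}, "D"),            # 2. A -> D
    ({"C", "D"}, "E"),       # 3. C and D -> E
    ({"B", "E", "F"}, "G"),  # 4. B and E and F -> G
    ({"A", "E"}, "H"),       # 5. A and E -> H
    ({"D", "E", "H"}, "I"),  # 6. D and E and H -> I
]
facts = {"A", "B", "F"}
goal = "H"

while goal not in facts:
    # collect rules whose antecedents hold and whose conclusion is still new
    triggered = [(i, c) for i, (ants, c) in enumerate(rules, 1)
                 if ants <= facts and c not in facts]
    if not triggered:
        break                              # nothing left to fire, goal unreachable
    fired, conclusion = triggered[0]       # fire the lowest-numbered triggered rule
    print(f"facts={sorted(facts)} triggered={[i for i, _ in triggered]} fired={fired}")
    facts.add(conclusion)

print("Goal reached!" if goal in facts else "Goal not reachable")
```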

FC vs BC Example  Rules: 1. A ∧ B → C 2. A → D 3. C ∧ D → E 4. B ∧ E ∧ F → G 5. A ∧ E → H 6. D ∧ E ∧ H → I  Facts: A, B, F  Goal = H
Backward Chaining:
Facts | Goals | Matching rule
A,B,F | H | 5
A,B,F | E | 3
A,B,F | C, D | 1
A,B,C,F | D | 2
A,B,C,D,F | STOP!
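A matching backward-chaining sketch (again illustrative only, not from the slides), which recursively tries to prove the goal from the same rules and facts:

```python
# Minimal backward-chaining sketch for the same rules, facts and goal.
rules = [
    ({"A", "B"}, "C"), ({"A"}, "D"), ({"C", "D"}, "E"),
    ({"B", "E", "F"}, "G"), ({"A", "E"}, "H"), ({"D", "E", "H"}, "I"),
]
facts = {"A", "B", "F"}

def prove(goal, depth=0):
    """Try to establish `goal`, working backwards from conclusions to antecedents."""
    if goal in facts:
        return True
    for i, (antecedents, conclusion) in enumerate(rules, 1):
        if conclusion == goal:
            print("  " * depth + f"goal {goal}: trying rule {i}")
            if all(prove(a, depth + 1) for a in antecedents):
                facts.add(goal)      # goal proven, remember it as a new fact
                return True
    return False                     # goal is not a fact and no rule concludes it

print("Goal H proven:", prove("H"))
```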

Example: rule-based diagnosis

Forward chaining Working memory: 1. Headlights don't work 2. Faulty bulbs and/or battery 3. Faulty battery Rules fired: 1, 2 At first no rule fully matches, but rule 2 is close, so the user is queried for the status of the engine. Response: the engine doesn't start. No further rules match → faulty battery

Problem!!! Working memory: 1. Headlights don't work 2. Faulty bulbs and/or battery 3. Faulty battery 4. Wipers work Rules fired: 1, 2 No further rules match → faulty battery The diagnosis "faulty ignition AND faulty bulbs" is not covered by the knowledge base. BUT! By adding this fact…

The Limitations of Rules  The success of rule-based expert systems is due to several factors: They can mimic some human problem-solving strategies Rules are a part of everyday life, so people can relate to them  However, a significant limitation is the knowledge elicitation bottleneck Experts may be unable to articulate their expertise Heuristic knowledge is particularly difficult Experts may be too busy…

Challenges  The knowledge must be elicited from the experts.  Representing the knowledge as efficient rules.  Choosing the appropriate inference mechanism.  Diagnosing multiple faulty components.

Outline 1. What is a diagnosis? 2. Expert systems 3. Model-based systems 4. Case Based Reasoning (CBR) 5. Inductive learning 6. Probabilistic reasoning

Model-based systems  Expert systems – a complete model is unavailable (medical diagnosis)  Model-based systems – the physical principles are largely known (automobile diagnosis)  Rules are formulated to capture the causal/functional knowledge, describing the properly working components. For instance: if ok(battery) AND ok(ignition) THEN start(engine)  If the engine doesn't start, a truth maintenance system (TMS) infers that the battery or the ignition is faulty.

Scenario – example for ATMS reasoning  A goes to B's birthday party.  B is a woman.  All women love flowers.  A decides to give flowers to B.  C tells A that B is allergic to flowers.  A buys B chewing gum instead.

Some logic…  C = common knowledge (facts & rules)  A = assumptions  D = C ∪ A (the database)  Explanation for p: a minimal E' ⊆ A such that E' ∪ C ⊨ p  Suppose a new fact q. Since D contains assumptions that may contradict q, we should identify them.  We find all explanations for ¬q in D: {E'_1, E'_2, …, E'_n}.  Then we find a minimal set H such that ∀ E'_i ∈ {E'_1, E'_2, …, E'_n}: H ∩ E'_i ≠ {} (a hitting set).  H is the minimal set of assumptions that contradicts q.

Example for MBD

Example for MBD  Observations (facts): ¬works(headlights) ∧ ¬starts(engine) ∧ works(wipers). We should find explanations for works(headlights) and starts(engine).  Explanations: E'_1(works(headlights)) = {ok(battery), ok(bulbs)} E'_2(starts(engine)) = {ok(battery), ok(ignition)}  Diagnoses (minimal hitting sets): H = {ok(battery)} or H = {ok(bulbs), ok(ignition)}
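A brute-force sketch of the hitting-set computation for this example (illustrative only; GDE and related engines use more efficient algorithms than this exhaustive enumeration):

```python
from itertools import chain, combinations

# Explanation sets from the example: the assumptions supporting the
# violated expectations works(headlights) and starts(engine).
explanations = [
    {"ok(battery)", "ok(bulbs)"},     # E'_1 for works(headlights)
    {"ok(battery)", "ok(ignition)"},  # E'_2 for starts(engine)
]
assumptions = set().union(*explanations)

def hitting_sets(sets, universe):
    """All subsets of `universe` that intersect every set in `sets` (brute force)."""
    subsets = chain.from_iterable(combinations(universe, r) for r in range(len(universe) + 1))
    return [set(s) for s in subsets if all(set(s) & e for e in sets)]

hits = hitting_sets(explanations, assumptions)
# keep only the minimal ones: no strict subset is also a hitting set
minimal = [h for h in hits if not any(other < h for other in hits)]
print(minimal)   # expected: {'ok(battery)'} and {'ok(bulbs)', 'ok(ignition)'}
```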

MBD – drawbacks  It is not always feasible to build an accurate model  Inferring the diagnosis is computationally intractable in complex systems.  Improvements: Less accurate model Less accurate diagnosis

Outline 1. What is a diagnosis? 2. Expert systems 3. Model-based systems 4. Case Based Reasoning (CBR) 5. Inductive learning 6. Probabilistic reasoning

Another Way We Solve Problems?  By remembering how we solved a similar problem in the past  This is Case Based Reasoning (CBR) memory-based problem-solving re-using past experiences  Experts often find it easier to relate stories about past cases rather than to formulate rules

Databases  Database technology would seem ideally suited to the task of retrieving known solutions to problems  Databases are excellent at finding exact matches…  But are poor at near or fuzzy matches

The CBR Cycle: New Problem → Retrieve → Similar cases → Adapt → Solution → Review → Retain

R4 Cycle  Retrieve the cases from the case-base whose problems are most similar to the new problem.  Reuse the solutions from the retrieved cases to create a proposed solution for the new problem.  Revise the proposed solution to take account of the differences between the new problem and the problems in the retrieved cases.  Retain the new problem and its revised solution as a new case for the case-base, if appropriate.

CBR Assumption(s)  The main assumption is that: Similar problems have similar solutions:  e.g., an aspirin can be taken for any mild pain  Two other assumptions: The world is a regular place: what holds true today will probably hold true tomorrow  (e.g., if you have a headache, you take aspirin, because it has always helped) Situations repeat: if they do not, there is no point in remembering them  (e.g., it helps to remember how you found a parking space near that restaurant)

Problems We Solve This Way  Medicine: a doctor remembers previous patients, especially for rare combinations of symptoms  Law: English/US law depends on precedent; case histories are consulted  Management: decisions are often based on past rulings  Finance: performance is predicted from past results

Good / Bad Applications for CBR  Classification tasks (good for CBR) Diagnosis - what type of fault is this? Prediction / estimation - what happened when we saw this pattern before?  Synthesis tasks (harder for CBR) Engineering Design Planning Scheduling

Retrieval by indices Scoring function: DB of 3 cases:

Retrieval by indices Degree of match is the sum of score: New DB:

Car Diagnosis Example  Symptoms are observed Engine does not start Battery voltage = 7v  Goal Cause of failure: flat battery Repair strategy: charge battery

Car Diagnosis Case  Each case describes one diagnostic situation: it is described by a list of features and contains a list of feature values.
Case 1 – Problem: Symptom: headlight does not work; Car: Ford Mondeo; Year: 2001; Battery: 10.4v; Headlights: undamaged; HeadlightSwitch: on. Solution: Diagnosis: headlight fuse blown; Repair: replace headlight fuse.

Car Diagnosis Case-Base  A collection of independent cases.
Case 1 – Problem: Symptom: headlight does not work; Car: Ford Mondeo; Year: 2001; Battery: 10.4v; Headlights: undamaged; HeadlightSwitch: on. Solution: Diagnosis: headlight fuse blown; Repair: replace headlight fuse.
Case 2 – Problem: Symptom: headlight does not work; Car: Ford Ka; Year: 2003; Battery: 9.5v; Headlights: surface damage; HeadlightSwitch: on. Solution: Diagnosis: defective bulb; Repair: replace headlight.

Case Representation  Depends on problem domain  Flat structure A list of feature values (car diagnosis example)  Easy to store and retrieve  Specialised representations Graphs - nodes and arcs Plans - partially ordered set of actions Object-oriented - objects (instances of classes)  More difficult to store and retrieve

Case Representation  Object-oriented representation: A case is a set of objects An object is described by a set of features  Classes are arranged in a hierarchy Relations between objects (e.g. part-of) Combine similarities of parts  Example hierarchy: Car – Brakes, Engine, Transmission; Engine – Ignition System, Fuel Injection; Ignition System – Coil, Spark Plug (Colour: dark grey, Gap: 1.2mm)

New Car Diagnosis Problem  A new problem is a case without a solution part  Not all problem features must be known (the same holds for cases)
New – Problem: Symptom: brakelight does not work; Car: Ford Fiesta; Year: 1997; Battery: 9.2v; Headlights: undamaged; HeadlightSwitch: ?

Calculating Feature Similarity  Distances between the values of individual features: the problem and the case have values p and c for feature f  Distance for numeric features: d_f(problem, case) = |p - c| / (max difference)  Distance for symbolic features: d_f(problem, case) = 0 if p = c, 1 otherwise  Similarity_f(problem, case) = 1 - d_f(problem, case)  The degree of similarity is between 0 and 1

Calculating Case Similarity Similarity(problem, case) = weighted sum of Similarity_f(problem, case) over all features f High-importance features have a large weight: symptom, battery, headlights – weight = 6 Low-importance features have a low weight: car, year – weight = 1 Case similarity = (w_1·s_1 + w_2·s_2 + … + w_n·s_n) / (w_1 + w_2 + … + w_n), where s_i is the similarity of the i-th feature and w_i is the weight of the i-th feature
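A small sketch of this weighted retrieval step (illustrative only; the weights and case values follow the car-diagnosis example, while the normalization ranges for numeric features and the simple 0/1 symbolic match are assumptions, so the result differs from the graded feature similarities used on the following slides):

```python
# Weighted nearest-neighbour retrieval sketch for the car-diagnosis example.
# Feature weights follow the slides; max differences for numeric features are assumed.
WEIGHTS = {"Symptom": 6, "Car": 1, "Year": 1, "Battery": 6, "Headlights": 6}
MAX_DIFF = {"Year": 10, "Battery": 5.0}          # assumed normalization ranges

new_problem = {"Symptom": "brakelight does not work", "Car": "Ford Fiesta",
               "Year": 1997, "Battery": 9.2, "Headlights": "undamaged"}
case1 = {"Symptom": "headlight does not work", "Car": "Ford Mondeo",
         "Year": 2001, "Battery": 10.4, "Headlights": "undamaged"}

def feature_similarity(f, p, c):
    if isinstance(p, (int, float)):                 # numeric feature: normalized distance
        return 1 - abs(p - c) / MAX_DIFF[f]
    return 1.0 if p == c else 0.0                   # symbolic feature: exact match only

def case_similarity(problem, case):
    num = sum(WEIGHTS[f] * feature_similarity(f, problem[f], case[f]) for f in problem)
    return num / sum(WEIGHTS[f] for f in problem)   # divide by the total weight

print(round(case_similarity(new_problem, case1), 2))
```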

New Problem and Case 1
New Problem: Symptom: brakelight does not work; Car: Ford Fiesta; Year: 1997; Battery: 9.2v; Headlights: undamaged; HeadlightSwitch: ?
Case 1 – Problem: Symptom: headlight does not work; Car: Ford Mondeo; Year: 2001; Battery: 10.4v; Headlights: undamaged; HeadlightSwitch: on. Solution: Diagnosis: headlight fuse blown; Repair: replace headlight fuse.
Weights: Symptom, Battery, Headlights = 6; Car, Year = 1.
Similarity(New, Case 1) = (6·s_Symptom + 1·s_Car + 1·s_Year + 6·s_Battery + 6·s_Headlights) / 20 = 17.4 / 20 = 0.87

New Problem and Case 2
New Problem: Symptom: brakelight does not work; Car: Ford Fiesta; Year: 1997; Battery: 9.2v; Headlights: undamaged; HeadlightSwitch: ?
Case 2 – Problem: Symptom: headlight does not work; Car: Ford Ka; Year: 2003; Battery: 9.5v; Headlights: surface damage; HeadlightSwitch: on. Solution: Diagnosis: defective bulb; Repair: replace headlight.
Weights: Symptom, Battery, Headlights = 6; Car, Year = 1.
Similarity(New, Case 2) = (6·s_Symptom + 1·s_Car + 1·s_Year + 6·s_Battery + 6·s_Headlights) / 20 = 0.59

Reuse Solution from Case 1
New Problem: Symptom: brakelight does not work; Car: Ford Fiesta; Year: 1997; Battery: 9.2v; Headlights: undamaged; HeadlightSwitch: ?
Case 1 – Problem: Symptom: headlight does not work; … Solution: Diagnosis: headlight fuse blown; Repair: replace headlight fuse.
Solution to New Problem: Diagnosis: headlight fuse blown; Repair: replace headlight fuse.
After Adaptation: Diagnosis: brakelight fuse blown; Repair: replace brakelight fuse.

Characteristics and drawbacks  Characteristics: 1. No accurate model 2. No rule-based representation; instead, a repository of cases.  Drawbacks: 1. High computational cost of retrieval and DB organization. 2. No guarantee of providing a complete diagnosis

Outline 1. What is a diagnosis? 2. Expert systems 3. Model-based systems 4. Case Based Reasoning (CBR) 5. Inductive learning 6. Probabilistic reasoning

Inductive learning systems  The previous approaches try to model the system, and diagnosis is inferred from the model.  In the inductive learning approach the model is learned from examples, and diagnosis is inferred from the learned relationships between symptoms and diagnoses.  The system induces an appropriate set of classification rules.

Formally: (I k,C k ) In the domain of diagnosis: Challenge: to learn an accurate description of a class from a representative set of examples. C k ∈ C (C: set of possible classes) I k : instance like vector of attributes I k : observed symptoms C k ∈ C (C: set of possible diagnosis)

Training set In this representation multiple faults are not addressed. For some cases, multiple faults could be handled by sequentially isolating single faults.

Multiple fault - training set In this representation we need a much larger training set

Generalization  Generalization is the ability of the inductive learning system to classify new instances.  Factors: The size of the training set. The distribution of the training examples. The representation of the instances and classes.

Decision tree (ID3)  ID3 algorithm provides a way to learn classification rules represented by decision tree.  Leaf nodes: represent the classes (diagnoses).  Internal nodes: represent the attributes (symptoms).

ID3 algorithm  The goal of ID3 is to generate a decision tree that is as small as possible.  A small tree requires fewer attribute tests.  Recursively select the attributes that yield the maximum information gain.

ID3 algorithm Let n_1 and n_2 be the numbers of examples in the training set that belong to class 1 and class 2 (this generalizes to n_3, …). The information needed to classify an example is (Eq. 1): I(n_1, n_2) = -(n_1/(n_1+n_2))·log2(n_1/(n_1+n_2)) - (n_2/(n_1+n_2))·log2(n_2/(n_1+n_2)) The expected information when attribute A is chosen as the root is (Eq. 2): E(A) = Σ_i ((n_1i + n_2i)/(n_1+n_2)) · I(n_1i, n_2i) where n_1i and n_2i are the numbers of training examples that belong to class 1 and class 2, respectively, and have value A_i for attribute A.

ID3 algorithm The information gained by testing the value of attribute A is (Eq. 3): gain(A) = I(n_1, n_2) - E(A) Recursively, at each node of the tree, ID3 computes the information gain of the untested attributes and chooses the attribute with the maximum information gain.
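A short sketch of these three equations (illustrative only; the six-example training set below is an invented stand-in for the table on slide 59, keeping only its three attributes, four classes and (1, 1, 2, 2) class distribution):

```python
import math
from collections import Counter

# Assumed training set: symptom attributes -> fault class (invented for illustration).
examples = [
    ({"Engine": "starts",  "Headlights": "don't work", "Wipers": "work"},       "faulty bulbs"),
    ({"Engine": "doesn't", "Headlights": "don't work", "Wipers": "don't work"}, "faulty battery"),
    ({"Engine": "starts",  "Headlights": "work",       "Wipers": "don't work"}, "faulty wiper motor"),
    ({"Engine": "starts",  "Headlights": "work",       "Wipers": "don't work"}, "faulty wiper motor"),
    ({"Engine": "doesn't", "Headlights": "work",       "Wipers": "work"},       "faulty ignition"),
    ({"Engine": "doesn't", "Headlights": "work",       "Wipers": "work"},       "faulty ignition"),
]

def info(labels):
    """Eq. 1, generalized to any number of classes: I(n_1, n_2, ...)."""
    counts, n = Counter(labels), len(labels)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

def expected_info(attribute):
    """Eq. 2: weighted information of the partition induced by the attribute."""
    values = {x[attribute] for x, _ in examples}
    total = 0.0
    for v in values:
        subset = [label for x, label in examples if x[attribute] == v]
        total += len(subset) / len(examples) * info(subset)
    return total

labels = [label for _, label in examples]
for attribute in ("Engine", "Headlights", "Wipers"):
    gain = info(labels) - expected_info(attribute)     # Eq. 3
    print(attribute, round(gain, 3))
```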

Example: ID3 algorithm (based on the training set on slide 59)  3 attributes: Engine, Headlights, Wipers  4 classes: 1. Faulty bulbs 2. Faulty battery 3. Faulty wiper motor 4. Faulty ignition  The distribution of the examples is (1, 1, 2, 2)  Equation 1: I(1,1,2,2) = -2·(1/6)·log2(1/6) - 2·(2/6)·log2(2/6) ≈ 1.92

Example: ID3 algorithm (based on the training set on slide 59)  Equation 2: the expected information E(A) is computed for each attribute.  Equation 3: the gain of each attribute is computed, and the attribute with the maximum gain becomes the root.

Example: ID3 algorithm (based on the training set on slide 59)  The resulting tree is much more compact than the one on slide 61.  We can construct rules for an expert system from it.

Outline 1. What is a diagnosis? 2. Expert systems 3. Model-based systems 4. Case Based Reasoning (CBR) 5. Inductive learning 6. Probabilistic reasoning

Probabilistic Reasoning  When we do not know the truth value of an attribute (symptom) for certain, we can model its uncertainty.  The training set is used to obtain the distributions of the data.  The chosen class (fault) is the most likely one given the attribute values.

Probabilistic Reasoning - Notation  X = {X_1, X_2, …, X_n}: the set of observation (symptom) variables Example: headlights, engine and wipers.  Each X_i takes a value x_i, giving the observed values x = {x_1, x_2, …, x_n} Example: X_2 (engine) takes the value "starts".  P(x): the probability that the random variables in X take the values x.  P(x_i): the probability that the random variable X_i takes the value x_i.  {C_1, C_2, …, C_m}: the set of classes (possible faults). Example: faulty bulbs, faulty battery, faulty wiper motor, faulty ignition.

Model  Given a particular input x, by using Bayes' rule we can calculate the likelihood of each class C_i (Eq. 6): P(C_i | x) = P(x | C_i)·P(C_i) / P(x) where P(x) = Σ_{j=1..m} P(x | C_j)·P(C_j).  Now, the classification of x can be performed by choosing the class C_k that maximizes the likelihood (Eq. 7): C_k = argmax_{C_i} P(C_i | x)

Model  For random variables X_1, X_2, X_3 observed to take the values x_1, x_2, x_3, Eq. 6 becomes (Eq. 8): P(C_i | x_1, x_2, x_3) = P(x_1, x_2, x_3 | C_i)·P(C_i) / P(x_1, x_2, x_3)  We should calculate the joint conditional and unconditional probabilities.  For a large number of attributes this is intractable.

Model  Assumption: conditional independence of the observations (symptoms) given the class: P(x_1, x_2, x_3 | C_i) = P(x_1 | C_i)·P(x_2 | C_i)·P(x_3 | C_i)  Bayes' rule for m classes involving 3 symptoms is then (Eq. 9): P(C_i | x_1, x_2, x_3) = P(x_1 | C_i)·P(x_2 | C_i)·P(x_3 | C_i)·P(C_i) / Σ_{j=1..m} P(x_1 | C_j)·P(x_2 | C_j)·P(x_3 | C_j)·P(C_j)
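A naive-Bayes sketch of Eq. 9 (illustrative only; it reuses the invented training set from the ID3 sketch above and adds Laplace smoothing to avoid zero counts, whereas the slides instead plug in causal knowledge for some of the conditionals):

```python
from collections import Counter, defaultdict

# Assumed training set (the same invented six examples as in the ID3 sketch).
examples = [
    ({"Engine": "starts",  "Headlights": "don't work", "Wipers": "work"},       "faulty bulbs"),
    ({"Engine": "doesn't", "Headlights": "don't work", "Wipers": "don't work"}, "faulty battery"),
    ({"Engine": "starts",  "Headlights": "work",       "Wipers": "don't work"}, "faulty wiper motor"),
    ({"Engine": "starts",  "Headlights": "work",       "Wipers": "don't work"}, "faulty wiper motor"),
    ({"Engine": "doesn't", "Headlights": "work",       "Wipers": "work"},       "faulty ignition"),
    ({"Engine": "doesn't", "Headlights": "work",       "Wipers": "work"},       "faulty ignition"),
]

class_counts = Counter(label for _, label in examples)
value_counts = defaultdict(Counter)            # (class, attribute) -> Counter of values
attribute_values = defaultdict(set)            # attribute -> set of possible values
for x, label in examples:
    for attribute, value in x.items():
        value_counts[(label, attribute)][value] += 1
        attribute_values[attribute].add(value)

def posterior(observation):
    """Eq. 9 with Laplace smoothing: P(C_i) * prod_j P(x_j | C_i), then normalized."""
    scores = {}
    for label, n_c in class_counts.items():
        score = n_c / len(examples)                                 # P(C_i) from the training set
        for attribute, value in observation.items():
            count = value_counts[(label, attribute)][value]         # examples of C_i with this value
            score *= (count + 1) / (n_c + len(attribute_values[attribute]))  # smoothed P(x_j | C_i)
        scores[label] = score
    total = sum(scores.values())                                    # denominator of Eq. 9
    return {label: s / total for label, s in scores.items()}

# The observation used in the following slides: wipers work, headlights and engine do not.
obs = {"Engine": "doesn't", "Headlights": "don't work", "Wipers": "work"}
probs = posterior(obs)
print(max(probs, key=probs.get), probs)
```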

Example – based on the training set on slide 59  From the training set we can infer the class priors P(C_i).  For Eq. 9 we also need the conditional probabilities of each of the symptoms.  For instance: Given faulty bulbs, the probability that the headlights will not work is 1: P(¬H|¬Bu) = 1 For P(¬E|¬Bu): since there is no causal relationship between the engine and the bulbs, we can say that P(¬E|¬Bu) = P(¬E) = 3/6 = 0.5

Example – based on the training set on slide 59 Given a certain input: wipers work (W), headlights don't work (¬H), engine doesn't start (¬E), the probability of faulty bulbs (¬Bu) is, by Eq. 9 (Eq. 10): P(¬Bu | ¬H, ¬E, W) = P(¬H|¬Bu)·P(¬E|¬Bu)·P(W|¬Bu)·P(¬Bu) / Σ_j P(¬H|C_j)·P(¬E|C_j)·P(W|C_j)·P(C_j)

Example – based on the training set on slide 59  The probabilities of the other fault classes are computed in the same way.  Faulty ignition is the diagnosis with the highest probability.

Conclusion  Diagnosis is a classification problem, where: symptoms = attributes and faults = classes.  We learned several approaches: 1. Expert systems. 2. Model-based diagnosis. 3. Case-based reasoning. 4. Inductive learning systems. 5. Probabilistic reasoning.