
1 ABML Knowledge Refinement Loop: A Case Study
Matej Guid, Martin Možina, Vida Groznik, Aleksander Sadikov, Ivan Bratko (Artificial Intelligence Laboratory, Faculty of Computer and Information Science, University of Ljubljana, Slovenia)
Dejan Georgiev, Zvezdan Pirtošek (Department of Neurology, University Medical Centre Ljubljana, Slovenia)
The 20th International Symposium on Methodologies for Intelligent Systems (ISMIS), 4-7 December 2012, Macau

2 About
Knowledge acquisition from data, using machine learning.
Goal of our approach: the acquired knowledge should make sense to a human expert, i.e. it is formulated in a human-like manner and is therefore easy for a human to understand.
ML technique: Argument Based Machine Learning, ABML (Možina, Žabkar, Bratko, 2007).
Case study in neurology: diagnosis of tremors.

3 Typical Problem with ML
Examples (patients) -> learning program -> induced knowledge (rules, ...).
The induced knowledge may be fine for classification (diagnosis), but it is hard for the expert to understand and may not provide a sensible explanation.

4 Illustration: Learning to Diagnose Pneumonia

Patient no. | Temperature | Coughing | Gender | Pneumonia
1           | Normal      | No       | Female | No
2           | High        | Yes      | Male   | Yes
3           | Very high   | No       | Male   | Yes

A typical rule learner would induce the rule: IF Gender = Male THEN Pneumonia = Yes.
This rule correctly diagnoses all patients, but it explains Patient 3 by: has pneumonia because he is male.
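The following is a minimal sketch in plain Python (not the actual rule learner used in the talk) of why a purely data-driven learner is content with the gender rule: it covers every positive patient and no negative one, so by coverage alone it looks perfect.

```python
# Toy version of the pneumonia table from this slide; the rule representation
# (a dict of attribute -> required value) is ours, purely for illustration.
patients = [
    {"id": 1, "temperature": "normal",    "coughing": "no",  "gender": "female", "pneumonia": "no"},
    {"id": 2, "temperature": "high",      "coughing": "yes", "gender": "male",   "pneumonia": "yes"},
    {"id": 3, "temperature": "very high", "coughing": "no",  "gender": "male",   "pneumonia": "yes"},
]

def covers(conditions, patient):
    """A patient is covered when all of the rule's conditions match."""
    return all(patient[attr] == value for attr, value in conditions.items())

gender_rule = {"gender": "male"}          # IF Gender = Male THEN Pneumonia = Yes
covered = [p for p in patients if covers(gender_rule, p)]
print("covered patients:", [p["id"] for p in covered])                              # [2, 3]
print("all covered are positive:", all(p["pneumonia"] == "yes" for p in covered))   # True
# Perfect coverage, yet the explanation "has pneumonia because he is male" is useless to a physician.
```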

5 How to Fix This with ABML?
A possible expert's explanation for Patient 2: Patient 2 suffers from pneumonia because he has high temperature.
In ABML we say: the expert annotated this case with a positive argument.
The previous rule (IF Gender = Male THEN Pneumonia = Yes) is now not consistent with the argument.
(Patient table as on slide 4.)

6 ABML Takes the Expert's Argument into Account
ABML induces a rule consistent with the expert's argument: IF Temperature > Normal THEN Pneumonia = Yes.
This rule explains Patient 3 by: has pneumonia because he has very high temperature.
(Patient table as on slide 4.)
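A hedged sketch of the constraint this slide relies on: a rule used to explain the argued example must include the argument's reasons among its own conditions. This is a simplification of what ABCN2 does, and the rule and argument encodings below are our own.

```python
# Conditions are (attribute, operator, value) triples; "temp_level" stands for
# Temperature mapped to an ordinal scale (normal = 0, high = 1, very high = 2).
gender_rule = [("gender", "==", "male")]          # rule a plain learner found
temp_rule   = [("temp_level", ">", 0)]            # rule consistent with the argument

# The expert's positive argument for Patient 2: "... because he has high temperature".
argument = {"example_id": 2, "conditions": [("temp_level", ">", 0)]}

def consistent_with_argument(rule, argument):
    """Simplified ABML-style constraint: a rule explaining the argued example
    must contain the argument's conditions in its condition part."""
    return all(cond in rule for cond in argument["conditions"])

print(consistent_with_argument(gender_rule, argument))   # False -> rejected as an explanation of Patient 2
print(consistent_with_argument(temp_rule, argument))     # True  -> IF Temperature > Normal THEN Pneumonia = Yes
```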

7 Point of This Illustration
By annotating one chosen example, the expert leads the system to induce a rule compatible with the expert's knowledge.
The rule also covers other learning examples.
The rule enables sensible explanations of diagnoses.

8 Rest of Talk (by Matej Guid)
How to use ABML in an interaction loop between expert and system?
Case study in neurology.
Experimental evaluation of this approach.

9 Knowledge Elicitation with ABML
ABML (argument-based machine learning): the expert's arguments constrain learning, so the obtained models are consistent with expert knowledge.
The interaction exchanges arguments, critical examples, and counter examples; the experts also introduce new concepts (attributes), and the result is human-understandable models (rules of the form IF ... THEN ...).
Reference: Možina M. et al., Fighting Knowledge Acquisition Bottleneck with Argument Based Machine Learning, ECAI 2008.
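As a rough illustration of the moving parts named on this slide, here is one possible representation of the expert's arguments and the evolving elicitation state. The names and structure are hypothetical, not taken from the ABML implementation.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Condition = Tuple[str, str, object]   # (attribute, operator, value), e.g. ("HARMONICS", "==", True)

@dataclass
class Argument:
    example_id: str            # the critical example the expert annotated, e.g. "E.65"
    target_class: str          # the class the reasons speak for, e.g. "ET" or "PT"
    conditions: List[Condition]

@dataclass
class ElicitationState:
    arguments: List[Argument] = field(default_factory=list)    # grows with every loop iteration
    new_attributes: List[str] = field(default_factory=list)    # concepts the expert introduces, e.g. "BRADYKINESIA"
```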

10 Differentiating Tremors: Domain Description
Three diagnostic classes: Parkinsonian tremor (PT, 50 patients), essential tremor (ET, 23 patients), and mixed tremor (MT, 41 patients).
Features characteristic of PT: resting tremor, bradykinesia, rigidity, Parkinsonian spiral drawings, ...
Features characteristic of ET: postural tremor, kinetic tremor, harmonics, essential spiral drawings, ...
Data set of 114 patients: learning set of 47 examples, test set of 67 examples. The patients were described by 45 attributes.
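Purely for orientation, a synthetic stand-in for the data setup described on this slide. The class counts (as read from the slide layout), the 45 attributes, and the 47/67 split come from the slide; the attribute values below are random placeholders, not the real patient data.

```python
import random

random.seed(0)
labels = ["PT"] * 50 + ["ET"] * 23 + ["MT"] * 41             # 114 patients in total
patients = [{"class": label, **{f"attr_{i}": random.randint(0, 4) for i in range(45)}}
            for label in labels]                              # 45 attributes per patient (synthetic values)
random.shuffle(patients)
learning_set, test_set = patients[:47], patients[47:]         # 47 learning + 67 test examples
print(len(learning_set), len(test_set))                       # 47 67
```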

11 ABML Knowledge Refinement Loop
Step 1: Learn a hypothesis with ABML.
Step 2: Find the most critical example (if none is found, stop).
Step 3: The expert explains the example.
Return to Step 1.
(Diagram: the data set and the argument attached to the critical example feed the ABML learner.)

12 ABML Knowledge Refinement Loop
Step 1: Learn a hypothesis with ABML.
Step 2: Find the most critical example (if none is found, stop).
Step 3: The expert explains the example:
Step 3a: Explaining the critical example (in natural language).
Step 3b: Adding arguments to the example.
Step 3c: Discovering counter examples.
Step 3d: Improving arguments with counter examples (return to Step 3c if a counter example is found).
Return to Step 1.
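A compact sketch of this loop as code. The learner, the critical-example search, and the counter-example search are passed in as placeholder callables (they stand for the ABML/ABCN2 machinery and are not implemented here); `expert` stands for the interactive steps performed by the neurologist.

```python
def abml_refinement_loop(data, learn, most_critical_example, find_counter_example, expert):
    """Skeleton of the refinement loop; all parameters except `data` are placeholder callables."""
    arguments = []
    while True:
        model = learn(data, arguments)                         # Step 1: learn a hypothesis with ABML
        critical = most_critical_example(model, data)          # Step 2: most critical example
        if critical is None:
            return model                                       # stop when no critical example is left
        explanation = expert.explain(critical)                 # Step 3a: natural-language explanation
        argument = expert.to_argument(explanation, critical)   # Step 3b: attach arguments to the example
        while True:
            counter = find_counter_example(data, critical, argument)   # Step 3c
            if counter is None:
                break
            argument = expert.improve(argument, counter)       # Step 3d: refine using the counter example
        arguments.append((critical, argument))                 # return to Step 1 with the new argument
```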

13 First Critical Example
E.65 (class MT): the initial model could not explain why this example is from class MT.
Question to the expert: which features are in favor of ET and which features are in favor of PT?
Attribute values of E.65 (excerpt):
BRADYKINESIA.LEFT = 4, BRADYKINESIA.RIGHT = 4
KINETIC.TREMOR.UP.LEFT = 1, KINETIC.TREMOR.UP.RIGHT = 0
POSTURAL.TREMOR.UP.LEFT = 1, POSTURAL.TREMOR.UP.RIGHT = 0
QUALITATIVE.SPIRAL = ?
RESTING.TREMOR.UP.LEFT = 0, RESTING.TREMOR.UP.RIGHT = 0
RIGIDITY.UP.LEFT = 3, RIGIDITY.UP.RIGHT = 3
BARE.LEFT.FREQ.HARMONICS = no, BARE.RIGHT.FREQ.HARMONICS = no
TEMPLATE.LEFT.FREQ.HARMONICS = yes, TEMPLATE.RIGHT.FREQ.HARMONICS = no
...

14 First Critical Example (continued)
E.65 (class MT), attribute values as on the previous slide.
The expert introduced new attributes, which for E.65 take the values HARMONICS = true and BRADYKINESIA = true.
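The exact definitions of the new attributes are not given on the slide; the sketch below shows one plausible way such aggregate attributes could be derived from the raw measurements. The aggregation rules are our assumption; only the attribute names and E.65's values come from the slides.

```python
def add_derived_attributes(patient):
    """Hypothetical aggregation: HARMONICS if any harmonics measurement is positive,
    BRADYKINESIA if either side has a non-zero bradykinesia score."""
    patient["HARMONICS"] = any(
        patient.get(a) == "yes"
        for a in ("BARE.LEFT.FREQ.HARMONICS", "BARE.RIGHT.FREQ.HARMONICS",
                  "TEMPLATE.LEFT.FREQ.HARMONICS", "TEMPLATE.RIGHT.FREQ.HARMONICS"))
    patient["BRADYKINESIA"] = (patient.get("BRADYKINESIA.LEFT", 0) > 0
                               or patient.get("BRADYKINESIA.RIGHT", 0) > 0)
    return patient

# Values for E.65 taken from the previous slide (excerpt).
e65 = {"BRADYKINESIA.LEFT": 4, "BRADYKINESIA.RIGHT": 4,
       "BARE.LEFT.FREQ.HARMONICS": "no", "BARE.RIGHT.FREQ.HARMONICS": "no",
       "TEMPLATE.LEFT.FREQ.HARMONICS": "yes", "TEMPLATE.RIGHT.FREQ.HARMONICS": "no"}
add_derived_attributes(e65)
print(e65["HARMONICS"], e65["BRADYKINESIA"])     # True True, matching the slide
```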

15 Counter Examples
Arguments attached to the critical example E.65: ET: HARMONICS = true; PT: BRADYKINESIA = true.
Counter examples: E.67 (PT) and E.12 (ET).
What is the most important feature in favor of ET in E.65 that does not apply to E.67? Answer: an error in the data; HARMONICS in E.67 was now changed to false.
What is the most important feature in favor of PT in E.65 that does not apply to E.12? Answer: a misdiagnosis; some arguments in favor of PT in E.12 had been overlooked, and E.12 was changed from ET to MT.
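A sketch of the counter-example search behind this slide: an example of a different class whose values also satisfy the argument attached to the critical example. The function and field names are ours; the attribute values for E.67 and E.12 are placeholders chosen to be consistent with the slide's story, and E.70 is an invented non-matching case.

```python
def find_counter_examples(conditions, target_class, examples):
    """Examples from another class that nevertheless satisfy all argument conditions."""
    def satisfies(ex):
        return all(ex.get(attr) == value for attr, value in conditions)
    return [ex for ex in examples if ex["class"] != target_class and satisfies(ex)]

# Argument attached to E.65 (class MT): ET part HARMONICS = true, PT part BRADYKINESIA = true.
mt_argument = [("HARMONICS", True), ("BRADYKINESIA", True)]

examples = [
    {"id": "E.67", "class": "PT", "HARMONICS": True,  "BRADYKINESIA": True},   # HARMONICS later found to be a data error
    {"id": "E.12", "class": "ET", "HARMONICS": True,  "BRADYKINESIA": True},   # later re-labelled MT by the expert
    {"id": "E.70", "class": "PT", "HARMONICS": False, "BRADYKINESIA": True},   # invented case, not a counter example
]
print([ex["id"] for ex in find_counter_examples(mt_argument, "MT", examples)])  # ['E.67', 'E.12']
```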

16 Another Critical Example
E.61 (class MT): the model could not explain why this example is from class MT.
Question to the expert: which features are in favor of ET?
Attribute values of E.61 (excerpt):
BRADYKINESIA = false
KINETIC.TREMOR.UP.LEFT = 0, KINETIC.TREMOR.UP.RIGHT = 0
POSTURAL.TREMOR.UP.LEFT = 0, POSTURAL.TREMOR.UP.RIGHT = 3
QUALITATIVE.SPIRAL = Parkinsonian
RESTING.TREMOR.UP.LEFT = 0, RESTING.TREMOR.UP.RIGHT = 0
RIGIDITY.UP.LEFT = 0, RIGIDITY.UP.RIGHT = 1
HARMONICS = false
...
New derived attribute values for E.61: POSTURAL = true, RESTING = false.

17 Counter Examples
Argument attached to the critical example E.61: ET: POSTURAL = true AND RESTING = false.
Counter example: E.32 (PT).
What is the most important feature in favor of ET in E.61 that does not apply to E.32? Answer: the absence of BRADYKINESIA in E.61.
The argument in favor of ET in E.61 was thus extended to: POSTURAL = true AND RESTING = false AND BRADYKINESIA = false.
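To close the sub-loop for this critical example, the argument is extended until the counter example is no longer covered. In the sketch below, E.61's values come from the previous slides; for E.32 the slide only tells us that it is a PT case covered by the original argument and that it does show bradykinesia, so its values are chosen accordingly.

```python
def satisfies(example, conditions):
    return all(example.get(attr) == value for attr, value in conditions)

e61 = {"id": "E.61", "class": "MT", "POSTURAL": True, "RESTING": False, "BRADYKINESIA": False}
e32 = {"id": "E.32", "class": "PT", "POSTURAL": True, "RESTING": False, "BRADYKINESIA": True}

argument = [("POSTURAL", True), ("RESTING", False)]
print(satisfies(e32, argument))                             # True: E.32 is a counter example
argument.append(("BRADYKINESIA", False))                    # expert extends the argument (Step 3d)
print(satisfies(e32, argument), satisfies(e61, argument))   # False True: counter example resolved, E.61 still explained
```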

18-19 ABML Knowledge Refinement Loop (revisited)
The loop is run again with the new arguments and attributes: learn a hypothesis with ABML, find the most critical example (stop if none is found), let the expert explain it, and return to Step 1.

20 ABML Knowledge Refinement Loop: Steps 3a-3d on Example E.61
Step 3a: Explaining the critical example in natural language: the presence of postural tremor and the absence of resting tremor speak in favor of ET (raw values: POSTURAL.TREMOR.UP.LEFT = 0, POSTURAL.TREMOR.UP.RIGHT = 3, RESTING.TREMOR.UP.LEFT = 0, RESTING.TREMOR.UP.RIGHT = 0).
Step 3b: Adding arguments to the example: POSTURAL = true, RESTING = false.
Step 3c: Discovering counter examples: E.32 (PT) is covered by the argument ET: POSTURAL = true AND RESTING = false.
Step 3d: Improving the argument with the counter example: POSTURAL = true AND RESTING = false AND BRADYKINESIA = false.

21 The Final Model
The rules of the final model have pure class distributions.
During the process of knowledge elicitation: 15 arguments were given by the expert, 14 new attributes were included in the domain, and 21 attributes were excluded from the domain.

22 Consistency with the Expert's Knowledge
Rules were graded on a three-level scale:
2, Adequate: a rule consistent with expert knowledge, ready to be used as a strong argument in favor of ET or PT.
1, Reasonable: a rule consistent with expert knowledge, but insufficient to decide in favor of ET or PT on its basis alone.
0, Counter-intuitive: an illogical rule, in contradiction with expert knowledge.

23 The Final Model
The final model contains no counter-intuitive rules, and its rules have pure class distributions.
Classification accuracies on the test set:

Method                | Initial model | Final model
Naive Bayes           | 63%           | 74%
kNN                   | 58%           | 81%
Rule learning (ABCN2) | 52%           | 82%

The accuracies of all methods improved by adding the new attributes.
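For reference, the figures in the table can be turned into absolute gains with a couple of lines; the numbers are exactly those reported above, nothing is recomputed from the data.

```python
results = {                               # method: (initial model, final model), test-set accuracy
    "Naive Bayes": (0.63, 0.74),
    "kNN": (0.58, 0.81),
    "Rule learning (ABCN2)": (0.52, 0.82),
}
for method, (initial, final) in results.items():
    print(f"{method:22s} {initial:.0%} -> {final:.0%}  (+{final - initial:.0%})")
```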

24 ABML Refinement Loop and Knowledge Elicitation
ABML (argument-based machine learning) exchanges arguments, critical examples, and counter examples with the expert, producing models of the form IF ... THEN ...
Explaining a single example makes it easier for experts to articulate their knowledge.
Critical examples ensure that the expert provides only relevant knowledge.
Counter examples detect deficiencies in the explanations.

25 Questions and Answers
Dr. Matej Guid, Artificial Intelligence Laboratory, Faculty of Computer and Information Science, University of Ljubljana.
Web page: http://www.ailab.si/matej

