
1 A Technique for Advanced Dynamic Integration of Multiple Classifiers Alexey Tsymbal*, Seppo Puuronen**, Vagan Terziyan* *Department of Artificial Intelligence and Information Systems, Kharkov State Technical University of Radioelectronics, UKRAINE e-mail: vagan@kture.cit-ua.net, vagan@jytko.jyu.fi **Department of Computer Science and Information Systems, University of Jyvaskyla, FINLAND, e-mail: sepi@jytko.jyu.fi STeP’98 - Finnish AI Conference, 7-9 September, 1998

2 Finland and Ukraine

3 Metaintelligence Laboratory: Research Topics Knowledge and metaknowledge engineering; Multiple experts; Context in Artificial Intelligence; Data Mining and Knowledge Discovery; Temporal Reasoning; Metamathematics; Semantic Balance and Medical Applications; Distance Education and Virtual Universities.

4 Contents: What is Knowledge Discovery?; The Multiple Classifiers Problem; A Sample (Training) Set; A Sliding Exam of Classifiers as a Learning Technique; A Locality Principle; Nearest Neighbours and Distance Measure; Weighting Neighbours, Predicting Errors and Selecting Classifiers; Data Preprocessing; Some Examples

5 What is Knowledge Discovery? Knowledge discovery in databases (KDD) is a combination of data warehousing, decision support, and data mining, and it is an innovative approach to information management. KDD is an emerging area that considers the process of finding previously unknown and potentially interesting patterns and relations in large databases*. * Fayyad, U., Piatetsky-Shapiro, G., Smyth, P., Uthurusamy, R., Advances in Knowledge Discovery and Data Mining, AAAI/MIT Press, 1996.

6 The Research Problem. During the past several years, in a variety of application domains, researchers in machine learning, computational learning theory, pattern recognition, and statistics have combined efforts to learn how to create and combine an ensemble of classifiers. The primary goal of combining several classifiers is to obtain a more accurate prediction than any single classifier alone can provide.

7 Approaches to Integrate Multiple Classifiers. The approaches form a hierarchy: Selection, which is either Global (Static) or Local (Dynamic), and Combination, which is either Global (Voting-Type) or Local (a "Virtual" Classifier obtained by Decontextualization).

8 Classification Problem. Given: n training pairs (x_i, y_i), with x_i ∈ R^p and y_i ∈ {1, …, J} denoting class membership. Goal: given a new x_0, select a classifier for x_0 and predict its class y_0. Here J is the number of classes, n the number of training observations, and p the number of object features.
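To make the later steps concrete, here is a minimal sketch of this setup in Python, using synthetic data rather than the authors' datasets (all names and values below are illustrative):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p, J = 100, 2, 3                   # observations, features, classes
X = rng.normal(size=(n, p))           # training objects x_i, one per row
y = rng.integers(1, J + 1, size=n)    # class labels y_i in {1, ..., J}
x0 = rng.normal(size=p)               # a new object to classify
```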

9 A Sample (Training) Set

10 Classifiers Used in Example Classifier 1: LDA - Linear Discriminant Analysis; Classifier 2: k-NN - Nearest Neighbour Classification; Classifier 3: DANN - Discriminant Adaptive Nearest Neighbour Classification
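Two of these classifiers have standard scikit-learn implementations; DANN (Hastie and Tibshirani's Discriminant Adaptive Nearest Neighbour) does not, so this hedged sketch uses a second k-NN with a larger neighbourhood as a stand-in:

```python
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier

classifiers = {
    "LDA": LinearDiscriminantAnalysis(),
    "k-NN": KNeighborsClassifier(n_neighbors=5),
    # DANN has no scikit-learn implementation; a wider k-NN stands in here
    "DANN (stand-in)": KNeighborsClassifier(n_neighbors=15),
}
```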

11 A Sliding Exam of Classifiers (Jackknife Method): each training set point is held out in turn, all the classifiers are applied to it, and the correctness of each classification is recorded.
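A sketch of this sliding exam as leave-one-out evaluation, recording a 0/1 error flag per training point and per classifier (continuing the variables from the sketches above):

```python
import numpy as np
from sklearn.base import clone
from sklearn.model_selection import LeaveOneOut

# 0/1 jackknife error flag for every training point, per classifier
errors = {name: np.zeros(n) for name in classifiers}
for train_idx, test_idx in LeaveOneOut().split(X):
    i = test_idx[0]                              # the single held-out point
    for name, clf in classifiers.items():
        fitted = clone(clf).fit(X[train_idx], y[train_idx])
        errors[name][i] = int(fitted.predict(X[i:i + 1])[0] != y[i])
```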

12 A Locality Principle

13 Selecting the Number of Nearest Neighbours. A suitable number l of nearest neighbours should be selected for each training set point; these neighbours are then used to classify new cases related to that point. In the example we used l = max(3, n div 50) for all training set points, where n is the number of cases in the training set. An open question: should an appropriate l value be selected locally?
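The heuristic translates directly to code ("div" is integer division):

```python
def neighbourhood_size(n: int) -> int:
    """l = max(3, n div 50), the slide's heuristic."""
    return max(3, n // 50)

print(neighbourhood_size(270))  # 5 neighbours for the 270-case Heart Disease data
```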

14 Brief Review of Distance Functions According to D. Wilson and T. Martinez (1997)
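One of the heterogeneous distance functions reviewed by Wilson and Martinez is HEOM (the Heterogeneous Euclidean-Overlap Metric); the sketch below follows its definition, with an illustrative function signature (the argument layout is an assumption, and handling of missing values is omitted):

```python
import numpy as np

def heom(a, b, is_nominal, ranges):
    """HEOM distance between attribute vectors a and b.
    is_nominal: boolean mask of nominal attributes;
    ranges: max - min of each numeric attribute over the training set."""
    d = np.empty(len(a))
    for j in range(len(a)):
        if is_nominal[j]:
            d[j] = 0.0 if a[j] == b[j] else 1.0   # overlap metric
        else:
            d[j] = abs(a[j] - b[j]) / ranges[j]   # range-normalised difference
    return np.sqrt(np.sum(d ** 2))
```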

15 Weighting Neighbours

16 Nearest Neighbours’ Weights in the Example
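The exact weighting scheme appears only in the slide figures; as an assumption, the sketch below uses a common inverse-distance weighting so that closer neighbours count more:

```python
import numpy as np

def neighbour_weights(distances):
    """Inverse-distance weights, normalised to sum to 1 (an assumption;
    the presentation's own weighting is shown only graphically)."""
    w = 1.0 / (np.asarray(distances) + 1e-12)   # epsilon guards against d = 0
    return w / w.sum()
```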

17 Selection of a Classifier. In the example, DANN should be selected.
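Putting the pieces together, a hedged sketch of the selection step: the local error of each classifier at the new point is predicted as the weighted average of its jackknife error flags over the l nearest neighbours, and the classifier with the smallest predicted error is selected. This reuses the helpers sketched above; the presentation's exact formulas may differ.

```python
import numpy as np
from sklearn.base import clone

def select_classifier(x0, X, errors, l):
    """Pick the classifier with the smallest predicted local error at x0."""
    d = np.linalg.norm(X - x0, axis=1)     # Euclidean distances to x0
    nn = np.argsort(d)[:l]                 # indices of the l nearest neighbours
    w = neighbour_weights(d[nn])
    predicted = {name: float(w @ e[nn]) for name, e in errors.items()}
    return min(predicted, key=predicted.get)

best = select_classifier(x0, X, errors, neighbourhood_size(n))
y0 = clone(classifiers[best]).fit(X, y).predict(x0.reshape(1, -1))[0]
```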

18 Competence Map of Classifiers

19 Data Preprocessing: Selecting Set of Features

20 Features Used in Dystonia Diagnostics: AF (x1) - attack frequency; AM0 (x2) - the mode, the index of sympathetic tone; dX (x3) - the index of parasympathetic tone; IVR (x4) - the index of autonomous reactance; V (x5) - the velocity of brain blood circulation; GPVR (x6) - the general peripheral blood-vessels' resistance; RP (x7) - the index of brain vessels' resistance.

21 Training Set for Dystonia Diagnostics

22 Visualizing Training Set for the Dystonia Example

23 Evaluation of Classifiers

24 Diagnostics of the Test Vector

25 Experiments with the Heart Disease Database. The database contains 270 instances; each instance has 13 attributes, extracted from a larger set of 75 attributes. The average cross-validation errors for the three base classifiers and the dynamic selection method were: DANN 0.196; k-NN 0.352; LDA 0.156; Dynamic Classifier Selection 0.08.

26 Experiments with the Liver Disorders Database. The database contains 345 instances; each instance has 7 numerical attributes. The average cross-validation errors for the three base classifiers and the dynamic selection method were: DANN 0.333; k-NN 0.365; LDA 0.351; Dynamic Classifier Selection 0.134.
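For the base classifiers, average cross-validation errors of this kind could be estimated as below (a sketch on the synthetic data from earlier; the fold count used by the authors is not stated, so cv=10 is an assumption, and the DCS figures require the full selection procedure sketched above):

```python
from sklearn.model_selection import cross_val_score

for name, clf in classifiers.items():
    acc = cross_val_score(clf, X, y, cv=10).mean()   # 10-fold CV accuracy
    print(f"{name}: average error = {1.0 - acc:.3f}")
```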

27 Experimental Comparison of Three Integration Techniques: Local (Dynamic) Classifier Selection (DCS) is compared with Voting and static Cross-Validation Majority.

28 Conclusion and Future Work. Classifiers can be effectively selected or integrated thanks to the locality principle; the same principle can be used when preprocessing data; the number of nearest neighbours and the choice of distance measure are reasonably decided in each separate case; the difference between classification results obtained in different contexts can be used to improve classification by exploiting possible trends.

