
FUZZY INFERENCE SYSTEM AND LEARNING
08 JULY 2014
DataSense Digicosme | CORNEZ Laurence

PLAN

I. Brief introduction to fuzzy logic and fuzzy inference systems (FIS)
II. Real context and database
III. Fuzzy rules and Sugeno's classifier
IV. Implementation in three steps
V. Visualization of the FIS obtained
VI. Perspectives in terms of intelligibility and performance

Fuzzy logic and applications

In 1965, Zadeh proposed the fuzzy concept: one object can belong to two different classes simultaneously. This makes it possible to handle the imperfections of natural language as well as the imprecision and uncertainty of data. Applications (since 1974): washing machines, ABS, autofocus cameras…
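Zadeh's idea of simultaneous partial membership can be sketched with a triangular membership function. This is a minimal illustration; the breakpoints below are hypothetical, not taken from the slides:

```python
def tri_membership(x, a, b, c):
    """Triangular membership: 0 at a, rising to 1 at b, back to 0 at c."""
    if x <= a or x >= c:
        return 0.0
    if x <= b:
        return (x - a) / (b - a)
    return (c - x) / (c - b)

# A temperature of 18 degrees is partly "low" and partly "medium" at once:
low = tri_membership(18.0, 10.0, 15.0, 20.0)     # hypothetical breakpoints
medium = tri_membership(18.0, 15.0, 20.0, 25.0)  # hypothetical breakpoints
```

With these breakpoints the reading belongs to both classes at once (degrees 0.4 and 0.6), which is exactly the departure from bivalent logic the slide describes.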

Fuzzy expert systems

Goal: reproduce the cognitive reasoning of an expert with three parts:
- Rule base: the expert's knowledge expressed as "if-then" inference rules, either stated directly by the expert or learnt from databases
- Inputs
- Inference engine, able to combine these rules and these inputs to produce specific outputs

[Figure: diagram of an inference engine mapping inputs to outputs.]

Fuzzy inference system: example

Rule base (Jang, 1997):
- IF temperature = low THEN cooling valve = half open
- IF temperature = medium THEN cooling valve = almost open

[Figure: an input of 18° fires the "low" and "medium" membership functions with degrees 0.2 and 0.5; the inference engine combines the "half open" and "almost open" output sets into a valve opening of about 70%.]

WORK POSITION / DEFINITION

Seismicity map (metropolitan France)

Event types: earthquakes, marine explosions, quarry blasts, rock bursts.
How can a new event be classified automatically, with good interpretability for the expert?

Database studied

French metropolitan seismic data from 1997 to 2003.

Inputs (high-level features):
- Hour: circular variable [0; 24]
- Latitude: quantitative variable [42; 51]
- Longitude: quantitative variable [-5; 9]
- Magnitude: quantitative variable [0.7; 6.0]
- Date: qualitative variable with 3 modalities {working day, Saturday, Sunday and bank holiday}

Classification output (3 possible classes):
- Earthquakes (9349 events)
- Quarry blasts (3485 events)
- Rock bursts (1075 events)
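The Hour feature is circular: 23h and 1h are two hours apart, not twenty-two. A standard way to respect this (a sketch, not necessarily the encoding used in the original work) is to map the hour onto the unit circle:

```python
import math

def encode_hour(h):
    """Map the circular hour-of-day [0, 24) onto the unit circle, so that
    times just before and just after midnight end up close together."""
    angle = 2 * math.pi * h / 24.0
    return math.cos(angle), math.sin(angle)

def circular_distance(h1, h2):
    """Euclidean distance between two hours after circular encoding."""
    x1, y1 = encode_hour(h1)
    x2, y2 = encode_hour(h2)
    return math.hypot(x1 - x2, y1 - y2)
```

Under this encoding 23h and 1h are exactly as far apart as 11h and 13h, which a raw [0; 24] coordinate would badly misrepresent.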

Model proposed

Aggregation of rules (Sugeno order 0):
- If magnitude is middle and event is nocturnal, then event is earthquake
- If magnitude is high, then event is surely earthquake

How can these rules be generated automatically?

[Figure: an example of the input space.]

The (normalized) Sugeno classifier is defined as

\hat{y}(x) = \frac{\sum_k w_k \, \mu_k(x) \, y_k}{\sum_k w_k \, \mu_k(x)}

where w_k is the weight of rule k, \mu_k(x) the membership degree of x to rule k, and y_k the conclusion of rule k (a unit vector).
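The normalized Sugeno order-0 classifier above can be sketched directly. The Gaussian premises and the two one-dimensional rules below are hypothetical stand-ins for the learnt rule base:

```python
import math

def gaussian_membership(x, center, sigma):
    """Membership degree of x to a rule with a Gaussian premise."""
    return math.exp(-sum((xi - ci) ** 2 / (2 * si ** 2)
                         for xi, ci, si in zip(x, center, sigma)))

def sugeno0_classify(x, rules, n_classes):
    """Normalized Sugeno order-0 classifier: each rule k has a weight w_k,
    a Gaussian premise (center, sigma) and a class label; the output is
    the weighted average of the rules' unit-vector conclusions y_k."""
    scores = [0.0] * n_classes
    total = 0.0
    for w, center, sigma, label in rules:
        mu = w * gaussian_membership(x, center, sigma)
        scores[label] += mu   # y_k is a unit vector on class `label`
        total += mu
    return [s / total for s in scores]

# Two hypothetical rules in a 1-D feature space (magnitude):
rules = [(1.0, (2.0,), (0.5,), 0),   # middle magnitude -> class 0
         (1.0, (5.0,), (0.5,), 1)]   # high magnitude   -> class 1
probs = sugeno0_classify((4.9,), rules, 2)
```

The normalization makes the output a probability-like vector over classes, which is what supports the "winner takes all" decision used in the results.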

MODEL IMPLEMENTATION DataSense Digicosme | CORNEZ Laurence | PAGE 10

Model implementation: first step (1/2)

- Soft clustering: modelling the class densities with a Gaussian mixture
- Mountain clustering (Chiu, 1994): the algorithm learns both the number of Gaussians and the location of the Gaussian centers

[Figure: clusters in the (magnitude, hour) plane.]
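The key property of this step is that the number of clusters is learnt, not fixed in advance. A simplified one-dimensional sketch in the spirit of Chiu's (1994) subtractive/mountain clustering, with the radius and stopping parameters chosen for illustration:

```python
import math

def subtractive_centers(points, ra=1.0, stop_ratio=0.15):
    """Simplified subtractive clustering: each point gets a 'potential'
    from its neighbours; the highest-potential point becomes a center,
    that neighbourhood's potential is subtracted, and the process repeats
    until the remaining peak is small -- so the cluster count is learnt."""
    alpha = 4.0 / ra ** 2
    beta = alpha / 2.25          # Chiu's choice rb = 1.5 * ra
    pot = [sum(math.exp(-alpha * (p - q) ** 2) for q in points)
           for p in points]
    first_peak, centers = max(pot), []
    while True:
        i = max(range(len(points)), key=pot.__getitem__)
        if pot[i] < stop_ratio * first_peak:
            break
        c = points[i]
        centers.append(c)
        # Subtract the chosen center's influence from every potential:
        pot = [p - pot[i] * math.exp(-beta * (q - c) ** 2)
               for p, q in zip(pot, points)]
    return centers

# Two well-separated 1-D blobs should yield two centers:
data = [0.0, 0.1, 0.2, 5.0, 5.1, 5.2]
centers = subtractive_centers(data, ra=1.0)
```

On the two blobs above the procedure stops by itself after finding two centers, one inside each blob.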

Model implementation: first step (2/2)

Results:
- Good classification rate with the "winner takes all" method
- 5-fold cross-validation databases

Method                                 Learning rate (%)   Test rate (%)   Cluster number
5 quantitative (quali. var. misused)   /                   /               ; 36; 35; 38; 37
4 quantitative                         /                   /               ; 26; 28; 28; 29

What about the qualitative variable? Similar good classification rates, with fewer clusters.

Model implementation: second step (1/2)

Probability estimation of each modality for each cluster, and the associated Sugeno classifier.

Model implementation: second step (2/2)

Results after step II: good classification rates, but not significantly improved.

Method                 Learning rate (%)   Test rate (%)
4 quanti. + 1 quali.   /                   /

[Figure: well classified vs. ill classified points; one cluster; semi-optimal regions (cluster juxtapositions) and non-optimal regions (absence of clusters).]

Model implementation: third step (1/2)

- Improvement of the parameters with EM, "Expectation-Maximization" (Jordan and Jacobs, 1993)
- The input space is virtually augmented by adding a hidden variable: the cluster of interest
- EM guarantees improvement after each step
- Computation of the new parameters θ = {weights, centers and standard deviations}
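The EM refinement can be sketched for a one-dimensional Gaussian mixture; this is a generic illustration of the E-step/M-step loop, not the exact update used on the seismic model:

```python
import math

def em_step(data, weights, means, sigmas):
    """One EM iteration for a 1-D Gaussian mixture: the E-step computes
    each point's responsibilities toward the hidden cluster variable; the
    M-step re-estimates weights, centers and standard deviations."""
    # E-step: responsibilities resp[i][k]
    resp = []
    for x in data:
        dens = [w / (s * math.sqrt(2 * math.pi)) *
                math.exp(-(x - m) ** 2 / (2 * s ** 2))
                for w, m, s in zip(weights, means, sigmas)]
        tot = sum(dens)
        resp.append([d / tot for d in dens])
    # M-step: re-estimate parameters from the responsibilities
    n, K = len(data), len(means)
    nk = [sum(r[k] for r in resp) for k in range(K)]
    weights = [nk[k] / n for k in range(K)]
    means = [sum(r[k] * x for r, x in zip(resp, data)) / nk[k]
             for k in range(K)]
    sigmas = [max(1e-6, math.sqrt(sum(r[k] * (x - means[k]) ** 2
                                      for r, x in zip(resp, data)) / nk[k]))
              for k in range(K)]
    return weights, means, sigmas

def log_likelihood(data, weights, means, sigmas):
    return sum(math.log(sum(w / (s * math.sqrt(2 * math.pi)) *
                            math.exp(-(x - m) ** 2 / (2 * s ** 2))
                            for w, m, s in zip(weights, means, sigmas)))
               for x in data)

data = [0.0, 0.2, 0.4, 4.0, 4.2, 4.4]
params = ([0.5, 0.5], [1.0, 3.0], [1.0, 1.0])
ll_before = log_likelihood(data, *params)
params = em_step(data, *params)
ll_after = log_likelihood(data, *params)
```

The guarantee mentioned on the slide shows up directly: the log-likelihood never decreases from one iteration to the next.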

Model implementation: third step (2/2)

Results after step III:
- 50 iterations of EM
- The same 5-fold cross-validation database

Method               Learning rate (%)   Test rate (%)
4 quant. + 1 qual.   /                   /

Improvement of the good classification rate; significant improvement of the cluster locations.

Visualization

[Figure: for one example, each rule evaluates its Gaussians and estimated modality probabilities, combines them by product with the rule weight, and the rule outputs are summed; the example is classified as EQ with estimated probabilities [92.70%, 0.00%, 7.32%].]

PERSPECTIVES IN TERMS OF INTELLIGIBILITY AND PERFORMANCE

Comparison with previous works

- 1998: S. Muller, fuzzy controller, MLP coding: 92.5% well classified
- 1999: F. Gravot, FIS (mixture of Gaussians, gradient-based descent): 90.5% well classified
- 2005: R. Quach and D. Mercier, fuzzy controller, MLP coding: 95.9% well classified; SVM: 96.5% well classified
- 2006: L. Cornez, DT: 94.88% well classified; fuzzy DT: 95.19% well classified
- 2007: L. Cornez, FIS in 3 steps: 95.19% well classified

[Figure: intelligibility vs. performance chart placing cNF+RN (92.5%), cNF+MLP/SVM (~96%), FIS (90.5%), DT (94.88%) and the 3-step FIS (95.19%) relative to the objective.]

How to improve intelligibility?

Depending on the cross-validation fold, the coverage of the input space differs.

Improve intelligibility and stability

- The more stable the model, and the better it fits the expert's cognitive representation, the more readily the expert can accept it
- Generative Gaussian Graphs (M. Aupetit) to identify complex clusters

THANKS! QUESTIONS?

CEA Tech, Département Métrologie, Instrumentation et Information, Laboratoire d'Analyse de Données et Intelligence des Systèmes
Commissariat à l'énergie atomique et aux énergies alternatives | Institut Carnot CEA LIST | Centre de Saclay, Gif-sur-Yvette Cedex

Best fuzzy decision tree