Fuzzy Inference Systems. Review of fuzzy if-then models.


Fuzzy Inference Systems

Review: fuzzy if-then models.

Basic Configuration of a Fuzzy Logic System (figure): Input → Fuzzification → Inferencing → Defuzzification → Output, with Error = Target − Output.

Types of Rules

Mamdani-Assilian model:
R1: If x is A1 and y is B1 then z is C1
R2: If x is A2 and y is B2 then z is C2
Ai, Bi and Ci are fuzzy sets defined on the universes of x, y and z respectively.

Takagi-Sugeno model:
R1: If x is A1 and y is B1 then z = f1(x, y)
R2: If x is A2 and y is B2 then z = f2(x, y)
For example: fi(x, y) = ai·x + bi·y + ci

Types of Rules (figure): Mamdani-Assilian model vs. Takagi-Sugeno model.

Mamdani Fuzzy Models

The Reasoning Scheme. Both antecedent and consequent are fuzzy.

Mamdani inference: worked example (figure)
Rule 1: IF FeO is high AND SiO2 is low AND Granite is proximal AND Fault is proximal, THEN metal is high
Rule 2: IF FeO is average AND SiO2 is high AND Granite is intermediate AND Fault is proximal, THEN metal is average
Rule 3: IF FeO is low AND SiO2 is high AND Granite is distal AND Fault is distal, THEN metal is low
Inputs: FeO = 60%, SiO2 = 60%, Granite = 5 km, Fault = 1 km. Metal = ?
(The figure shows the membership functions on the FeO, SiO2, Granite, Fault and metal-tonnage axes, and the implication step that clips each rule's output set by its firing strength.)

Defuzzifier
Since the consequent is fuzzy, it has to be defuzzified. The defuzzifier converts the fuzzy output of the inference engine to a crisp value, using membership functions analogous to the ones used by the fuzzifier. Five commonly used defuzzification methods (sketched in code below):
– Centroid of area (COA)
– Bisector of area (BOA)
– Mean of maximum (MOM)
– Smallest of maximum (SOM)
– Largest of maximum (LOM)
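The slides only name these methods, so here is a minimal Python sketch of all five on a sampled output universe, assuming the aggregated output membership values are already available as a NumPy array (the function name, the example universe and the membership values are illustrative, not taken from the slides):

import numpy as np

def defuzzify(z, mu, method="coa"):
    """Defuzzify a sampled output fuzzy set.
    z  : 1-D array of points in the output universe
    mu : membership degree of the aggregated output set at each point"""
    if method == "coa":                      # centroid of area
        return np.sum(mu * z) / np.sum(mu)
    if method == "boa":                      # bisector of area
        half = np.cumsum(mu) >= np.sum(mu) / 2.0
        return z[np.argmax(half)]            # first point where half the area is reached
    peak = z[mu == mu.max()]                 # points where the output set is maximal
    if method == "mom":                      # mean of maximum
        return peak.mean()
    if method == "som":                      # smallest of maximum
        return peak.min()
    if method == "lom":                      # largest of maximum
        return peak.max()
    raise ValueError(method)

# illustrative aggregated output set on a 0-1000 t metal-tonnage universe
z = np.linspace(0, 1000, 201)
mu = np.maximum(0.7 * np.clip(1 - np.abs(z - 100) / 200, 0, 1),
                0.3 * np.clip(1 - np.abs(z - 400) / 300, 0, 1))
for m in ("coa", "boa", "mom", "som", "lom"):
    print(m, round(defuzzify(z, mu, m), 1))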

Defuzzifier

Aggregation and defuzzification (figure): the clipped output sets of Rule 1, Rule 2 and Rule 3 are aggregated (max) into a single fuzzy set, which is then defuzzified by finding its centroid, giving about 125 tonnes of metal. Formula for the centroid of area: z* = Σ μ(z)·z / Σ μ(z) (or the corresponding integrals in the continuous case).
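Below is a short Python sketch of the whole Mamdani step: clip each rule's consequent set by its firing strength (min implication, the usual Mamdani choice), aggregate with max, and take the centroid. The consequent membership functions and the firing strengths are made-up placeholders; the slide's actual shapes live in its figure.

import numpy as np

z = np.linspace(0, 1000, 501)                        # metal-tonnage universe

def tri(z, a, b, c):
    """Triangular membership function with feet a, c and peak b."""
    return np.maximum(np.minimum((z - a) / (b - a), (c - z) / (c - b)), 0)

consequents = [tri(z, 400, 700, 1000),               # "metal is high"
               tri(z, 100, 300, 500),                # "metal is average"
               tri(z, 0, 50, 200)]                   # "metal is low"
firing = [0.2, 0.5, 0.7]                             # rule firing strengths (illustrative)

clipped = [np.minimum(mu, s) for mu, s in zip(consequents, firing)]   # min implication
aggregated = np.maximum.reduce(clipped)              # max aggregation
centroid = np.sum(aggregated * z) / np.sum(aggregated)
print(f"defuzzified metal estimate: {centroid:.0f} t")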

Sugeno Fuzzy Models. Also known as the TSK fuzzy model (Takagi, Sugeno & Kang, 1985).

Fuzzy Rules of the TSK Model
If x is A and y is B then z = f(x, y)
Here A and B are fuzzy sets (the antecedent is fuzzy) and f(x, y) is a crisp function (the consequent is crisp). f(x, y) is very often a polynomial in x and y; the order of a Takagi-Sugeno fuzzy inference system equals the order of the polynomial used.

The Reasoning Scheme

Examples
R1: if X is small and Y is small then z = −x + y + 1
R2: if X is small and Y is large then z = −y + 3
R3: if X is large and Y is small then z = −x + 3
R4: if X is large and Y is large then z = x + y + 2

TAKAGI-SUGENO SYSTEM
1. IF x is f1x(x) AND y is f1y(y) THEN z1 = p10 + p11·x + p12·y
2. IF x is f2x(x) AND y is f1y(y) THEN z2 = p20 + p21·x + p22·y
3. IF x is f1x(x) AND y is f2y(y) THEN z3 = p30 + p31·x + p32·y
4. IF x is f2x(x) AND y is f2y(y) THEN z4 = p40 + p41·x + p42·y
The firing strength (the output of the IF part) of each rule is:
s1 = f1x(x) AND f1y(y)
s2 = f2x(x) AND f1y(y)
s3 = f1x(x) AND f2y(y)
s4 = f2x(x) AND f2y(y)
Output of each rule (firing strength × consequent function):
o1 = s1·z1, o2 = s2·z2, o3 = s3·z3, o4 = s4·z4
Overall output of the fuzzy inference system:
z = (o1 + o2 + o3 + o4) / (s1 + s2 + s3 + s4)
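A compact Python sketch of this four-rule Takagi-Sugeno computation, assuming Gaussian membership functions for the SMALL/LARGE terms and the product as the AND operator; the membership centres and spreads are placeholders, and the consequent coefficients mirror the R1-R4 example from the earlier slide (re-ordered to this slide's rule numbering):

import numpy as np

def gauss(v, c, sigma):
    """Gaussian membership function with centre c and spread sigma."""
    return np.exp(-0.5 * ((v - c) / sigma) ** 2)

def tsk_output(x, y):
    # antecedent memberships (placeholder centres/spreads)
    f1x, f2x = gauss(x, 2.0, 1.5), gauss(x, 8.0, 1.5)   # x is SMALL / LARGE
    f1y, f2y = gauss(y, 2.0, 1.5), gauss(y, 8.0, 1.5)   # y is SMALL / LARGE

    # firing strengths s_k (product as the AND operator), one per rule
    s = np.array([f1x * f1y, f2x * f1y, f1x * f2y, f2x * f2y])

    # consequent functions z_k = p_k0 + p_k1*x + p_k2*y
    p = np.array([[1.0, -1.0,  1.0],    # rule 1: z = -x + y + 1
                  [3.0, -1.0,  0.0],    # rule 2: z = -x + 3
                  [3.0,  0.0, -1.0],    # rule 3: z = -y + 3
                  [2.0,  1.0,  1.0]])   # rule 4: z =  x + y + 2
    zk = p[:, 0] + p[:, 1] * x + p[:, 2] * y

    # overall output: firing-strength-weighted average of the rule outputs
    return np.sum(s * zk) / np.sum(s)

print(tsk_output(3.0, 7.0))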

Sugeno system
Rule 1: IF FeO is high AND SiO2 is low AND Granite is proximal AND Fault is proximal, THEN Gold = p1(FeO%) + q1(SiO2%) + r1(Distance2Granite) + s1(Distance2Fault) + t1
Rule 2: IF FeO is average AND SiO2 is high AND Granite is intermediate AND Fault is proximal, THEN Gold = p2(FeO%) + q2(SiO2%) + r2(Distance2Granite) + s2(Distance2Fault) + t2
Rule 3: IF FeO is low AND SiO2 is high AND Granite is distal AND Fault is distal, THEN Gold = p3(FeO%) + q3(SiO2%) + r3(Distance2Granite) + s3(Distance2Fault) + t3

Sugeno system: worked example (figure)
Rule 1: Gold(R1) = p1(FeO%) + q1(SiO2%) + r1(Distance2Granite) + s1(Distance2Fault) + t1, with firing strength s1
Rule 2: Gold(R2) = p2(FeO%) + q2(SiO2%) + r2(Distance2Granite) + s2(Distance2Fault) + t2, with firing strength s2
Rule 3: Gold(R3) = p3(FeO%) + q3(SiO2%) + r3(Distance2Granite) + s3(Distance2Fault) + t3, with firing strength s3
Inputs: FeO = 60%, SiO2 = 60%, Granite = 5 km, Fault = 1 km. Metal = ?
(The figure shows the membership functions on the FeO, SiO2, Granite and Fault axes; the antecedent memberships are combined, e.g. by product, to give each rule's firing strength.)

Sugeno system: Output
Gold(R1) = p1(FeO%) + q1(SiO2%) + r1(Distance2Granite) + s1(Distance2Fault) + t1, firing strength s1
Gold(R2) = p2(FeO%) + q2(SiO2%) + r2(Distance2Granite) + s2(Distance2Fault) + t2, firing strength s2
Gold(R3) = p3(FeO%) + q3(SiO2%) + r3(Distance2Granite) + s3(Distance2Fault) + t3, firing strength s3
Each rule output is weighted by its firing strength; the overall output is the weighted average Gold = [s1·Gold(R1) + s2·Gold(R2) + s3·Gold(R3)] / (s1 + s2 + s3).

A neural fuzzy system
Implements an FIS in the framework of neural networks (figure: a layered network with fuzzification nodes, antecedent nodes and output nodes, operating on inputs x and y).

Fuzzification Nodes
These represent the term sets of the features. If we have two features x and y, and two linguistic values (say BIG and SMALL) defined on each of them, then we have 4 fuzzification nodes. We use Gaussian membership functions for fuzzification because they are differentiable; triangular and trapezoidal membership functions are not.

Fuzzification Nodes (contd.)
The centre μ and spread σ are the two free parameters of the Gaussian membership functions, and they need to be determined. How to determine μ and σ? Two strategies:
1) Keep μ and σ fixed.
2) Update μ and σ through a tuning algorithm.
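As an illustration of the second strategy, here is one fuzzification node as a small Python class with Gaussian parameters mu (centre) and sigma (spread), plus the partial derivatives a tuning algorithm would need; the class name and parameter values are illustrative, not from the slides:

import numpy as np

class GaussianFuzzifier:
    """One fuzzification node: a Gaussian membership function with free
    parameters mu (centre) and sigma (spread)."""

    def __init__(self, mu, sigma):
        self.mu, self.sigma = mu, sigma

    def __call__(self, x):
        return np.exp(-0.5 * ((x - self.mu) / self.sigma) ** 2)

    def grad(self, x):
        """Partial derivatives of the membership value w.r.t. mu and sigma,
        used when the parameters are updated by a tuning algorithm."""
        m = self(x)
        d_mu = m * (x - self.mu) / self.sigma ** 2
        d_sigma = m * (x - self.mu) ** 2 / self.sigma ** 3
        return d_mu, d_sigma

x_small = GaussianFuzzifier(mu=2.0, sigma=1.5)   # membership of "x is SMALL"
print(x_small(3.0), x_small.grad(3.0))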

Consequent Nodes
p, q and k are the three free parameters of the consequent polynomial (z = p·x + q·y + k). How to determine p, q and k? Two strategies:
1) Keep them fixed.
2) Update them through a tuning algorithm.

Network structure (figure):
– Fuzzification nodes: inputs x and y, each with BIG and SMALL memberships μx1, μx2, μy1, μy2.
– Antecedent nodes: one per rule (e.g. "If x is SMALL and y is SMALL"), with firing strengths w1, w2, w3, w4.
– Consequent nodes: one per rule (e.g. z4 = p4·x + q4·y + k4), giving z1, z2, z3, z4.
– Output node: O = (w1·z1 + w2·z2 + w3·z3 + w4·z4) / (w1 + w2 + w3 + w4).
– Given a target t, Error = ½(t − O)².

ANFIS Architecture (figure). Squares: adaptive nodes; circles: fixed nodes.

ANFIS Architecture: Layer 1 (Adaptive)
Contains adaptive nodes, each with a Gaussian membership function.
Number of nodes = number of variables × number of linguistic values. In the previous example there are 4 nodes (2 variables × 2 linguistic values each).
Two parameters are estimated per node: the mean (centre) and standard deviation (spread). These are called the premise parameters.
Number of premise parameters = 2 × number of nodes = 8 in the example.

ANFIS Architecture: Layer 2 (Fixed)
Contains fixed nodes, each applying the product operator (a T-norm) and returning the firing strength of the corresponding if-then rule. The firing strength can be normalized; in ANFIS, each node returns a normalized firing strength. Fixed nodes: no parameters to be estimated.

ANFIS Architecture: Layer 3 (Adaptive)
Each node contains an adaptive polynomial and returns the output of one fuzzy if-then rule. Number of nodes = number of if-then rules. The polynomial coefficients p are called the consequent parameters.

ANFIS Architecture: Layer 4 (Fixed)
A single node in this layer sums up the outputs of the nodes in the previous layer. No parameters to be estimated.

ANFIS Training
The overall output is linear in the consequent parameters pki if the premise parameters, and therefore the firing strengths sk of the fuzzy if-then rules, are fixed. ANFIS uses a hybrid learning procedure (Jang and Sun, 1995) to estimate the premise and consequent parameters. The hybrid learning procedure estimates the consequent parameters (keeping the premise parameters fixed) in a forward pass and the premise parameters (keeping the consequent parameters fixed) in a backward pass.

ANFIS Training
(Figure: squares are adaptive nodes, circles are fixed nodes.)
Forward pass: propagate information forward up to Layer 3 and estimate the consequent parameters by the least-squares estimator.
Backward pass: propagate the error signals backwards and update the premise parameters by gradient descent.
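The slide only names the two passes, so here is a rough Python sketch of one epoch of the hybrid procedure for the earlier 2-input, 4-rule example: a least-squares solve for the consequent parameters in the forward pass, and a gradient step on the Gaussian premise parameters in the backward pass. A forward-difference numerical gradient is used as a simple stand-in for the analytic backpropagation the slides describe; all names, data and learning-rate values are illustrative:

import numpy as np

def gauss(v, mu, sigma):
    return np.exp(-0.5 * ((v - mu) / sigma) ** 2)

def norm_firing(x, y, prem):
    """Normalized firing strengths of the 4 rules (product AND).
    prem holds (mu, sigma) for x-SMALL, x-LARGE, y-SMALL, y-LARGE."""
    mx = [gauss(x, *prem[0]), gauss(x, *prem[1])]
    my = [gauss(y, *prem[2]), gauss(y, *prem[3])]
    w = np.array([mx[i] * my[j] for i in range(2) for j in range(2)])
    return w / w.sum()

def predict(x, y, prem, cons):
    wn = norm_firing(x, y, prem)
    z = cons[:, 0] * x + cons[:, 1] * y + cons[:, 2]     # z_k = p_k*x + q_k*y + r_k
    return np.sum(wn * z)

def hybrid_epoch(X, Y, T, prem, cons, lr=0.01, eps=1e-5):
    # forward pass: the output is linear in the consequent parameters,
    # so solve for them by least squares with the premise parameters fixed
    rows = []
    for x, y in zip(X, Y):
        wn = norm_firing(x, y, prem)
        rows.append(np.concatenate([wn * x, wn * y, wn]))  # row: [wn_k*x | wn_k*y | wn_k]
    theta, *_ = np.linalg.lstsq(np.array(rows), np.asarray(T, float), rcond=None)
    cons = theta.reshape(3, 4).T                           # back to per-rule [p, q, r]

    # backward pass: gradient-descent step on the premise parameters
    def sse(pvec):
        pr = pvec.reshape(4, 2)
        return sum(0.5 * (t - predict(x, y, pr, cons)) ** 2
                   for x, y, t in zip(X, Y, T))
    p0 = np.asarray(prem, dtype=float).ravel()
    grad = np.array([(sse(p0 + eps * np.eye(p0.size)[i]) - sse(p0)) / eps
                     for i in range(p0.size)])             # numerical gradient stand-in
    return (p0 - lr * grad).reshape(4, 2), cons

# illustrative usage on a synthetic target
rng = np.random.default_rng(0)
X, Y = rng.uniform(0, 10, 50), rng.uniform(0, 10, 50)
T = -X + Y + 1 + rng.normal(0, 0.1, 50)
prem = [(2.0, 1.5), (8.0, 1.5), (2.0, 1.5), (8.0, 1.5)]
cons = np.zeros((4, 3))
for _ in range(20):
    prem, cons = hybrid_epoch(X, Y, T, prem, cons)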

ANFIS Training: Least-Squares Estimation
1. The data are assembled in the form (xn, yn).
2. We assume a linear relation between x and y: y = a·x + b.
3. This can be extended to n dimensions: y = a1·x1 + a2·x2 + a3·x3 + … + b.
The problem: find the values of the coefficients ai (and b) such that the linear combination best fits the data.

ANFIS Training: Least-Squares Estimation
Given data {(x1, y1), …, (xN, yN)}, we may define the error associated with saying y = a·x + b as
E(a, b) = Σn [yn − (a·xn + b)]².
This is just N times the variance of the residuals {y1 − (a·x1 + b), …, yN − (a·xN + b)}. The goal is to find the values of a and b that minimize this error; in other words, set the partial derivatives of the error with respect to a and b to zero.

ANFIS Training: Least-Squares Estimation
Setting the partial derivatives to zero gives:
∂E/∂a = −2 Σn xn·[yn − (a·xn + b)] = 0
∂E/∂b = −2 Σn [yn − (a·xn + b)] = 0
We may rewrite them as:
a·Σ xn² + b·Σ xn = Σ xn·yn
a·Σ xn + b·N = Σ yn
The values of a and b which minimize the error therefore satisfy the matrix equation
[ Σ xn²  Σ xn ; Σ xn  N ] · [ a ; b ] = [ Σ xn·yn ; Σ yn ]
Hence a and b are estimated by solving this 2×2 system (inverting the matrix).
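A small Python check of this closed-form solution, solving the 2×2 normal equations directly (the data here are illustrative, not the table from the next slide):

import numpy as np

def least_squares_line(x, y):
    """Least-squares estimates of a and b in y = a*x + b via the normal equations."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    A = np.array([[np.sum(x * x), np.sum(x)],
                  [np.sum(x),     len(x)  ]])
    rhs = np.array([np.sum(x * y), np.sum(y)])
    a, b = np.linalg.solve(A, rhs)
    return a, b

x = [1, 2, 3, 4, 5]                 # illustrative data
y = [2.1, 4.2, 5.8, 8.1, 9.9]
a, b = least_squares_line(x, y)
print(f"y = {a:.3f}x + {b:.3f}")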

ANFIS Training: Least-Squares Estimation
Worked example: for the data given on the slide, find the least-squares estimator. (Table with columns S.No, X, Y, X², XY and a TOTAL row; the numerical entries are not reproduced in this transcript.)

ANFIS Training: Least-Squares Estimation (continued)

ANFIS Training: Gradient Descent
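The content of this slide is not captured in the transcript, so the following is only a hedged sketch of what one gradient-descent step on the premise parameters of a single Gaussian membership node could look like, assuming the derivative of the error with respect to that node's membership value has already been backpropagated to it (all names and values are illustrative):

import numpy as np

def gauss(x, mu, sigma):
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2)

def premise_step(x, mu, sigma, dE_dmembership, lr=0.01):
    """One gradient-descent update of (mu, sigma) for a Gaussian membership node."""
    m = gauss(x, mu, sigma)
    dm_dmu = m * (x - mu) / sigma ** 2           # chain rule through the Gaussian
    dm_dsigma = m * (x - mu) ** 2 / sigma ** 3
    mu -= lr * dE_dmembership * dm_dmu           # descend along the error gradient
    sigma -= lr * dE_dmembership * dm_dsigma
    return mu, sigma

print(premise_step(x=3.0, mu=2.0, sigma=1.5, dE_dmembership=0.4))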