Neuro-fuzzy Systems. Xinbo Gao, School of Electronic Engineering, Xidian University, October 2004.


Introduction
Neuro-fuzzy systems are soft computing methods that combine neural networks and fuzzy concepts in various ways. The ANN part plays the role of a nervous system: low-level perception and signal integration. The fuzzy part represents the emergent "higher-level" reasoning aspects.

Introduction
Two complementary directions: the "fuzzification" of neural networks, and the endowing of fuzzy systems with neural learning features.

Introduction
Ways of combining the two techniques:
- Co-operative: a neural algorithm adapts the fuzzy system. Off-line: adaptation before use. On-line: the algorithm adapts the system as it operates.
- Concurrent: the two techniques are applied one after another, as pre- or post-processing.
- Hybrid: the fuzzy system is represented as a network structure, making it possible to take advantage of learning algorithms inherited from ANNs.

Fuzzy Neural Networks
Introduction of fuzzy concepts into artificial neurons and neural networks. For example, while neural networks are good at recognizing patterns, they are not good at explaining how they reach their decisions. Fuzzy logic systems, which can reason with imprecise information, are good at explaining their decisions, but they cannot automatically acquire the rules they use to make those decisions. These limitations have been a central driving force behind the creation of intelligent hybrid systems, where two or more techniques are combined in a manner that overcomes the limitations of the individual techniques.

Fuzzy Neurons
A fuzzy model of an artificial neuron can be constructed by using fuzzy operations at the single-neuron level:
x = (x_1, x_2, …, x_n), w = (w_1, w_2, …, w_n), y = g(w·x)

Fuzzy Neurons
Instead of the weighted sum of inputs y = g(w·x), a more general aggregation function is used: y = g(A(w, x)). Fuzzy union, fuzzy intersection and, more generally, s-norms and t-norms can be used as the aggregation function for the weighted inputs of an artificial neuron.
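As an illustrative sketch (not from the slides), the aggregation y = g(A(w, x)) can be coded with the algebraic product as the t-norm and the probabilistic sum as the s-norm; the particular norms and numbers here are only assumptions for demonstration.

```python
# A minimal sketch of a fuzzy neuron y = g(A(w, x)). The aggregation A
# pairs each input with its weight via a t-norm and merges the results
# with an s-norm. Here: algebraic product as t-norm, probabilistic sum
# as s-norm, identity transfer function g. All choices are illustrative.

def t_norm(a, b):        # algebraic product
    return a * b

def s_norm(a, b):        # probabilistic sum
    return a + b - a * b

def fuzzy_neuron(x, w, g=lambda a: a):
    agg = 0.0                       # neutral element of the s-norm
    for xi, wi in zip(x, w):
        agg = s_norm(agg, t_norm(xi, wi))
    return g(agg)

print(fuzzy_neuron((0.5, 0.5), (1.0, 1.0)))  # 0.5 + 0.5 - 0.25 = 0.75
```

Swapping in min/max for the two norms recovers the OR-style logic neuron of the next slides.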

OR Fuzzy Neuron
y = OR(x_1 AND w_1, x_2 AND w_2, …, x_n AND w_n), with OR: [0,1] × [0,1]^n → [0,1]
The transfer function g is linear. If w_k = 0 then w_k AND x_k = 0 independent of x_k, while if w_k = 1 then w_k AND x_k = x_k.

AND Fuzzy Neuron
y = AND(x_1 OR w_1, x_2 OR w_2, …, x_n OR w_n), with AND: [0,1] × [0,1]^n → [0,1]
In the generalized forms based on t-norms, operators other than min and max can be used, such as the algebraic and bounded products and sums.
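A minimal sketch of the two logic neurons above, assuming the classical min/max choice for AND/OR (the generalized t-norm variants would slot in the same way); the sample inputs are made up.

```python
# The two basic logic neurons over [0,1] inputs, with min as the
# AND (t-norm) and max as the OR (s-norm).

def or_neuron(x, w):
    # y = OR(x_1 AND w_1, ..., x_n AND w_n)
    return max(min(xi, wi) for xi, wi in zip(x, w))

def and_neuron(x, w):
    # y = AND(x_1 OR w_1, ..., x_n OR w_n)
    return min(max(xi, wi) for xi, wi in zip(x, w))

x = (0.3, 0.8)
print(or_neuron(x, (1.0, 0.0)))   # w_1 = 1 passes x_1 through, w_2 = 0 blocks x_2: 0.3
print(and_neuron(x, (0.0, 0.0)))  # zero weights are neutral for AND: min(0.3, 0.8) = 0.3
```

Note the dual weight semantics: for the OR neuron w_k = 1 lets x_k through, while for the AND neuron w_k = 0 is the neutral value.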

Fuzzy Neurons
Both the OR and the AND logic neurons are excitatory in character, i.e. an increase in x_k leads to an increase in y. The issue of inhibitory (negative) weights deserves a short digression: in the realm of fuzzy sets, operations are defined on [0,1]. The proper way to make a weighted input inhibitory is to take the fuzzy complement of the excitatory membership value, x̄ = 1 − x. The input x = (x_1, …, x_n) is then extended to x = (x_1, …, x_n, x̄_1, …, x̄_n).
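The input extension can be sketched in one line; the helper name is hypothetical, not from the slides.

```python
# Inhibitory action via the fuzzy complement: the input vector is
# extended with 1 - x_k, so a large x_k can pull the neuron's output
# down through its complemented copy.

def extend_with_complements(x):
    return tuple(x) + tuple(1.0 - xi for xi in x)

print(extend_with_complements((0.25, 0.5)))  # (0.25, 0.5, 0.75, 0.5)
```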

Fuzzy Neurons
The weighted inputs x_i ∘ w_i, where ∘ is a t-norm or t-conorm, can be general fuzzy relations too, not just the simple products of standard neurons. The transfer function g can be non-linear, e.g. a sigmoid.

OR/AND Fuzzy Neuron
A generalization of the simple fuzzy neurons above, this structure can produce a spectrum of intermediate behaviors that can be modified to suit a given problem. If c_1 = 0 and c_2 = 1 the system reduces to a pure AND neuron; if c_1 = 1 and c_2 = 0 the behavior corresponds to that of a pure OR neuron.
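One plausible reading of the OR/AND neuron is a convex combination of the two pure neurons, weighted by c_1 and c_2; this sketch assumes that interpretation and min/max norms, so treat the details as illustrative rather than the slides' exact construction.

```python
# OR/AND neuron as a c1/c2-weighted blend of a pure OR neuron and a
# pure AND neuron (min as AND, max as OR). c1 = 1, c2 = 0 gives pure OR;
# c1 = 0, c2 = 1 gives pure AND.

def or_and_neuron(x, w, c1, c2):
    and_part = min(max(xi, wi) for xi, wi in zip(x, w))
    or_part = max(min(xi, wi) for xi, wi in zip(x, w))
    return c1 * or_part + c2 * and_part

x, w = (0.3, 0.8), (0.6, 0.4)
print(or_and_neuron(x, w, 0.0, 1.0))  # pure AND behavior: 0.6
print(or_and_neuron(x, w, 1.0, 0.0))  # pure OR behavior: 0.4
```

Intermediate c_1, c_2 values interpolate between the two logical extremes, which is what lets the neuron adapt its character to the problem.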

Multilayered Fuzzy Neural Networks
If we restrict ourselves to the pure two-valued Boolean case, the network represents an arbitrary Boolean function as a sum of minterms. More generally, if the values are continuous membership degrees of a fuzzy set, these networks approximate a certain unknown fuzzy function. A second possibility is to have OR neurons in the hidden layer and a single AND neuron in the output layer.
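The Boolean case can be made concrete with XOR, which needs two minterms; this sketch uses the min/max neurons and complemented inputs from the earlier slides.

```python
# Boolean case: a hidden layer of AND neurons (one per minterm) plus a
# single OR output neuron realizes an arbitrary Boolean function.
# Example: XOR = (x1 AND NOT x2) OR (NOT x1 AND x2), with NOT x = 1 - x.

def xor_network(x1, x2):
    minterm1 = min(x1, 1 - x2)      # AND neuron for x1 * !x2
    minterm2 = min(1 - x1, x2)      # AND neuron for !x1 * x2
    return max(minterm1, minterm2)  # OR output neuron

for a in (0, 1):
    for b in (0, 1):
        print(a, b, xor_network(a, b))  # truth table of XOR
```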

Learning in Fuzzy Neural Networks
Supervised learning in an FNN consists in modifying the connection weights in such a manner that an error measure is progressively reduced; performance should remain acceptable when the network is presented with new data. Given a set of training data pairs (x_k, d_k) for k = 1, 2, …, n:
w_{t+1} = w_t + Δw_t, where the weight change is a given function of the difference between the target response d and the calculated node output y: Δw_t = F(|d_t − y_t|)

Learning in Fuzzy Neural Networks
The mean square error E measures how well the fuzzy network maps input data to the corresponding outputs:
E(w) = ½ Σ_k (d_k − y_k)²
Gradient descent: Δw_ij = −η ∂E/∂w_ij
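A toy sketch of this rule on a single OR fuzzy neuron. Since min/max are not differentiable everywhere, this version uses a numerical gradient of E; the training data, learning rate, and clipping to [0,1] are all illustrative assumptions.

```python
# Gradient descent Delta(w) = -eta * dE/dw on an OR fuzzy neuron,
# with E(w) = 1/2 * sum_k (d_k - y_k)^2 and a numerical gradient.

def or_neuron(x, w):
    return max(min(xi, wi) for xi, wi in zip(x, w))

def mse(w, data):
    return 0.5 * sum((d - or_neuron(x, w)) ** 2 for x, d in data)

def train(w, data, eta=0.5, steps=200, h=1e-5):
    w = list(w)
    for _ in range(steps):
        for i in range(len(w)):
            wp = w[:]
            wp[i] += h
            grad = (mse(wp, data) - mse(w, data)) / h    # numeric dE/dw_i
            w[i] = min(1.0, max(0.0, w[i] - eta * grad))  # keep w in [0,1]
    return w

data = [((1.0, 0.2), 0.9), ((0.1, 1.0), 0.3)]  # made-up (x_k, d_k) pairs
w0 = [0.5, 0.5]
w = train(w0, data)
print(mse(w, data) < mse(w0, data))  # error reduced: True
```

For smooth norms (e.g. algebraic product / probabilistic sum) the analytic gradient could be used instead of the finite difference.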

An Example: NEFPROX
NEFPROX (NEuro Fuzzy function apPROXimator) is a three-layer feedforward network (no cycles in the network, and no connections between layer n and layer n + j, with j > 1). The layers are: input variables, hidden layer (fuzzy rules), output variables. Hidden and output units use t-norms and t-conorms as aggregation functions. The fuzzy sets are encoded as fuzzy connection weights and fuzzy inputs.

NEFPROX
The input units are labelled x_1, …, x_n, the hidden rule units are called R_1, …, R_k, and the output units are denoted y_1, …, y_m. Each connection is weighted with a fuzzy set and labelled with a linguistic term. Connections coming from the same input unit and having the same label are weighted by the same common weight (shared weights); the same holds for connections that lead to the same output unit. There is no pair of rules with identical antecedents.

NEFPROX – learning

A Second Example: The ANFIS System
ANFIS (Adaptive Network-based Fuzzy Inference System) is a neuro-fuzzy system that can identify parameters by using supervised learning methods. It is a Sugeno-type fuzzy system with learning capabilities, based on a first-order model. Nodes have the same function within a given layer but differ from one layer to the next.

ANFIS System

The learning algorithm is a hybrid supervised method based on gradient descent and least squares. Forward phase: the signals travel up to layer 4 and the relevant (consequent) parameters are fitted by least squares. Backward phase: the error signals travel backward and the premise parameters are updated as in backpropagation. ANFIS is available in the MATLAB Fuzzy Logic Toolbox. On the Mackey-Glass prediction benchmark it shows excellent non-linear fitting and generalization, with fewer parameters and a training time comparable to ANN methods.
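To make the layer structure concrete, here is a hedged sketch of a single ANFIS forward pass for a first-order Sugeno model with one input and two rules; every parameter value (membership shapes, consequent coefficients) is made up for illustration.

```python
# ANFIS-style forward pass, first-order Sugeno model, two rules, one input.
# Layer numbering follows the usual ANFIS layout.

def bell(x, a, b, c):
    """Generalized bell membership function."""
    return 1.0 / (1.0 + abs((x - c) / a) ** (2 * b))

def anfis_forward(x):
    # Layer 1: membership degrees (premise parameters a, b, c)
    mu1 = bell(x, a=1.0, b=2.0, c=0.0)   # "small"
    mu2 = bell(x, a=1.0, b=2.0, c=2.0)   # "large"
    # Layer 2: rule firing strengths (product t-norm; trivial with one input)
    w1, w2 = mu1, mu2
    # Layer 3: normalized firing strengths
    nw1, nw2 = w1 / (w1 + w2), w2 / (w1 + w2)
    # Layer 4: first-order consequents f_i = p_i * x + r_i
    f1 = 0.5 * x + 1.0
    f2 = 2.0 * x - 1.0
    # Layer 5: weighted-sum output
    return nw1 * f1 + nw2 * f2

print(anfis_forward(0.0))
```

In hybrid learning, the layer-4 coefficients (p_i, r_i) are the ones fitted by least squares in the forward phase, while the bell parameters (a, b, c) are adjusted by backpropagation in the backward phase.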

ANFIS System
Since a wide class of fuzzy controllers can be transformed into equivalent adaptive networks, ANFIS can be used for building intelligent controllers, that is, controllers that can reason with simple fuzzy inference and that are able to learn from experience in the ANN style.

Thanks for your attention! END