Published by Hugo Brooks, modified over 9 years ago
1
Fuzzy Pattern Recognition
2
Overview of Pattern Recognition
Pattern recognition procedure (for speech, image, or general data):
–Feature Extraction
–Feature Reduction
–Classification (supervised; class labels known) or Clustering (unsupervised or self-organizing; class labels unknown)
–Performance Criteria / Cluster Validity
3
Overview of Pattern Recognition
Supervised Learning for Classification
–The class label is known for a set of samples.
–Find the decision boundary from the given samples.
–Classify unknown data with that boundary.
Unsupervised Learning for Clustering
–A set of unlabeled data is given; find the groups or the grouping boundary.
Reinforcement Learning (Reward/Penalty)
–Only an "unkind teacher" is given: a scalar reward or penalty, not the correct answer.
–Trial-and-error scheme.
4
Overview of Pattern Recognition
Classification and Clustering
–Classification problem: which class (class 1 or class 2) should a new sample be assigned to?
–Clustering problem: how should the data be partitioned, and into how many clusters?
5
Overview of Pattern Recognition
Pattern Recognition Algorithms
–Based on the statistical approach
  Parametric approach: Bayes classifier with Gaussian densities; nonlinear boundary or decision function
  Nonparametric approach for density estimation: Parzen window; k-nearest-neighbor method
–Based on neural networks
  Classifiers: multilayer perceptron, ART, Neocognitron, …
  Clustering: SOM (Self-Organizing Map)
6
Fuzzy Pattern Recognition
Classification
–Rule-Based Classifier
–Fuzzy Perceptron
–Fuzzy k-NN Algorithm
Clustering
–Fuzzy C-Means
–Possibilistic C-Means
–Fuzzy C-Shell Clustering
–Fuzzy Rough Clustering
Cluster Validity
–Validity measures based on fuzzy set theory
7
Fuzzy Pattern Recognition
8
Fuzzy Classification
Rule-Based Classifier
–Idea: nonlinear partition of the feature space
–How to find the rules from sample data:
  Project the labeled training data onto each feature and design membership functions, or
  Use fuzzy clustering and projection to obtain the membership functions
9
Fuzzy Classification
Fuzzy k-Nearest Neighbor Algorithm
–Crisp k-NN algorithm: assign a sample to the majority class among its k nearest labeled neighbors (e.g., k = 3, classes 1 and 2)
11
Fuzzy Classification
Fuzzy k-Nearest Neighbor Algorithm
–Fuzzy k-NN algorithm: instead of a majority vote, compute graded class memberships from the distance-weighted memberships of the k nearest neighbors
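The fuzzy k-NN idea above can be sketched as follows. This is a minimal illustration in the style of Keller's fuzzy k-NN (the slides do not give the exact formula); the function names, the fuzzifier default m = 2, and the one-hot training memberships are assumptions for the example.

```python
import numpy as np

def fuzzy_knn(train_x, train_u, x, k=3, m=2.0):
    """Fuzzy k-NN sketch: class memberships of x are inverse-distance-
    weighted averages of the memberships of its k nearest training
    samples.  train_u[j, i] is sample j's membership in class i."""
    d = np.linalg.norm(train_x - x, axis=1)
    nn = np.argsort(d)[:k]                         # k nearest samples
    w = 1.0 / (d[nn] ** (2.0 / (m - 1)) + 1e-12)   # inverse-distance weights
    u = (w[:, None] * train_u[nn]).sum(axis=0) / w.sum()
    return u                                       # memberships over classes

# Two crisply labeled classes; memberships are one-hot here for simplicity.
X = np.array([[0.0, 0.0], [0.1, 0.2], [1.0, 1.0], [0.9, 1.1]])
U = np.array([[1.0, 0.0], [1.0, 0.0], [0.0, 1.0], [0.0, 1.0]])
print(fuzzy_knn(X, U, np.array([0.2, 0.1]), k=3))
```

The query point sits close to class 1, so its membership vector is dominated by class 1 but still records a small, graded membership in class 2.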
13
Fuzzy Nearest Prototype Classification
Crisp and fuzzy nearest prototype classification: each class is represented by a prototype, and the decision boundary lies between the prototypes of class 1 and class 2.
14
–Crisp version: assign a sample to the class of its nearest prototype
–Fuzzy version: assign graded class memberships based on the distances to all prototypes
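Both versions can be sketched side by side. The crisp rule is exactly nearest-prototype assignment; for the fuzzy rule, an FCM-style membership formula is assumed here, since the slides' equations did not survive.

```python
import numpy as np

def nearest_prototype(x, prototypes):
    """Crisp version: label of the closest prototype."""
    d = np.linalg.norm(prototypes - x, axis=1)
    return int(np.argmin(d))

def fuzzy_nearest_prototype(x, prototypes, m=2.0):
    """Fuzzy version (assumed FCM-style memberships):
    u_i = 1 / sum_k (d_i / d_k)^(2/(m-1))."""
    d = np.linalg.norm(prototypes - x, axis=1) + 1e-12
    inv = d ** (-2.0 / (m - 1))
    return inv / inv.sum()

P = np.array([[0.0, 0.0], [2.0, 0.0]])   # one prototype per class
x = np.array([0.5, 0.0])
print(nearest_prototype(x, P))           # class 0 (nearer prototype)
print(fuzzy_nearest_prototype(x, P))     # graded memberships
```

Note how the fuzzy version keeps distance information that the crisp argmin throws away: a point near the decision boundary gets memberships close to 0.5/0.5.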
15
Fuzzy Perceptron
Crisp single-layer perceptron (two-class problem)
–Find a linear decision boundary for linearly separable data
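For reference, the crisp single-layer perceptron the fuzzy version builds on can be sketched as below. This is the standard error-driven update rule, not code from the slides; labels are taken in {−1, +1} and the learning rate is an assumption.

```python
import numpy as np

def perceptron(X, y, lr=1.0, epochs=100):
    """Crisp single-layer perceptron for a two-class problem.
    y in {-1, +1}; finds a separating hyperplane w.x + b = 0 when the
    data are linearly separable."""
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(epochs):
        errors = 0
        for xi, yi in zip(X, y):
            if yi * (w @ xi + b) <= 0:    # misclassified sample
                w += lr * yi * xi
                b += lr * yi
                errors += 1
        if errors == 0:                   # all samples separated
            break
    return w, b

X = np.array([[0.0, 0.0], [0.0, 1.0], [2.0, 2.0], [2.0, 3.0]])
y = np.array([-1, -1, 1, 1])
w, b = perceptron(X, y)
print(np.sign(X @ w + b))                # matches y for separable data
```

On non-separable data this loop simply exhausts its epochs, which is exactly the weakness the fuzzy perceptron addresses.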
17
Fuzzy Perceptron
18
Advantages
–Generalizes the crisp algorithm
–Terminates elegantly in the non-separable case
–The crisp perceptron may not terminate in finite time when the data are non-separable
19
Fuzzy Perceptron
Termination of the fuzzy perceptron (FP)
–If the remaining misclassifications are all caused by very fuzzy data (samples with low class memberships near the boundary), terminate the learning.
Note: FP can be combined with kernel-based methods (J.H. Chen & C.S. Chen, IEEE Trans. on Neural Networks, 2002).
20
Fuzzy C-Means Clustering
Objective
–The iterative algorithm aims to decrease the value of an objective function.
Notation
–Samples: x_1, …, x_n
–Prototypes: p_1, …, p_c
–L2 distance: d_ij = ||x_j − p_i||
21
Fuzzy C-Means
–Crisp objective: J = Σ_j min_i ||x_j − p_i||²
–Fuzzy objective: J_m = Σ_i Σ_j u_ij^m ||x_j − p_i||², with fuzzifier m > 1
22
Fuzzy C-Means
Crisp C-Means Algorithm
–Initialize k seed prototypes p_1, p_2, …, p_k
–Grouping: assign samples to their nearest prototypes, forming non-overlapping clusters
–Centering: the centers of the clusters become the new prototypes
–Repeat the grouping and centering steps until convergence
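The grouping/centering loop above can be sketched directly. Initializing the prototypes with the first k samples is an assumption here; the slides only say "initialize k seeds".

```python
import numpy as np

def crisp_c_means(X, k, iters=50):
    """Crisp c-means (k-means): alternate grouping (assign each sample
    to its nearest prototype) and centering (prototype = cluster mean).
    Initialization with the first k samples is an assumption."""
    p = X[:k].copy()                                   # seed prototypes
    for _ in range(iters):
        d = np.linalg.norm(X[:, None, :] - p[None, :, :], axis=2)
        labels = d.argmin(axis=1)                      # grouping step
        new_p = np.array([X[labels == i].mean(axis=0) if np.any(labels == i)
                          else p[i] for i in range(k)])  # centering step
        if np.allclose(new_p, p):                      # converged
            break
        p = new_p
    return p, labels

# Two well-separated synthetic clusters around (0,0) and (3,3).
X = np.vstack([np.random.default_rng(1).normal(0, 0.1, (20, 2)),
               np.random.default_rng(2).normal(3, 0.1, (20, 2))])
p, labels = crisp_c_means(X, k=2)
```

Both steps monotonically decrease the crisp objective, which is why the alternation converges (the next slide gives the argument for the centering step).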
23
Fuzzy C-Means
Crisp C-Means Algorithm
–Grouping: assigning samples to their nearest prototypes decreases the objective
–Centering: also decreases the objective, because for a cluster C with mean c, Σ_{x∈C} ||x − c||² ≤ Σ_{x∈C} ||x − p||² for every p, and equality holds only if p = c
24
Fuzzy C-Means
Membership matrix U (c × n)
–u_ij is the grade of membership of sample j with respect to prototype i
–Crisp membership: u_ij ∈ {0, 1}, with Σ_i u_ij = 1 for each sample j
–Fuzzy membership: u_ij ∈ [0, 1], with Σ_i u_ij = 1 for each sample j
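A small concrete example of the two kinds of membership matrix (the numbers are illustrative, not from the slides):

```python
import numpy as np

# Membership matrix U (c x n): U[i, j] = membership of sample j in cluster i.
# Crisp: entries in {0, 1}; fuzzy: entries in [0, 1].  In both cases each
# column sums to 1 (a sample's memberships over the c clusters).
U_crisp = np.array([[1, 0, 0, 1],
                    [0, 1, 1, 0]], dtype=float)
U_fuzzy = np.array([[0.9, 0.2, 0.5, 0.8],
                    [0.1, 0.8, 0.5, 0.2]])

for U in (U_crisp, U_fuzzy):
    assert np.allclose(U.sum(axis=0), 1.0)  # sum-to-one constraint per sample
```

The third fuzzy sample (0.5/0.5) shows what crisp membership cannot express: genuine ambiguity between the two clusters.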
25
Fuzzy C-Means
Objective function of FCM: J_m = Σ_i Σ_j u_ij^m d_ij²
Introducing a Lagrange multiplier λ_j for each constraint Σ_i u_ij = 1, the objective function becomes
J = Σ_i Σ_j u_ij^m d_ij² + Σ_j λ_j (Σ_i u_ij − 1)
26
Fuzzy C-Means
Setting the partial derivatives to zero:
–∂J/∂u_ij = m u_ij^{m−1} d_ij² + λ_j = 0
–∂J/∂λ_j = Σ_i u_ij − 1 = 0
From the first equation, u_ij = (−λ_j / (m d_ij²))^{1/(m−1)}. Substituting this into the second equation eliminates λ_j and yields the membership update rule.
27
Fuzzy C-Means
Therefore, the membership updating rule is
u_ij = 1 / Σ_{k=1}^{c} (d_ij / d_kj)^{2/(m−1)}
28
Fuzzy C-Means
Setting the derivative of J with respect to p_i to zero:
∂J/∂p_i = −2 Σ_j u_ij^m (x_j − p_i) = 0
29
Fuzzy C-Means
Update rule of the prototypes:
p_i = Σ_j u_ij^m x_j / Σ_j u_ij^m
To summarize: alternate the membership update and the prototype update until the memberships stop changing.
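The two update rules derived above combine into the full FCM loop. Random initialization of the membership matrix and the tolerance value are assumptions for this sketch.

```python
import numpy as np

def fcm(X, c, m=2.0, iters=100, tol=1e-5, seed=0):
    """Fuzzy c-means: alternate the two derived update rules,
      u_ij = 1 / sum_k (d_ij / d_kj)^(2/(m-1))
      p_i  = sum_j u_ij^m x_j / sum_j u_ij^m
    starting from a random membership matrix (an assumption)."""
    rng = np.random.default_rng(seed)
    U = rng.random((c, len(X)))
    U /= U.sum(axis=0)                              # columns sum to 1
    for _ in range(iters):
        Um = U ** m
        P = (Um @ X) / Um.sum(axis=1, keepdims=True)          # prototypes
        d = np.linalg.norm(X[None, :, :] - P[:, None, :], axis=2) + 1e-12
        U_new = d ** (-2.0 / (m - 1))
        U_new /= U_new.sum(axis=0)                  # membership update
        if np.abs(U_new - U).max() < tol:
            break
        U = U_new
    return P, U

# Two well-separated synthetic clusters around (0,0) and (3,3).
X = np.vstack([np.random.default_rng(1).normal(0, 0.1, (20, 2)),
               np.random.default_rng(2).normal(3, 0.1, (20, 2))])
P, U = fcm(X, c=2)
```

For this data the prototypes land on the two cluster centers and the memberships are close to crisp; with overlapping clusters the memberships stay graded.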
30
Fuzzy C-Mean K-meansFuzzy c-means
31
Fuzzy C-Means
32
Gustafson-Kessel Algorithm
33
Cluster Validity to Determine Number of Clusters
34
Extraction of Rule Base from Fuzzy Cluster
35
Possibilistic C-Means
Problem of FCM
–Equal evidence = ignorance: because memberships must sum to 1, a noise point far from and equidistant to all clusters receives the same memberships (1/c) as a point with genuinely ambiguous evidence between them.
36
Possibilistic C-Means
Objective function of fuzzy c-means
–Constraint from Ruspini: the sum of the memberships of a datum over all classes must be 1
–This condition is too restrictive for noisy data
Objective function of PCM
–Minimize the intra-cluster distances
–Make the memberships as large as possible
37
Possibilistic C-Means
Necessary condition: u_ij = 1 / (1 + (d_ij² / η_i)^{1/(m−1)})
Determination of η_i
–From the average cluster distance, e.g. η_i = Σ_j u_ij^m d_ij² / Σ_j u_ij^m
–Or based on an alpha-cut of the memberships
38
Possibilistic C-Means
Membership as a function of distance, according to the choice of η_i: a larger η_i widens the zone of high typicality around the prototype.
39
Possibilistic C-Means
Cluster centers: updated as in FCM, p_i = Σ_j u_ij^m x_j / Σ_j u_ij^m
Inner product / distance choices
–Gustafson-Kessel: adaptive (Mahalanobis-type) norm per cluster (see previous page)
–Spherical shell cluster: d_ij² = (||x_j − p_i|| − r_i)², the distance to a shell of radius r_i
40
Possibilistic C-Means
2-Pass Algorithm
–Pass 1: initialize the PC partition; DO until the change in the PC partition is small: update the prototypes, then update the PC partition using average cluster distances
–Pass 2: starting from the resulting PC partition; DO until the change in the PC partition is small: update the prototypes, then update the PC partition using alpha-cut distances
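The core PCM iteration (the body of each pass above) can be sketched as below. This follows the Krishnapuram-Keller style update with η_i fixed from the average cluster distance; the initialization from given prototypes (e.g. an FCM result, per the 2-pass scheme) and the rough starting memberships are assumptions.

```python
import numpy as np

def pcm(X, P0, m=2.0, iters=100):
    """Possibilistic c-means sketch: memberships become typicalities,
      u_ij = 1 / (1 + (d_ij^2 / eta_i)^(1/(m-1))),
    with no sum-to-one constraint over clusters.  eta_i is estimated
    once from the average intra-cluster distance and then held fixed."""
    P = P0.copy()
    d2 = np.linalg.norm(X[None] - P[:, None], axis=2) ** 2 + 1e-12
    U = 1.0 / (1.0 + d2 / d2.mean(axis=1, keepdims=True))  # rough start
    eta = (U**m * d2).sum(axis=1) / (U**m).sum(axis=1)     # avg cluster dist
    for _ in range(iters):
        U = 1.0 / (1.0 + (d2 / eta[:, None]) ** (1.0 / (m - 1)))
        Um = U ** m
        P = (Um @ X) / Um.sum(axis=1, keepdims=True)       # prototype update
        d2 = np.linalg.norm(X[None] - P[:, None], axis=2) ** 2 + 1e-12
    return P, U

# Two tight clusters plus one outlier far from both.
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.1, (20, 2)),
               rng.normal(3, 0.1, (20, 2)),
               [[1.5, 8.0]]])
P, U = pcm(X, np.array([[0.0, 0.0], [3.0, 3.0]]))
```

Unlike FCM, the outlier ends up with a low typicality in *both* clusters instead of being forced to split a total membership of 1 between them; this is the robustness to noise claimed on the next slide.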
41
Possibilistic C-Means
Advantages
–Robust to noisy data
–Potentially good for extracting a fuzzy rule base
[Figure: FCM-based C-shell vs. PCM-based C-shell clustering results]
42
Other Notions of Distance
–Weights on features: d²(x, p) = Σ_k w_k (x_k − p_k)²
–Optimal weights can be determined together with the clustering
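A weighted distance like the one above is a one-liner; the example values below are illustrative and show how down-weighting a feature changes which points count as "near".

```python
import numpy as np

def weighted_dist2(x, p, w):
    """Feature-weighted squared distance d^2 = sum_k w_k (x_k - p_k)^2.
    Larger w_k makes feature k count more in the clustering."""
    return float(np.sum(w * (x - p) ** 2))

x = np.array([1.0, 10.0])
p = np.array([0.0, 0.0])
# Equal weights vs. down-weighting the second (e.g. noisy) feature:
print(weighted_dist2(x, p, np.array([1.0, 1.0])))    # 101.0
print(weighted_dist2(x, p, np.array([1.0, 0.01])))   # ~2.0
```

With equal weights the second feature dominates the distance; shrinking its weight makes the first feature decisive, which is exactly the effect an adaptive-distance FCM exploits.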
43
Other Notions of Distance
[Figure: FCM with Euclidean distance vs. FCM with adaptive distance]