1
DIGITAL IMAGE PROCESSING – Dr J. Shanbehzadeh, M. Hosseinajad
2
DIGITAL IMAGE PROCESSING – Dr J. Shanbehzadeh (Shanbehzadeh@gmail.com), M. Hosseinajad
Part 2 – Neural Networks and Structural Methods
3
Road map of chapter 12:
12.1 Patterns and Pattern Classes
12.2 Recognition Based on Decision-Theoretic Methods
12.3 Structural Methods
4
12.2 Recognition Based on Decision-Theoretic Methods: Matching, Optimum Statistical Classifiers, Neural Networks
5
Neural Networks
Neural networks try to mimic the structure and function of our nervous system.
Neuron:
- A functional unit of the nervous system.
- Functions through complex interconnections.
- Operates in parallel.
6
Neural Networks
Biological neurons: each neuron receives thousands of signals from other neurons. If the integrated signal exceeds a threshold, the cell fires and generates an action potential, or spike.
Artificial neurons: the value of a weight w indicates the strength of the connection; neuron B is stimulated if sufficient signal is sent from neuron A.
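The weighted-sum-and-threshold behaviour described above can be written down directly. A minimal sketch in Python (the weights, threshold, and input values below are illustrative, not taken from the slides):

```python
import numpy as np

def artificial_neuron(x, w, threshold):
    """Fire (output 1) if the weighted sum of the inputs exceeds the threshold."""
    activation = np.dot(w, x)            # integrate the incoming signals, weighted by w
    return 1 if activation > threshold else 0

# Illustrative values: this neuron fires only when both inputs are active.
x = np.array([1, 1])
w = np.array([0.6, 0.6])                 # connection strengths
print(artificial_neuron(x, w, threshold=1.0))   # -> 1
```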
7
Neural Networks – Mathematical Models
8
Neural Networks – Mathematical Models
9
Neural Networks – Perceptron
The perceptron is the simplest neural network model, consisting of a single neuron.
Two main parameters:
- Weight
- Threshold
Example: the AND operation.
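For the AND example mentioned on this slide, one workable choice of weights and threshold is sketched below; the specific values are assumptions, since the slide does not give any (any weights for which only the input (1, 1) exceeds the threshold would do):

```python
def perceptron_and(x1, x2, w1=1.0, w2=1.0, threshold=1.5):
    """Single-neuron perceptron: output 1 if w1*x1 + w2*x2 exceeds the threshold."""
    return 1 if w1 * x1 + w2 * x2 > threshold else 0

# Truth table of the AND operation.
for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, '->', perceptron_and(x1, x2))   # only (1, 1) -> 1
```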
10
Neural Networks – Training
The rule that governs how the weights or thresholds are changed is called the learning algorithm.
Different types of neural networks may use different learning algorithms:
- Fixed increment correction rule
- Delta rule
- Gradient descent rule
- …
11
Neural Networks – Perceptron Learning Rule
12
Neural Networks – Perceptron Learning Rule
13
Neural Networks – Perceptron Learning Rule
14
Neural Networks – Perceptron Learning Rule
Combining both rules: w' = w + α(r − o)x, i.e. Δw = α(r − o)x, where 0 < α < 1,
r = desired output, o = output from the network.
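A minimal training loop applying the update w' = w + α(r − o)x is sketched below. The bias is handled by augmenting each pattern with a constant 1, and the AND data set, the value of α, and the number of epochs are illustrative choices, not taken from the slides:

```python
import numpy as np

def train_perceptron(X, r, alpha=0.5, epochs=20):
    """Perceptron learning rule: w <- w + alpha * (r - o) * x."""
    X = np.hstack([X, np.ones((X.shape[0], 1))])   # augment each pattern with a bias input of 1
    w = np.zeros(X.shape[1])
    for _ in range(epochs):
        for x, target in zip(X, r):
            o = 1 if w @ x > 0 else 0               # current output of the network
            w += alpha * (target - o) * x           # no change when o equals the desired output
    return w

# Illustrative use: learn the AND function.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
r = np.array([0, 0, 0, 1])
print(train_perceptron(X, r))
```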
15
Neural Networks – Fixed Increment Correction
For linearly separable classes, the weight vector is updated only when an error occurs:
- Case I: if y(k) ∈ ω1 and wᵀ(k)y(k) ≤ 0, then w(k+1) = w(k) + c·y(k), c > 0
- Case II: if y(k) ∈ ω2 and wᵀ(k)y(k) ≥ 0, then w(k+1) = w(k) − c·y(k), c > 0
- Case III: otherwise, w(k+1) = w(k)
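The three cases can be folded into one loop over the training patterns. A sketch under the assumption that the patterns are already given in augmented form y and that a label of 1 or 2 identifies the class; the data, the increment c, and the epoch limit are illustrative:

```python
import numpy as np

def fixed_increment(patterns, labels, c=1.0, max_epochs=100):
    """Fixed increment correction rule for two linearly separable classes w1 and w2."""
    w = np.zeros(patterns.shape[1])
    for _ in range(max_epochs):
        errors = 0
        for y, cls in zip(patterns, labels):
            s = w @ y
            if cls == 1 and s <= 0:        # case I: pattern of class w1 misclassified
                w = w + c * y
                errors += 1
            elif cls == 2 and s >= 0:      # case II: pattern of class w2 misclassified
                w = w - c * y
                errors += 1
            # case III: pattern classified correctly, w is left unchanged
        if errors == 0:                    # a full pass with no errors: training has converged
            break
    return w

# Illustrative 1-D example, each pattern augmented with a trailing 1.
P = np.array([[2.0, 1.0], [3.0, 1.0], [-1.0, 1.0], [-2.0, 1.0]])
print(fixed_increment(P, [1, 1, 2, 2]))
```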
16
Neural Networks – Nonseparable Classes: The Delta Rule
The delta rule can be used with both separable and non-separable classes.
Main idea:
- Minimize the error between the actual and desired response at each training step.
- This is achieved using a technique called gradient descent.
17
Neural Networks – Gradient Descent
Gradient descent searches the space of weight vectors to find the weights that best fit the training examples. The objective is to minimize the error
E(w) = ½ Σ (r − o)², with o = wᵀy,
where r is the desired output and o is the output of the network.
Training is the process of minimizing E(w) in the direction of steepest decrease, that is, in the direction opposite to the gradient.
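A sketch of the delta rule as batch gradient descent on E(w) = ½ Σ (r − o)² with the linear output o = wᵀy. The gradient is ∂E/∂w = −Σ (r − o)y, so each step moves opposite to it; the data, learning rate, and epoch count below are illustrative:

```python
import numpy as np

def delta_rule(Y, r, alpha=0.05, epochs=500):
    """Gradient descent on E(w) = 0.5 * sum((r - o)^2), with o = w . y."""
    w = np.zeros(Y.shape[1])
    for _ in range(epochs):
        o = Y @ w                  # current linear outputs for all patterns
        grad = -(r - o) @ Y        # dE/dw = -sum over patterns of (r - o) * y
        w = w - alpha * grad       # step in the direction opposite to the gradient
    return w

# Illustrative: fit r = 2*y on three 1-D samples, each augmented with a bias of 1.
Y = np.array([[1.0, 1.0], [2.0, 1.0], [3.0, 1.0]])
r = np.array([2.0, 4.0, 6.0])
print(delta_rule(Y, r))            # converges toward [2, 0]
```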
18
Neural Networks – Gradient Descent
19
Neural Networks – Learning Rate
20
Neural Networks – The XOR Problem
What about the XOR problem? A single decision boundary cannot separate the classes, so a single perceptron fails; two separating planes are required for correct classification.
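A two-layer network is enough: two hidden neurons realize the two required decision boundaries and an output neuron combines them. A hand-weighted sketch (the thresholds and weights are one illustrative choice among many, not taken from the slides):

```python
def step(s, threshold):
    """Hard-limiting activation: 1 if the activation exceeds the threshold."""
    return 1 if s > threshold else 0

def xor_net(x1, x2):
    """Hidden neurons implement the two decision lines; the output neuron combines them."""
    h1 = step(x1 + x2, 0.5)        # fires for OR(x1, x2)
    h2 = step(x1 + x2, 1.5)        # fires for AND(x1, x2)
    return step(h1 - h2, 0.5)      # OR and not AND  ->  XOR

for x1 in (0, 1):
    for x2 in (0, 1):
        print(x1, x2, '->', xor_net(x1, x2))   # 0 0->0, 0 1->1, 1 0->1, 1 1->0
```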
21
Neural Networks – Multi-layer Neural Networks
22
Neural Networks – Learning Rule
23
Neural Networks – An Example
24
Neural Networks
25
Neural Networks
26
Neural Networks – Problems
Overfitting (overtraining): the network ends up memorizing the training I/O examples rather than generalizing from them. Contributing factors:
- The size of the training set.
- The architecture of the network (too many hidden layers often causes overfitting).
- The complexity of the problem at hand.
In addition, since the learning algorithm attempts to minimize the error, it can get stuck in a local minimum.
27
Neural Networks – Number of Layers
The complexity of the decision surfaces that can be formed increases with the number of layers.
28
12.3 Structural Methods: Matching Shape Numbers, String Matching
29
Matching Shape Numbers – Structural Pattern Recognition
30
Matching Shape Numbers – The Order of a Shape Number
31
Matching Shape Numbers – Similarity Degree and Distance
Similarity degree k: the degree of similarity between two region boundaries is defined as the largest order for which their shape numbers still coincide.
Distance: the distance between shapes a and b is the inverse of their degree of similarity, D(a, b) = 1/k.
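A sketch of how the similarity degree k and the distance D(a, b) = 1/k could be computed, assuming a helper shape_number(boundary, order) is available elsewhere that returns the shape number of a boundary at a given (even) order; that helper and the maximum order tried are assumptions, not given in the slides:

```python
def similarity_degree(boundary_a, boundary_b, max_order=20):
    """Largest order k for which the shape numbers of the two boundaries still coincide."""
    k = 0
    for order in range(4, max_order + 1, 2):     # shape numbers have even order >= 4
        # shape_number(...) is a hypothetical helper assumed to be defined elsewhere
        if shape_number(boundary_a, order) == shape_number(boundary_b, order):
            k = order
        else:
            break                                # stop at the first order where they differ
    return k

def shape_distance(boundary_a, boundary_b):
    """Distance between shapes a and b: D(a, b) = 1 / k (larger k means more similar)."""
    k = similarity_degree(boundary_a, boundary_b)
    return float('inf') if k == 0 else 1.0 / k
```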
32
12.3 Structural Methods – String Matching
33
String Matching
Region boundary a is coded into a string denoted a1 a2 … an, and region boundary b into a string denoted b1 b2 … bm.
Let α be the number of symbols that match; the number of symbols that do not match is β = max(|a|, |b|) − α.
A measure of similarity between regions a and b is the ratio R = α/β.
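With α and β defined as above, the similarity ratio can be computed directly; a small sketch (the two example strings are made up for illustration):

```python
def string_similarity(a, b):
    """R = alpha / beta, with alpha = matching symbols and beta = max(|a|, |b|) - alpha."""
    alpha = sum(1 for ca, cb in zip(a, b) if ca == cb)   # position-wise matches
    beta = max(len(a), len(b)) - alpha                   # symbols that do not match
    return float('inf') if beta == 0 else alpha / beta   # R grows as the strings become more alike

print(string_similarity("abbabba", "abbabba"))   # identical strings -> inf
print(string_similarity("abbabba", "abababa"))   # 4 matches, 3 mismatches -> 1.33...
```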
34
String Matching – An Example