1 Widrow-Hoff Learning (LMS Algorithm)

2 ADALINE Network
$\mathbf{a} = \mathrm{purelin}(\mathbf{W}\mathbf{p} + \mathbf{b}) = \mathbf{W}\mathbf{p} + \mathbf{b}$, where the $i$-th row of $\mathbf{W}$ is the weight vector ${}_i\mathbf{w} = [\,w_{i,1}\ \ w_{i,2}\ \ \cdots\ \ w_{i,R}\,]^T$, so $a_i = {}_i\mathbf{w}^T\mathbf{p} + b_i$.
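As a concrete illustration of the ADALINE forward pass above, here is a minimal NumPy sketch; the weight, bias, and input values are arbitrary placeholders, not taken from the slides:

```python
import numpy as np

def purelin(n):
    """Linear transfer function: the ADALINE output equals its net input."""
    return n

def adaline(W, b, p):
    """ADALINE response a = purelin(W p + b) = W p + b."""
    return purelin(W @ p + b)

# Placeholder two-input, one-neuron ADALINE (values are illustrative only).
W = np.array([[1.0, -0.5]])   # row i of W is the weight vector iW^T
b = np.array([0.2])
p = np.array([2.0, 1.0])

print(adaline(W, b, p))       # -> [1.7]
```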

3 Two-Input ADALINE

4 Mean Square Error
Training Set: $\{\mathbf{p}_1, t_1\}, \{\mathbf{p}_2, t_2\}, \ldots, \{\mathbf{p}_Q, t_Q\}$, where input $\mathbf{p}_q$ has target $t_q$.
Notation: $\mathbf{x} = \begin{bmatrix} {}_1\mathbf{w} \\ b \end{bmatrix}$, $\mathbf{z} = \begin{bmatrix} \mathbf{p} \\ 1 \end{bmatrix}$, so the network output is $a = {}_1\mathbf{w}^T\mathbf{p} + b = \mathbf{x}^T\mathbf{z}$.
Mean Square Error: $F(\mathbf{x}) = E[e^2] = E[(t - a)^2] = E[(t - \mathbf{x}^T\mathbf{z})^2]$.

5 Error Analysis
The mean square error for the ADALINE network is a quadratic function of the weights:
$F(\mathbf{x}) = E[t^2] - 2\mathbf{x}^T E[t\mathbf{z}] + \mathbf{x}^T E[\mathbf{z}\mathbf{z}^T]\mathbf{x} = c + \mathbf{d}^T\mathbf{x} + \tfrac{1}{2}\mathbf{x}^T\mathbf{A}\mathbf{x}$,
where $c = E[t^2]$, $\mathbf{d} = -2\mathbf{h} = -2E[t\mathbf{z}]$, and $\mathbf{A} = 2\mathbf{R} = 2E[\mathbf{z}\mathbf{z}^T]$.
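The quadratic form above can be checked numerically. The sketch below (all data randomly generated, purely illustrative) estimates $c$, $\mathbf{h}$, and $\mathbf{R}$ from samples and confirms that $c - 2\mathbf{x}^T\mathbf{h} + \mathbf{x}^T\mathbf{R}\mathbf{x}$ equals the sample-average squared error for an arbitrary weight vector:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic training data: z = [p; 1], targets from a noisy linear rule.
Q = 1000
P = rng.normal(size=(Q, 2))
Z = np.hstack([P, np.ones((Q, 1))])
t = P @ np.array([0.8, -0.3]) + 0.1 + 0.05 * rng.normal(size=Q)

c = np.mean(t**2)                  # c = E[t^2]
h = (Z * t[:, None]).mean(axis=0)  # h = E[t z]
R = (Z.T @ Z) / Q                  # R = E[z z^T]

x = np.array([0.5, 0.5, 0.0])      # arbitrary weight/bias vector
F_quadratic = c - 2 * x @ h + x @ R @ x
F_direct = np.mean((t - Z @ x) ** 2)

print(F_quadratic, F_direct)       # the two values agree
```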

6 Stationary Point
Hessian matrix: $\nabla^2 F(\mathbf{x}) = \mathbf{A} = 2\mathbf{R}$.
The correlation matrix $\mathbf{R}$ must be at least positive semidefinite. If there are any zero eigenvalues, the performance index will either have a weak minimum or else no stationary point; otherwise there will be a unique global minimum $\mathbf{x}^*$.
If $\mathbf{R}$ is positive definite: $\mathbf{x}^* = -\mathbf{A}^{-1}\mathbf{d} = \mathbf{R}^{-1}\mathbf{h}$.
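A sketch of how the stationary point can be found numerically, reusing the synthetic data from the previous snippet (the data-generating rule is an assumption for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Same synthetic data as above: z = [p; 1].
Q = 1000
P = rng.normal(size=(Q, 2))
Z = np.hstack([P, np.ones((Q, 1))])
t = P @ np.array([0.8, -0.3]) + 0.1 + 0.05 * rng.normal(size=Q)

R = (Z.T @ Z) / Q                  # correlation matrix R = E[z z^T]
h = (Z * t[:, None]).mean(axis=0)  # h = E[t z]

eigvals = np.linalg.eigvalsh(R)
print("eigenvalues of R:", eigvals)

if np.all(eigvals > 0):            # R positive definite -> unique global minimum
    x_star = np.linalg.solve(R, h)
    print("x* = R^-1 h =", x_star)
```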

7 Approximate Steepest Descent
Approximate mean square error (one sample): $\hat{F}(\mathbf{x}) = (t(k) - a(k))^2 = e^2(k)$.
Approximate (stochastic) gradient: $\hat{\nabla}F(\mathbf{x}) = \nabla e^2(k)$.

8 Approximate Gradient Calculation
$[\nabla e^2(k)]_j = 2e(k)\dfrac{\partial e(k)}{\partial w_{1,j}}$, with $e(k) = t(k) - ({}_1\mathbf{w}^T\mathbf{p}(k) + b)$, so $\dfrac{\partial e(k)}{\partial w_{1,j}} = -p_j(k)$ and $\dfrac{\partial e(k)}{\partial b} = -1$.
Therefore $\hat{\nabla}F(\mathbf{x}) = \nabla e^2(k) = -2e(k)\mathbf{z}(k)$.
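The single-sample gradient above can be sanity-checked against a finite-difference approximation; a small sketch with arbitrary sample values:

```python
import numpy as np

def sq_error(x, z, t):
    """Single-sample squared error e^2 = (t - x^T z)^2."""
    return (t - x @ z) ** 2

# Arbitrary sample and weight vector (z already includes the bias input 1).
z = np.array([2.0, -1.0, 1.0])
t = 0.5
x = np.array([0.1, 0.4, -0.2])

e = t - x @ z
analytic = -2 * e * z                       # grad e^2(k) = -2 e(k) z(k)

eps = 1e-6
numeric = np.array([
    (sq_error(x + eps * np.eye(3)[j], z, t) -
     sq_error(x - eps * np.eye(3)[j], z, t)) / (2 * eps)
    for j in range(3)
])

print(analytic)
print(numeric)    # matches the analytic gradient to numerical precision
```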

9 LMS Algorithm
${}_1\mathbf{w}(k+1) = {}_1\mathbf{w}(k) + 2\alpha e(k)\mathbf{p}(k)$
$b(k+1) = b(k) + 2\alpha e(k)$
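A minimal single-neuron LMS training loop following the update equations above; the synthetic data, learning rate, and epoch count are illustrative assumptions:

```python
import numpy as np

def lms_train(P, t, alpha=0.01, epochs=20):
    """Single-neuron LMS:
       w(k+1) = w(k) + 2*alpha*e(k)*p(k),  b(k+1) = b(k) + 2*alpha*e(k)."""
    w = np.zeros(P.shape[1])
    b = 0.0
    for _ in range(epochs):
        for p, target in zip(P, t):
            e = target - (w @ p + b)     # e(k) = t(k) - a(k)
            w = w + 2 * alpha * e * p
            b = b + 2 * alpha * e
    return w, b

# Illustrative data generated from a known linear rule plus noise.
rng = np.random.default_rng(1)
P = rng.normal(size=(200, 3))
t = P @ np.array([1.0, -2.0, 0.5]) + 0.3 + 0.05 * rng.normal(size=200)

w, b = lms_train(P, t, alpha=0.01)
print(w, b)   # approaches [1.0, -2.0, 0.5] and 0.3
```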

10 Multiple-Neuron Case
$w_{i,j}(k+1) = w_{i,j}(k) + 2\alpha e_i(k)p_j(k)$, $\quad b_i(k+1) = b_i(k) + 2\alpha e_i(k)$.
Matrix form: $\mathbf{W}(k+1) = \mathbf{W}(k) + 2\alpha\,\mathbf{e}(k)\mathbf{p}^T(k)$, $\quad \mathbf{b}(k+1) = \mathbf{b}(k) + 2\alpha\,\mathbf{e}(k)$.
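The same update written in matrix form for a layer of several neurons; the layer dimensions and sample values below are arbitrary:

```python
import numpy as np

def lms_step(W, b, p, t, alpha):
    """Matrix-form LMS update:
       W(k+1) = W(k) + 2*alpha*e(k)*p(k)^T,  b(k+1) = b(k) + 2*alpha*e(k)."""
    a = W @ p + b                 # ADALINE layer output
    e = t - a                     # error vector e(k)
    W = W + 2 * alpha * np.outer(e, p)
    b = b + 2 * alpha * e
    return W, b

# Illustrative 2-neuron, 3-input layer.
W = np.zeros((2, 3))
b = np.zeros(2)
p = np.array([1.0, -1.0, 0.5])
t = np.array([1.0, -1.0])

W, b = lms_step(W, b, p, t, alpha=0.1)
print(W)
print(b)
```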

11 Analysis of Convergence
Taking the expectation of the LMS update gives
$E[\mathbf{x}(k+1)] = [\mathbf{I} - 2\alpha\mathbf{R}]\,E[\mathbf{x}(k)] + 2\alpha\mathbf{h}$.
For stability, the eigenvalues of the matrix $[\mathbf{I} - 2\alpha\mathbf{R}]$ must fall inside the unit circle.

12 Conditions for Stability
$\mathrm{eig}[\mathbf{I} - 2\alpha\mathbf{R}] = 1 - 2\alpha\lambda_i$ (where $\lambda_i$ is an eigenvalue of $\mathbf{R}$), so stability requires $|1 - 2\alpha\lambda_i| < 1$.
Since $\lambda_i > 0$ and $\alpha > 0$, the condition $1 - 2\alpha\lambda_i < 1$ always holds. Therefore the stability condition simplifies to $1 - 2\alpha\lambda_i > -1$, i.e. $\alpha < 1/\lambda_i$ for all $i$, or $0 < \alpha < 1/\lambda_{\max}$.
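The learning-rate bound can be computed directly from the eigenvalues of R; a sketch using an arbitrary positive definite correlation matrix:

```python
import numpy as np

# Illustrative correlation matrix R = E[z z^T] (any positive definite matrix works).
R = np.array([[3.0, 1.0],
              [1.0, 2.0]])

lam = np.linalg.eigvalsh(R)          # eigenvalues of R
alpha_max = 1.0 / lam.max()          # stability requires 0 < alpha < 1/lambda_max
print("eigenvalues:", lam)
print("stable learning rates: 0 < alpha <", alpha_max)

# Eigenvalues of [I - 2*alpha*R] lie inside the unit circle for a stable alpha.
alpha = 0.9 * alpha_max
print(np.abs(np.linalg.eigvals(np.eye(2) - 2 * alpha * R)) < 1)   # [ True  True]
```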

13 Steady State Response
If the system is stable, then a steady state condition will be reached:
$E[\mathbf{x}_{ss}] = [\mathbf{I} - 2\alpha\mathbf{R}]\,E[\mathbf{x}_{ss}] + 2\alpha\mathbf{h}$.
The solution to this equation is $E[\mathbf{x}_{ss}] = \mathbf{R}^{-1}\mathbf{h} = \mathbf{x}^*$.
This is also the strong minimum of the performance index.
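The mean-weight recursion can be iterated directly to confirm that it settles at $\mathbf{x}^* = \mathbf{R}^{-1}\mathbf{h}$; a sketch with an arbitrary R and h:

```python
import numpy as np

# Illustrative R and h (any positive definite R and compatible h will do).
R = np.array([[3.0, 1.0],
              [1.0, 2.0]])
h = np.array([1.0, 0.5])

alpha = 0.9 / np.linalg.eigvalsh(R).max()   # a stable learning rate
x = np.zeros(2)                             # E[x(0)]

for _ in range(200):
    # E[x(k+1)] = (I - 2*alpha*R) E[x(k)] + 2*alpha*h
    x = (np.eye(2) - 2 * alpha * R) @ x + 2 * alpha * h

print(x)                      # steady state of the recursion
print(np.linalg.solve(R, h))  # x* = R^-1 h  (the two agree)
```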

14 Example: Banana / Apple

15 Iteration One (Banana)

16 Iteration Two (Apple)

17 Iteration Three
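The three iterations above can be reproduced with a short script. The slide figures are not reproduced in this transcript, so the prototype patterns, targets, learning rate, and zero initial weights below are stated assumptions (banana and apple pattern vectors with targets -1 and +1, no bias):

```python
import numpy as np

# Assumed prototype patterns and targets (banana: t = -1, apple: t = +1);
# the learning rate and zero initial weights are also assumptions.
patterns = [np.array([-1.0, 1.0, -1.0]),   # banana (assumed)
            np.array([ 1.0, 1.0, -1.0])]   # apple  (assumed)
targets = [-1.0, 1.0]

alpha = 0.2
w = np.zeros(3)                            # no bias, zero initial weights

for k in range(3):                         # iterations one, two, three
    p = patterns[k % 2]                    # alternate banana, apple, banana
    t = targets[k % 2]
    a = w @ p                              # ADALINE output
    e = t - a                              # error
    w = w + 2 * alpha * e * p              # LMS update
    print(f"iteration {k + 1}: a = {a:+.2f}, e = {e:+.2f}, w = {w}")
```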

18 Adaptive Filtering
Tapped delay line: the input vector is built from delayed samples of the signal, $\mathbf{p}(k) = [\,y(k)\ \ y(k-1)\ \ \cdots\ \ y(k-R+1)\,]^T$.
Adaptive filter (ADALINE fed by a tapped delay line): $a(k) = \mathrm{purelin}(\mathbf{W}\mathbf{p} + b) = \sum_{i=1}^{R} w_{1,i}\, y(k-i+1) + b$.
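A sketch of an adaptive filter built from an R-tap delay line and a single ADALINE, trained sample-by-sample with LMS; the test signal and target are arbitrary placeholders:

```python
import numpy as np

def adaptive_filter(y, target, R=3, alpha=0.01):
    """ADALINE with an R-tap delay line, adapted on-line with LMS.
       Returns the filter outputs a(k) and errors e(k)."""
    w = np.zeros(R)
    b = 0.0
    taps = np.zeros(R)                     # p(k) = [y(k), y(k-1), ..., y(k-R+1)]
    outputs, errors = [], []
    for k in range(len(y)):
        taps = np.concatenate(([y[k]], taps[:-1]))   # shift the delay line
        a = w @ taps + b                   # a(k) = sum_i w_1i * y(k-i+1) + b
        e = target[k] - a
        w = w + 2 * alpha * e * taps       # LMS updates
        b = b + 2 * alpha * e
        outputs.append(a)
        errors.append(e)
    return np.array(outputs), np.array(errors)

# Placeholder test: learn to reproduce a delayed, scaled copy of a sinusoid.
k = np.arange(500)
y = np.sin(2 * np.pi * k / 20)
target = 0.7 * np.roll(y, 1)               # illustrative target signal
a, e = adaptive_filter(y, target, R=3, alpha=0.02)
print(np.mean(e[:50] ** 2), np.mean(e[-50:] ** 2))   # error shrinks as the filter adapts
```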

19 Example: Noise Cancellation

20 Noise Cancellation Adaptive Filter

21 Correlation Matrix
For the noise cancellation filter, the input vector and target are $\mathbf{z}(k) = \begin{bmatrix} v(k) \\ v(k-1) \end{bmatrix}$ and $t(k) = s(k) + m(k)$, so
$\mathbf{R} = E[\mathbf{z}(k)\mathbf{z}^T(k)] = \begin{bmatrix} E[v^2(k)] & E[v(k)v(k-1)] \\ E[v(k-1)v(k)] & E[v^2(k-1)] \end{bmatrix}$ and $\mathbf{h} = E[t(k)\mathbf{z}(k)]$.

22 Signals
$m(k) = 1.2\sin\!\left(\dfrac{2\pi k}{3} - \dfrac{3\pi}{4}\right)$
$E[v(k)v(k-1)] = (1.2)^2(0.5)\cos\!\left(\dfrac{2\pi}{3}\right) = -0.36$

23 Stationary Point
$\mathbf{h} = E[t(k)\mathbf{z}(k)] = E\!\left[(s(k) + m(k))\begin{bmatrix} v(k) \\ v(k-1) \end{bmatrix}\right]$
Since the signal $s(k)$ is uncorrelated with the noise source $v(k)$, only the $m(k)$ terms contribute, and the stationary point is $\mathbf{x}^* = \mathbf{R}^{-1}\mathbf{h}$.
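The correlation matrix, h, and the stationary point for this example can be estimated numerically. In the sketch below, only $m(k)$ is recoverable from this transcript; the noise-source waveform $v(k) = 1.2\sin(2\pi k/3)$ and the small random signal $s(k)$ (uncorrelated with $v$) are assumptions made for illustration:

```python
import numpy as np

k = np.arange(1, 30001)

# Assumed noise-source waveform (not recoverable from this transcript), the
# filtered noise m(k) from the Signals slide, and a simulated signal s(k)
# that is uncorrelated with v(k).
v = 1.2 * np.sin(2 * np.pi * k / 3)
m = 1.2 * np.sin(2 * np.pi * k / 3 - 3 * np.pi / 4)
s = 0.2 * np.random.default_rng(0).normal(size=k.size)

t = s + m                                   # target: contaminated signal
z = np.stack([v, np.roll(v, 1)], axis=1)    # z(k) = [v(k), v(k-1)]

R = (z.T @ z) / k.size                      # R = E[z z^T]
h = (z * t[:, None]).mean(axis=0)           # h = E[t z]
x_star = np.linalg.solve(R, h)              # stationary point x* = R^-1 h

print("R =\n", R.round(2))                  # off-diagonal terms come out near -0.36
print("h =", h.round(2))
print("x* =", x_star.round(2))
```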

24 Performance Index

25 LMS Response

26 Echo Cancellation

