
1 Lyapunov Functions and Memory Justin Chumbley

2 Why do we need more than linear analysis? What is Lyapunov theory? – Its components? – What does it bring? Application: episodic learning/memory

3

4 Linearized stability of non-linear systems: failures. Is the steady state stable when the linearized eigenvalues are purely imaginary? – Theorem 8 doesn't say. What are the size and nature of the basin of attraction? – Linearization only describes a small neighbourhood of the ss. Lyapunov theory – a geometric interpretation of state-space trajectories

5 Important geometric concepts (in 2-d for convenience). State function – a scalar function U of the state variables with continuous partial derivatives – a landscape. The aim is to define a landscape with the steady state at the bottom of a valley

6 Positive definite state function

7 e.g. a positive definite U with a unique singular point at 0. Note that such a U is not unique
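As a concrete illustration (the standard textbook choice; the slide's own example is not preserved in this transcript), take

    U(x, y) = x^2 + y^2, \qquad U(0, 0) = 0, \qquad U(x, y) > 0 \ \text{for} \ (x, y) \neq (0, 0).

This U is positive definite with its unique singular point (minimum) at the origin. It is not unique: 2x^2 + y^2, x^2 + y^4, or any other bowl-shaped positive definite function with the same minimum would serve equally well.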

8 U defines the valley – but do state trajectories actually travel downhill? We need the temporal change of the positive definite state function along trajectories – time is implicit in U

9 e.g. N-dim case
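The chain rule answers the question of slide 8. For a 2-d system dx/dt = f(x, y), dy/dt = g(x, y) (notation assumed here), the change of U along a trajectory is

    \frac{dU}{dt} = \frac{\partial U}{\partial x}\frac{dx}{dt} + \frac{\partial U}{\partial y}\frac{dy}{dt} = \frac{\partial U}{\partial x} f(x, y) + \frac{\partial U}{\partial y} g(x, y),

and in the n-dimensional case

    \frac{dU}{dt} = \sum_{i=1}^{n} \frac{\partial U}{\partial x_i}\, f_i(x_1, \dots, x_n) = \nabla U \cdot f(\mathbf{x}).

Time never appears explicitly in U; it enters only through the trajectory (x(t), y(t)), which is what "time implicit in U" means.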

10 Lyapunov functions and asymptotic stability. Intuition – like water flowing down a valley: all trajectories in a neighbourhood approach the singular point as t → ∞
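For reference, the standard result these slides appeal to (the source's Theorem 12 may be worded differently): if U is a positive definite state function on a neighbourhood of the steady state and, along every trajectory in that neighbourhood,

    (a)\ \frac{dU}{dt} \le 0 \quad \Rightarrow \quad \text{the steady state is stable},

    (b)\ \frac{dU}{dt} < 0 \ \text{except at the steady state itself} \quad \Rightarrow \quad \text{the steady state is asymptotically stable},

i.e. under (b) trajectories starting in the neighbourhood approach the steady state as t → ∞, and U is called a Lyapunov function for the system.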

11

12 A case where U satisfies condition (a): dU/dt ≤ 0 along trajectories, so the steady state is stable

13 Ch 8 Hopf bifurcation. The Van der Pol model for a heart-beat – analyzed at the bifurcation point (where the linearized eigenvalues are purely imaginary). At this point: (0,0) is the only steady state; linearized analysis cannot be applied (pure imaginary eigenvalues); but a positive definite state function U still has computable time derivatives along trajectories

14 U satisfies condition (b): dU/dt < 0 along trajectories – except on the x and y axes, where dU/dt = 0. But when x = 0 the flow immediately carries trajectories to points where dU/dt < 0. So U is a Lyapunov function for this system, and the steady state at (0,0) is asymptotically stable. Conclusion: stability has been proven where linearization fails
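A worked version of this argument, assuming the reduced Van der Pol form at the bifurcation point, dx/dt = y, dy/dt = -x - x^2 y (a standard choice for illustration; the slide's exact equations are not preserved in this transcript), with U(x, y) = x^2 + y^2:

    \frac{dU}{dt} = 2x\,\dot{x} + 2y\,\dot{y} = 2xy + 2y(-x - x^2 y) = -2x^2 y^2 \le 0,

which is strictly negative except where x = 0 or y = 0 (the axes), exactly as described above. Since (0,0) is the only steady state of this system and trajectories cannot remain on the axes away from it, U is a Lyapunov function and (0,0) is asymptotically stable.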

15 Another failure of Theorem 8: it only guarantees that points 'sufficiently close' to an asymptotically stable steady state go there as t → ∞. But U defines ALL points in the valley in which the ss lies! – Intuition: any trajectory starting within the valley flows to the ss.

16 Formally, with many steady states and basins: assume we have a U for one steady state. It delimits a region R within which Theorem 12 holds. A constraint U < K then defines a subregion within the basin of attraction
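One standard way to make "U < K defines a subregion within the basin" precise (notation assumed here): let R be the region around the steady state where dU/dt ≤ 0 along trajectories, and let

    \Omega_K = \{\, \mathbf{x} : U(\mathbf{x}) < K \,\}

be the sublevel set (the part of the valley below height K) containing the steady state. If \Omega_K lies entirely inside R, a trajectory starting in \Omega_K can never reach points with U ≥ K, so it stays in \Omega_K; under condition (b) it converges to the steady state. \Omega_K is therefore a guaranteed subregion of the basin of attraction, and the largest admissible K gives the best such lower bound.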

17

18

19 Where does U come from? No general rule. Another example: divisive feedback

20

21 Memory. Declarative – Episodic – Semantic. Procedural. …

22 Episodic memory (then learning)

23

24 A network of 16*16 pyramidal cells – completely connected but not self-connected – plus 1 inhibitory unit for feedback inhibition
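A minimal sketch of this architecture in Python/NumPy. The 16*16 grid of pyramidal cells, the absence of self-connections, and the single feedback-inhibition unit come from the slide; every name and value below is otherwise illustrative:

    import numpy as np

    N = 16 * 16                        # 256 pyramidal cells on a 16x16 grid
    W = np.zeros((N, N))               # pyramidal-to-pyramidal weights, to be set by the Hebb rule
    no_self = ~np.eye(N, dtype=bool)   # mask: all-to-all coupling allowed, self-connections are not
    w_exc_to_inh = np.ones(N)          # every pyramidal cell drives the single inhibitory unit
    w_inh_to_exc = np.ones(N)          # and receives feedback inhibition from it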

25 Aim – understand generalization/discrimination. Strategy – an input in the basin will be 'recognized', i.e. identified with the stored pattern (asymptotically) – Lyapunov theory → assess basins of attraction. Notation: …

26 Theorem 14

27

28 For reference – can be generalized to higher order
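For orientation only, the widely used Lyapunov (energy) function for a symmetric recurrent rate network of the Hopfield/Cohen-Grossberg type has the form sketched below; this is an assumption about the kind of function Theorem 14 provides, not a quotation of it. With symmetric weights w_ij = w_ji, no self-connections, rates r_i, and an invertible monotonic activation S (external input omitted for brevity):

    L(\mathbf{r}) = -\tfrac{1}{2} \sum_i \sum_j w_{ij}\, r_i r_j + \sum_i \int_0^{r_i} S^{-1}(\rho)\, d\rho,

whose time derivative is non-positive along trajectories of the corresponding symmetric network dynamics. Each stored pattern then sits at the bottom of its own valley, and slide 16's sublevel-set argument bounds its basin of attraction.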

29

30 Pattern recognition (matlab)
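The MATLAB simulation itself is not reproduced in this transcript. As a rough stand-in, here is a Python sketch of the recall step, assuming simple leaky firing-rate dynamics with pooled feedback inhibition and a saturating nonlinearity; all equations, parameter values, and names are illustrative, not taken from the slides:

    import numpy as np

    def recall(W, r0, k_inh=1.0, tau=10.0, dt=0.5, steps=400, M=100.0):
        """Run the recurrent dynamics from an initial (possibly noisy or partial) pattern r0."""
        r = r0.astype(float)
        for _ in range(steps):
            inhibition = k_inh * r.mean()          # single inhibitory unit pools all pyramidal activity
            drive = W @ r - inhibition             # recurrent excitation minus feedback inhibition
            target = np.clip(drive, 0.0, M)        # saturating rate nonlinearity, maximum rate M
            r += (dt / tau) * (-r + target)        # leaky integration toward the target rate
        return r

An input whose state falls inside a stored pattern's basin of attraction settles onto that pattern ('recognition'); an input outside every learned basin does not, which is the generalization/discrimination question posed on slide 25.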

31 Hebb Rule. Empirical results – implicate cortical and hippocampal NMDA receptors – a 100-200 ms window for co-occurrence – presynaptic glutamate release plus postsynaptic depolarisation by back-propagation from the postsynaptic axon (removal of the Mg2+ block) → chemical events change the synapse

32 For simplicity… M = max firing rate – both pre- and postsynaptic cells must be firing above half maximum – a synapse changes to a fixed value k when modified – the synaptic change is irreversible – all pairs are symmetrically coupled
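These simplifications translate almost line-for-line into code. A sketch (the function name and array layout are illustrative; M, k, the half-maximum threshold, irreversibility, symmetry, and the ban on self-connections come from the slide):

    import numpy as np

    def hebb_update(W, rates, M=100.0, k=1.0):
        """Simplified Hebb rule: synapses between cell pairs that are both firing
        above half the maximum rate M jump to the fixed value k."""
        active = rates > M / 2.0               # both pre and post must exceed half maximum
        modify = np.outer(active, active)      # candidate pairs; symmetric by construction
        np.fill_diagonal(modify, False)        # pyramidal cells are never self-connected
        return np.where(modify, k, W)          # modified synapses become k; already-modified ones stay k (irreversible)

Calling hebb_update once with a single stored pattern corresponds to the 'one stimulus' case of slide 33; calling it repeatedly, once per pattern, corresponds to the 'multiple stimuli' case.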

33 Learning (matlab) – one stimulus – multiple stimuli

34 Pros and limitations of Lyapunov theory. Pros – more general stability analysis – basins of attraction – elegance and power. Limitations – no algorithm for finding U – U is not unique: each choice gives only a lower bound on the basin

35

