Presentation on theme: "Latching forward." — Presentation transcript:

1 Latching forward

2 The statistical/modular perspective
The Braitenberg model: "glocal" associative memory
N pyramidal cells
√N compartments, √N cells each
Apical synapses
Basal synapses

3 Rafi Malach said it already

4 Potts units with dilute connectivity
A simple semantic network (O'Kane & Treves, 1992): cortical modules, local attractor states (= S), global activity patterns, structured long-range connectivity.
Updated to remove the 'memory glass' problem (Fulvi Mari & Treves, 1998): sparse global patterns.
Reduced to a Potts model (Kropff & Treves, 2005): S+1 Potts states ("0" state included), sparse Potts patterns, dilute connectivity.
..but all cortical modules share the same organization…
p_c ∝ C S² !!   p_l ∝ S ?!?!

5 Potts version of the Hopfield model, with N units
Iddo Kanter (1988) Potts-glass models of neural networks. Phys Rev A 37.
With S states per unit: p_c ∝ N S (S−1)/2, and information per pattern I/p = N log₂ S
+ dilute connectivity (C connections per unit): p_c ≃ C S (S−1)/(4 ln S)
+ sparse coding (a fraction 1−a of units in the "0" state; Kropff & Treves, 2005): p_c ≃ C S (S−1)/(4a ln(S/a)), with I/p ≈ N a log₂(S/a)
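These scaling relations are easy to tabulate. A minimal sketch in Python: the proportionality constants are unknown, so the quoted expressions are evaluated as-is, and the numbers N = 1000, C = 150, S = 7, a = 0.25 are made up for illustration.

```python
import math

def pc_fully_connected(N, S):
    # Kanter (1988), fully connected Potts network: p_c ~ N S (S-1) / 2
    return N * S * (S - 1) / 2

def pc_dilute(C, S):
    # dilute connectivity, C connections per unit: p_c ~ C S (S-1) / (4 ln S)
    return C * S * (S - 1) / (4 * math.log(S))

def pc_sparse(C, S, a):
    # sparse coding, fraction 1-a of units in the "0" state
    # (Kropff & Treves, 2005): p_c ~ C S (S-1) / (4 a ln(S/a))
    return C * S * (S - 1) / (4 * a * math.log(S / a))

# illustrative (hypothetical) numbers
print(pc_fully_connected(1000, 7))   # fully connected
print(pc_dilute(150, 7))             # dilute
print(pc_sparse(150, 7, 0.25))       # dilute + sparse
```

Note that sparse coding raises the capacity relative to the plain dilute case, at fixed C and S.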

6 Simulations which include a model of neuronal fatigue
show that the Potts semantic network can hop from global attractor to global attractor: latching dynamics.
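A full Potts simulation is beyond a slide, but the mechanism, an active attractor destabilized by slow neuronal fatigue, can be caricatured with just two competing attractors. Everything below (gains, time constants, the sigmoid rate function) is an assumed toy parameterization, not the actual model of the slides.

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def simulate(steps=4000, tau_a=5.0, tau_w=60.0):
    """Two competing attractors a[0], a[1]; the active one slowly
    accumulates fatigue w, which eventually hands activity over to
    the other: a caricature of latching between global attractors."""
    a = [1.0, 0.0]   # attractor activations
    w = [0.0, 0.0]   # slow fatigue (adaptation) variables
    dominant = []
    for _ in range(steps):
        # self-excitation, fatigue, mutual inhibition, small baseline drive
        h = [1.5 * a[i] - 2.0 * w[i] - 1.0 * a[1 - i] + 0.2 for i in range(2)]
        for i in range(2):
            a[i] += (sigmoid(10.0 * h[i]) - a[i]) / tau_a   # fast activation
            w[i] += (a[i] - w[i]) / tau_w                    # slow fatigue
        dominant.append(0 if a[0] >= a[1] else 1)
    return dominant

dom = simulate()
switches = sum(1 for t in range(1, len(dom)) if dom[t] != dom[t - 1])
print("attractor switches:", switches)
```

With these parameters the network never settles: each attractor's own fatigue keeps pushing the dynamics on to the next one.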

7 Moshe, Tali et al. had seen something similar in monkey recordings

8 May latching dynamics model a train of thought?
(cited by Alistair Knott in his PhD Thesis, 1996)

9 Infinite recursion hypothesis by Hauser, Chomsky & Fitch

10 Herb Jaeger just said it
Embeddings:
..that John saw Peter help Mary make the children swim… N1V1N2V2N3V3N4V4
..the realization that the talk they attend has not yet finished makes them weep… N1N2N3V3V2V1 (palindrome, or stack memory)
..dat Jan Piet Marie de kinderen zag helpen laten zwemmen… N1N2N3N4V1V2V3V4 (interleaving, or queue memory)
Finite recursion, it seems…
"Language is therefore based on a recursive generative procedure that takes elementary word-like elements from some store, call it the lexicon, and applies repeatedly to yield structured expressions, without bound […] Notice that there is no room in this picture for any precursors to language – say a language-like system with only short sentences. There is no rationale for postulation of such a system: to go from seven-word sentences to the discrete infinity of human language requires emergence of the same recursive procedure as to go from zero to infinity, and there is of course no direct evidence for such 'protolanguages.'" (NC, May 14, 2008)
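The three dependency orders above can be made concrete in a few lines of Python (purely illustrative; the function names are mine):

```python
def english_tail(ns, vs):
    # English right-branching: N1 V1 N2 V2 ... each verb right after its noun
    out = []
    for n, v in zip(ns, vs):
        out += [n, v]
    return out

def center_embedding(ns, vs):
    # German-style nesting: N1 N2 N3 V3 V2 V1
    # last noun in, first verb out: a stack (LIFO), i.e. a palindrome
    return list(ns) + list(reversed(vs))

def cross_serial(ns, vs):
    # Dutch cross-serial: N1 N2 N3 V1 V2 V3
    # verbs in the same order as their nouns: a queue (FIFO), i.e. interleaving
    return list(ns) + list(vs)

ns = ["N1", "N2", "N3"]
vs = ["V1", "V2", "V3"]
print(english_tail(ns, vs))      # ['N1', 'V1', 'N2', 'V2', 'N3', 'V3']
print(center_embedding(ns, vs))  # ['N1', 'N2', 'N3', 'V3', 'V2', 'V1']
print(cross_serial(ns, vs))      # ['N1', 'N2', 'N3', 'V1', 'V2', 'V3']
```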

11 The fluctuation-dissipation theorem
applied to working memory: correlations among memories fluctuate; memory dissipates.
With a multivariate Gaussian distribution, P(c^μν) ≈ [1/√(2π)σ] exp[−(c^μν/σ)²/2].
The largest correlation among p patterns then scales as √2 σ √ln(p/√(2π)).
Max embedding depth: 3 or 4, assuming binding by synaptic facilitation; (× 2 ≈) digit span, 7 ± 2.
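The scaling of the largest correlation follows from Gaussian extreme-value statistics: the maximum of p samples from a zero-mean Gaussian of width σ is approximately √2 σ √ln(p/√(2π)). A quick numerical check of this (assumed) reading of the slide's formula:

```python
import math
import random

def predicted_max(p, sigma=1.0):
    # expected maximum of p samples from a zero-mean Gaussian of std sigma:
    # c_max ≈ sigma * sqrt(2 * ln(p / sqrt(2*pi)))
    return sigma * math.sqrt(2.0 * math.log(p / math.sqrt(2.0 * math.pi)))

random.seed(0)
p = 100_000
empirical = max(random.gauss(0.0, 1.0) for _ in range(p))
print(f"predicted {predicted_max(p):.2f}, empirical {empirical:.2f}")
```

The prediction grows only as √ln p, which is why the resulting depth stays small (a handful) even for large numbers of stored patterns.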

12 Latching dynamics, if transition probabilities are structured,
might be a neural model for infinite recursion (see the memory model by Herrmann, Ruppin & Usher, 1993; also Epstein, 2000; Burgess & Shallice, 1996; Wennekers; Gros; …)

13 Latching may be a neural model for infinite recursion only if
transition probabilities are structured, so that dynamics are neither random nor deterministic (a regime parameterized by λ, between the deterministic and random extremes).
Emilio Kropff has looked into that (EK & AT, Nat Comput, 2007).
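One simple proxy for "neither random nor deterministic" transitions (my illustration, not necessarily the measure used by Kropff & Treves) is the normalized entropy of the distribution of transitions out of an attractor:

```python
import math

def normalized_entropy(probs):
    """Shannon entropy of a transition distribution, normalized to [0, 1]:
    0 for a deterministic transition, 1 for a uniform (random) one."""
    n = len(probs)
    assert abs(sum(probs) - 1.0) < 1e-9
    h = -sum(p * math.log(p) for p in probs if p > 0.0)
    return h / math.log(n)

print(normalized_entropy([1.0, 0.0, 0.0, 0.0]))      # deterministic: 0.0
print(normalized_entropy([0.25, 0.25, 0.25, 0.25]))  # random: 1.0
print(normalized_entropy([0.7, 0.2, 0.05, 0.05]))    # structured: in between
```

A structured latching sequence would live at intermediate values: biased enough to carry information, variable enough to keep generating new sequences.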

14 The Model
Then, Eleonora Russo found how to describe latching dynamics analytically, by integrating a system of 28 differential equations.
We follow the dynamics of the network by means of the OVERLAP parameter, defined from:
the field applied to unit i in the direction of the kth state;
the activity of unit i in the kth state;
the overlap of the system with the μth pattern.
[Speaker note, translated from Italian: also add the analytic part of the various differential equations, and elaborate on the two time scales, the quenched average, etc.]
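The slide's equation images did not survive extraction; for reference, the overlap of a Potts network with a stored pattern is conventionally defined as follows (a reconstruction in the style of Kropff & Treves, 2005, not the slide's own formula):

```latex
% Overlap of the network with stored pattern \mu, where \sigma_i^k is the
% activity of unit i in state k, \xi_i^\mu the state of unit i in pattern \mu,
% and a the sparsity. (Reconstructed; the original equations were lost.)
m^{\mu} = \frac{1}{N a \left(1 - \frac{a}{S}\right)}
          \sum_{i=1}^{N} \sum_{k \neq 0}
          \left( \delta_{\xi_i^{\mu},\,k} - \frac{a}{S} \right) \sigma_i^{k}
```

The subtraction of a/S removes the chance contribution, so m^μ ranges from 0 (no relation to the pattern) to 1 (perfect retrieval).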

15 Latching transitions (with the Potts model)
A Simplified Case
HIGH LATCHING: history-dependent transitions between attractors with stronger correlations.
PATHOLOGICAL LATCHING: oscillations between pairs of closely overlapping attractors.
LOW LATCHING: transitions between weakly correlated attractors.
[Figure: patterns localized together are correlated, C ≈ a/S.]

16 No new latching types are found.
A Simplified Case. DYNAMICAL PARAMETERS: varying the speed of the change in the field of active states, no new latching types are found; in some range of this parameter, neither pathological nor low latching exists.

17 A Simplified Case
A first conclusion: three types of latching transitions are present, for different correlation levels:
transitions between weakly correlated attractors;
history-dependent transitions between attractors with a stronger correlation;
oscillations between pairs of closely overlapping attractors.
In some regions of the dynamical parameter space, not all three transition types are present.

18 A Complex Landscape
As the mean correlation increases, the latching series grows longer.

19 A Complex Landscape
With a small variation in the mean correlation of the patterns, the latching sequence becomes much longer.

20 Systematic simulations indicate a latching phase transition
at a critical number of patterns, p_l.

21 Back to the analysis, to confirm the crucial quantitative relationships,
e.g. that in a multi-factor coding model (with correlated patterns) p_c ∝ C S² ? and p_l ∝ S ? (see EK & AT, HFSP Journal, 2007)

22 [Figure: plot against % (C/N)]

23 How might a capacity for indefinite latching have evolved?
[Diagram: semantics and associative memory (AM); C and S grow; long-range connectivity increases relative to local connectivity.]
Storage capacity (max p to allow cued retrieval): p_c ∝ C S²
Latching onset (min p to ensure a recursive process): p_l ∝ S ?
A spontaneous transition to infinite recursion?

24 G Elston et al

25

26 Retrieval and latching appear to coexist
only above critical values of both C and S. Is the transition to FLN (the faculty of language in the narrow sense) a percolation phase transition?
[Phase diagram: regions with too many memories, or with too few latching transitions.]
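The coexistence condition follows directly from the two scaling laws: cued retrieval requires p < p_c ∝ C S², sustained latching requires p > p_l ∝ S, so a window exists only when p_l < p_c. A sketch with made-up proportionality constants k_c and k_l:

```python
def capacity(C, S, k_c=0.1):
    # storage capacity bound: p_c ~ k_c * C * S**2  (k_c is a made-up constant)
    return k_c * C * S ** 2

def latching_onset(S, k_l=20.0):
    # latching onset: p_l ~ k_l * S  (k_l is a made-up constant)
    return k_l * S

def coexistence_window(C, S):
    """Range of p where cued retrieval still works (p < p_c) and latching
    is already sustained (p > p_l); None if the window is empty."""
    lo, hi = latching_onset(S), capacity(C, S)
    return (lo, hi) if lo < hi else None

print(coexistence_window(10, 5))    # small C, S: no window
print(coexistence_window(200, 10))  # large C, S: retrieval and latching coexist
```

Since p_c grows as C S² while p_l grows only as S, the window opens abruptly once C and S pass critical values, consistent with the phase-transition picture on this slide.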

27 To go beyond statistics into syntax,
a binding operation is required, with working memory; differentiation implies multiple scaling relations:
"operators": self-organized, input-driven
"fillers": p_c ∝ C S², p_l ∝ S ?, n_c ≈ √2 σ √ln(p/√(2π))
(see Battaglia et al, Eliasmith et al, Fusi et al, Huyck, Hashimoto, ..)

28 BLISS: a toy language with both syntax and semantics
Medium term: to assess how any model can learn, one first needs a toy: a basic language including both syntax and semantics. BLISS (w/ E Kropff, A Grüning, M Katran, with mild guidance by G Longobardi).
Syntactic structure:
S → DP I' (singular/plural)
I' → I VP
I → (singular or plural form)
VP → (Neg) (AdvP) V'
V' → V DP | V S' | V PP
PP → Prep DP
S' → Compl S
DP → Det NP | PropN
NP → (AdjP) N''
N'' → (AdjP) N'
N' → N (PP) | N S'
Det → Art | Dem
+ semantic correlations
Lexicon:
N → boy | girl | cat | dog | tiger | jackal | horse | cow | meat | hay | milk | wood | meadow | stick | fork | bowl | cart | table | house || boys | girls | cats | dogs | tigers | jackals | horses | cows | stables | sticks | forks | bowls | carts | tables | houses
PropN → John | Mary || John and Mary
V → chases | feeds | sees | hears | walks | lives | eats | dies | kills | brings | pulls | is || chase | feed | see | hear | walk | live | eat | die | kill | bring | pull | are
Compl → that | whether
Prep → in | with | to | of | under
Neg → does not || do not (note: singular negation removes the singular inflection of the verb)
Art → the | a(an)
AdjP → red | blue | green | black | brown | white | yellow | slow | fast | rotten | fresh | cold | warm | hot
AdvP → slowly | rapidly | close | far
Dem → this | that || these | those

29 Hope for a simplified abstract formulation:
G. Longobardi et al. reduce 49 DP-internal parameters to just 4 parameter schemata.

30 BLISS: a first version implements only syntax
S1 → DP11 VP11 | DP12 VP12 (probability , 0.8)
VP11 → VR11 | Neg11 VR (probability , )
VP12 → VR12 | Neg12 VR (probability , )
VR11 → Vt11 DP | Vi11 | Vi11 PPV | Vtd11 SRd | Vti11 SRi | Vtdtv11 DP PPT (probability 0.25, 0.05, 0.15, 0.2, 0.2, 0.15)
VR12 → Vt12 DP | Vi12 | Vi12 PPV | Vtd12 SRd | Vti12 SRi | Vtdtv12 DP PPT
PP → Prep DP11 | Prep DP12 (probability 0.5, 0.5)
PPV → PrepV DP11 | PrepV DP12 (probability 0.5, 0.5)
PPT → PrepT DP11 | PrepT DP12 (probability 0.5, 0.5)
SRd → Conjd S1 (probability 1)
SRi → Conji S1 (probability 1)
DP → DP11 | DP (probability , 0.25)
DP11 → Det11 NP11 | PropN11 (probability , 0.2)
DP12 → Det12 NP12 | NP12 | PropN12 (probability , 0.5, 0.005)
NP11 → N11 | AdjP N11 | N11 PP | AdjP N11 PP (probability 0.4, 0.2, 0.2, 0.2)
NP12 → N12 | AdjP N12 | N12 PP | AdjP N12 PP (probability 0.4, 0.2, 0.2, 0.2)
Det11 → Art11 | Dem11 (probability , 0.17)
Det12 → Art12 | Dem12 (probability , 0.17)
N11 → man | woman | boy | girl | child | book | friend | mother | table | house | paper | letter | teacher | parent | window | wife | bed | cow | tree | garden | hotel | lady | cat | dog | horse | brother | husband | daughter | meat | milk | wood | fork | bowl | cart | farm | kitchen

31 Sample BLISS output
the red paper sits with hot cats on the strong windows
wonderful friends don't sit for a short kitchen
the hot houses on cows don't find John
teachers find a horse
those rotten pens in the cow live in a meat
the dogs don't wonder whether pens of the meat of that girl of the house walk with long farms
strong mothers give tables in that black meat to the red carts
good boys hear Phoebe
girls die for a strange dog
kitchens with Mary give great husbands to the good houses
these papers in mothers send the mother on the bed to the dogs in houses
short dogs on farms of the hotels give Phoebe to the parents
a cold lady with the women wonders whether Joe returns in a strange woman with the friends
teachers of the man know whether these trees sit in the wonderful cat with tables
the husbands come in friends
girls with John find Joe
Joe uses a old fork of the great beds of the parents in the strange bed of children of the garden
John wonders whether the parents on trees with the bowls don't give a cart
strange wives run in tables
John sends Mary to Phoebe
men send the hot dog to the horse

32 How much is syntactic dynamics constrained by creativity?
We ask that with BLISS… and we take the same measure from latching Potts nets.
[Slide residue: "memory (not working)"; carabinieri image]

