Slide 1: Syntactic Language Processing through Hierarchical Heteroclinic Networks
Workshop on Heteroclinic Dynamics in Neuroscience, University of Nice, December 18, 2015
Peter beim Graben, Bernstein Center for Computational Neuroscience, Humboldt-Universität zu Berlin, peter.beim.graben@hu-berlin.de
Slide 2: Outline
- garden path sentences
- limited repair parsing
- sequence generation in neural networks
- limited repair through bifurcations
- conclusions
- outlook
Slides 3-10: the stimulus sentence, presented word by word after a fixation cross: + / the / horse / raced / past / the / barn / fell.
Slide 11: Garden Path Sentences (Bever, 1970)
Slide 12: Garden Path Theory (Frazier & Rayner, 1982; Frazier, 1987): late closure
Slide 13: Diagnosis (Fodor & Inoue, 1994)
the horse raced past the barn fell. → the horse raced past. + the barn fell. ? (symptom)
lexicon: raced = (1) finite verb, past tense; raced = (2) non-finite verb, participle (past perfect)
Slide 14: Experimental Findings
- self-paced reading: slowing down
- eye-tracking: regressions, longer fixations
- event-related brain potentials: P600
(beim Graben et al., 2008; Schwappach et al., 2015)
Slide 15: Garden Path Variants
- heavy garden paths: the horse raced past the barn fell. (Bever, 1970)
- mild garden paths: Tad knows Shaq is tall. (Lewis, 1998)
Slide 16: Limited Repair Parsing (Lewis, 1998)
Tad knows Shaq. (default) → Tad knows Shaq is tall. (operation: snip)
Slide 17: Limited Repair Parsing (Lewis, 1998)
Tad knows Shaq. (default) → Tad knows Shaq is tall. (operation: link)
Slide 18: Decision Space (Lewis, 1998; Hale, 2011)
[figure: decision space over parser states 1-24 for "Tad knows Shaq" vs. "Tad knows Shaq is tall", with paths labeled garden path, limited repair, and reduced relative]
Slide 19: State Descriptions
parser states are "vertices" in representation space; attraction / repulsion
Slide 20: Dynamics
identify parser states with equilibrium points in representation space
fixed point attractor vs. saddle; stable directions and unstable directions
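The attractor/saddle distinction on this slide can be checked numerically from the eigenvalues of the Jacobian at an equilibrium. A minimal sketch with my own toy matrices, not the talk's model:

```python
import numpy as np

# Classify an equilibrium of the linear system dx/dt = A @ x by the
# eigenvalues of A:
#   all real parts negative -> fixed point attractor (all directions stable)
#   mixed signs             -> saddle (stable and unstable directions)
def classify_equilibrium(A):
    re = np.real(np.linalg.eigvals(A))
    if np.all(re < 0):
        return "attractor"
    if np.any(re < 0) and np.any(re > 0):
        return "saddle"
    return "repeller/other"

print(classify_equilibrium(np.array([[-1.0, 0.0], [0.0, -2.0]])))  # attractor
print(classify_equilibrium(np.array([[-1.0, 0.0], [0.0,  0.5]])))  # saddle
```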
Slide 21: Garden Path Dynamics
unwanted attractor
Slide 22: Repair by Bifurcation (beim Graben et al., 2004)
unwanted attractor
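Repair by bifurcation means a control parameter is moved until the unwanted attractor loses stability. A hedged one-dimensional illustration (my own example, not the network on the slide): in dx/dt = x(a - x), the equilibrium x = a is an attractor while a > 0 and is destabilized when a decays through zero (a transcritical bifurcation).

```python
# Linearization of dx/dt = x * (a - x) at the equilibrium x = a:
# d/dx [x * (a - x)] = a - 2x, which evaluates to -a at x = a.
def jacobian_at_equilibrium(a):
    return -a

print(jacobian_at_equilibrium(0.5))   # negative: x = a is an attractor
print(jacobian_at_equilibrium(-0.5))  # positive: the attractor is destabilized
```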
Slide 23: Neural Network Architecture
hierarchy of 3 levels of generalized Lotka-Volterra systems:
- 1st level: "localist" representation of 24 parser states
- 2nd level: 3 parsing strategies
- 3rd level: 3 attention parameters
sequence generation through winnerless competition at levels 1 and 2; the garden path appears as an undesired attractor; repair through a bifurcation elicited by decay of attention
(Afraimovich et al., 2004; Fukai & Tanaka, 1997; Kiebel et al., 2009; Haken, 1991)
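Sequence generation by winnerless competition can be sketched in a few lines. This is a minimal illustration with made-up parameters, not the talk's fitted model: a generalized Lotka-Volterra system dx_i/dt = x_i (sigma_i - sum_j rho_ij x_j) whose asymmetric inhibition matrix produces a heteroclinic cycle, so the dominant ("winning") state keeps rotating instead of settling, which is what generates a sequence of parser states.

```python
import numpy as np

def glv_step(x, sigma, rho, dt=0.01):
    """One Euler step of the generalized Lotka-Volterra dynamics."""
    return x + dt * x * (sigma - rho @ x)

sigma = np.ones(3)
rho = np.array([[1.0, 2.0, 0.5],   # state 0 strongly inhibited by state 1
                [0.5, 1.0, 2.0],   # state 1 strongly inhibited by state 2
                [2.0, 0.5, 1.0]])  # state 2 strongly inhibited by state 0
x = np.array([0.9, 0.05, 0.05])

winners = []
for t in range(20000):
    x = glv_step(x, sigma, rho)
    if t % 2000 == 0:
        winners.append(int(np.argmax(x)))
print(winners)  # the dominant state rotates: winnerless competition
```

Stronger asymmetry (here 2.0 vs. 0.5) makes each saddle attract along one direction and expel along another, chaining the states into a cycle.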
Slide 24: Neural Network Architecture
[figure: the three-level hierarchy of units 1-30]
(Afraimovich et al., 2004; Fukai & Tanaka, 1997; Kiebel et al., 2009; Haken, 1991)
Slides 25-30: Results (panels: garden path, limited repair, reduced relative), shown in stages: stabilization; symptom (*) and destabilization; transition; stabilization; destabilization; transition.
Slide 31: Dynamical Automata
symbologram representation of automata states; order parameter expansion
open problems: determinism; every pair of rationals is a representational state; infinite-dimensionality
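The symbologram idea encodes symbol sequences as b-adic rationals, so an automaton configuration (stack, input) becomes a point in the unit square, and every pair of rationals is a representational state. A hedged sketch with a made-up alphabet and encoding function of my own, not necessarily the talk's exact construction:

```python
from fractions import Fraction

def encode(seq, alphabet):
    """Encode a symbol sequence over an alphabet of size b as a b-adic
    rational in [0, 1): symbol at depth i contributes index / b**i."""
    b = len(alphabet)
    x = Fraction(0)
    for i, s in enumerate(seq, start=1):
        x += Fraction(alphabet.index(s), b ** i)
    return x

alphabet = ["S", "VP", "NP"]
print(encode(["S", "VP"], alphabet))  # 0*(1/3) + 1*(1/9) = 1/9
```

Encoding the stack on one axis and the remaining input on the other yields the two-dimensional symbologram representation.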
Slide 32: Conclusions
- syntactic language parsing is realized through sequential winnerless competition in neural population networks
- processing strategies and their transitions are realized through control parameters provided by higher levels in a network hierarchy
- processing strategies are stabilized and destabilized by attention parameters provided by even higher levels in a network hierarchy
- limited repair of syntactic garden paths is realized through decaying attention and subsequent bifurcations
Slide 33: Outlook
- neurophysiological evidence?
- neural population models?
- continuous-time neural dynamics?
- automata-theoretic description of limited repair parsing: nested stack automata and index grammars?
Slide 34: Thank you for your attention!
Funding: DFG Heisenberg Fellowship. Acknowledgements.
Slide 35: Context-Free Grammars (beim Graben et al., 2004)
Tad knows Shaq. (default)
Slide 36: Context-Free Grammars (beim Graben et al., 2004)
Tad knows Shaq is tall.
Slide 37: Interactive Left-Corner Parsing: garden path (beim Graben et al., 2008; beim Graben & Potthast, 2012)

time | stack                   | wm         | label | operation
1    | Tad                     | knows      | (1)   | scan(Shaq)
2    | Tad                     | knows Shaq | (2)   | project
3    | S[VP1]                  | knows Shaq | (3)   | shift(V1)
4    | S[VP1] knows            | Shaq       | (4)   | scan(is)
5    | S[VP1] knows            | Shaq is    | (9)   | project
6    | S[VP1] VP1[NP2]         | Shaq is    | (19)  | shift(NP2)
7    | S[VP1] VP1[NP2] Shaq    | is         | (20)  | scan(tall)
8    | S[VP1] VP1[NP2] Shaq    | is tall    | (21)  | shift(V2)
9    | S[VP1] VP1[NP2] Shaq is | tall       | (22)  | fail
10   | S[VP1] VP1[NP2] Shaq is | tall       |       |
Slide 38: Interactive Left-Corner Parsing: limited repair (beim Graben et al., 2008)

time | stack                   | wm           | label | operation
10   | S[VP1] VP1[NP2] Shaq is | tall         | (22)  | snip
11   | S[VP1] VP1[NP2]         | Shaq is tall | (23)  | repair [NP2] → [CP]
12   | S[VP1] VP1[CP]          | Shaq is tall | (24)  | shift(NP2)
13   | S[VP1] VP1[CP] Shaq     | is tall      | (12)  | cont.
Slide 39: Interactive Left-Corner Parsing: reduced relative

time | stack                           | wm      | label | operation
13   | S[VP1] VP1[CP] Shaq             | is tall | (12)  | project
14   | S[VP1] VP1[CP] CP[VP2]          | is tall | (13)  | shift(V2)
15   | S[VP1] VP1[CP] CP[VP2] is       | tall    | (14)  | project(4)
16   | S[VP1] VP1[CP] CP[VP2] VP2[A]   | tall    | (15)  | shift(A)
17   | S[VP1] VP1[CP] CP[VP2] VP2[A] A | ε       | (16)  | complete
18   | S[VP1] VP1[CP] CP[VP2] VP2      | ε       | (17)  | complete
19   | S[VP1] VP1[CP] CP               | ε       | (18)  | complete
20   | S[VP1] VP1                      | ε       | (7)   | complete
21   | S                               | ε       | (8)   | accept
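The traces above act on a configuration made of a category stack plus a working memory of words, with snip and repair as the limited-repair operations. A minimal sketch of such a data structure (my own naming and simplifications, not beim Graben's implementation):

```python
class Config:
    """A parser configuration: a stack of predicted categories plus a
    working memory (wm) of scanned material."""
    def __init__(self):
        self.stack, self.wm = [], []

    def scan(self, word):        # read the next word into working memory
        self.wm.append(word)

    def shift(self, category):   # move a recognized constituent onto the stack
        self.stack.append(category)

    def snip(self):              # detach the topmost stack item (limited repair)
        return self.stack.pop()

    def repair(self, old, new):  # relabel a prediction, e.g. [NP2] -> [CP]
        self.stack[self.stack.index(old)] = new

c = Config()
c.shift("S[VP1]"); c.shift("VP1[NP2]")
c.scan("Shaq"); c.scan("is"); c.scan("tall")
c.repair("VP1[NP2]", "VP1[CP]")
print(c.stack)  # ['S[VP1]', 'VP1[CP]']
```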