2
Jakobson's Grand Unified Theory of Linguistic Cognition
Paul Smolensky, Cognitive Science Department, Johns Hopkins University
With: Elliott Moreton, Karen Arnold, Donald Mathis, Melanie Soderstrom, Géraldine Legendre, Alan Prince, Peter Jusczyk, Suzanne Stevenson
3
Grammar and Cognition
1. What is the system of knowledge?
2. How does this system of knowledge arise in the mind/brain?
3. How is this knowledge put to use?
4. What are the physical mechanisms that serve as the material basis for this system of knowledge and for the use of this knowledge?
(Chomsky '88, p. 3)
4
Advertisement. The complete story, forthcoming (2003) from Blackwell: The Harmonic Mind: From Neural Computation to Optimality-Theoretic Grammar (Smolensky & Legendre).
5
Jakobson's Program: A Grand Unified Theory for the cognitive science of language is enabled by Markedness: Avoid α
① Structure: alternations eliminate α; typology: inventories lack α
② Acquisition: α is acquired late
③ Processing: α is processed poorly
④ Neural: brain damage most easily disrupts α
Formalize through OT?
6
Structure | Acquisition | Use | Neural Realization
Theoretical. OT (Prince & Smolensky '91, '93):
– Construct formal grammars directly from markedness principles
– General formalism/framework for grammars: phonology, syntax, semantics; GB/LFG/…
– Strongly universalist: inherent typology
Empirical. OT:
– Allows completely formal markedness-based explanation of highly complex data
7
Structure | Acquisition | Use | Neural Realization
Theoretical. Formal structure enables OT-general learning algorithms:
– Constraint Demotion: provably correct and efficient (when part of a general decomposition of the grammar learning problem) – Tesar & Smolensky 1993, …, 2000; Tesar 1995 et seq.
– Gradual Learning Algorithm – Boersma 1998 et seq.
→ Initial state
Empirical
– Initial state predictions explored through behavioral experiments with infants
8
Structure | Acquisition | Use | Neural Realization
Theoretical
– Theorems regarding the computational complexity of algorithms for processing with OT grammars: Tesar '94 et seq.; Ellison '94; Eisner '97 et seq.; Frank & Satta '98; Karttunen '98
Empirical (with Suzanne Stevenson)
– Typical sentence processing theory: heuristic constraints
– OT: output for every input; enables incremental (word-by-word) processing
– Empirical results concerning human sentence processing difficulties can be explained with OT grammars employing independently motivated syntactic constraints
– The competence theory [OT grammar] is the performance theory [human parsing heuristics]
9
Structure | Acquisition | Use | Neural Realization
Theoretical. OT derives from the theory of abstract neural (connectionist) networks
– via Harmonic Grammar (Legendre, Miyata, Smolensky '90)
For moderate complexity, we now have general formalisms for realizing
– complex symbol structures as distributed patterns of activity over abstract neurons
– structure-sensitive constraints/rules as distributed patterns of strengths of abstract synaptic connections
– optimization of Harmony
Empirical. Construction of a miniature, concrete LAD
10
Program
Structure (OT): constructs formal grammars directly from markedness principles; strongly universalist: inherent typology; allows completely formal markedness-based explanation of highly complex data.
Acquisition: initial state predictions explored through behavioral experiments with infants.
Neural Realization: construction of a miniature, concrete LAD.
11
The Great Dialectic
Phonological representations serve two masters, locked in conflict:
– Lexical interface, /underlying form/. Recoverability: 'match this invariant form' → FAITHFULNESS
– Phonetic interface, [surface form]. Often: 'minimize effort (motoric & cognitive)'; 'maximize discriminability' → MARKEDNESS
12
OT from Markedness Theory
MARKEDNESS constraints: *α ('No α')
FAITHFULNESS constraints:
– Fα demands that /input/ → [output] mappings leave α unchanged (McCarthy & Prince '95)
– Fα controls when α is avoided (and how)
Interaction of violable constraints: Ranking
– α is avoided when *α ≫ Fα
– α is tolerated when Fα ≫ *α
– M1 ≫ M2: combines multiple markedness dimensions
13
OT from Markedness Theory
MARKEDNESS constraints: *α
FAITHFULNESS constraints: Fα
Interaction of violable constraints: Ranking
– α is avoided when *α ≫ Fα
– α is tolerated when Fα ≫ *α
– M1 ≫ M2: combines multiple markedness dimensions
Typology: All cross-linguistic variation results from differences in ranking – in how the dialectic is resolved (and in how multiple markedness dimensions are combined)
14
OT from Markedness Theory
MARKEDNESS constraints; FAITHFULNESS constraints
Interaction of violable constraints: Ranking
Typology: All cross-linguistic variation results from differences in ranking – in resolution of the dialectic
Harmony = MARKEDNESS + FAITHFULNESS
– A formally viable successor to Minimize Markedness is OT's Maximize Harmony (among competitors)
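To make 'Maximize Harmony among competitors' under strict domination concrete, here is a minimal sketch (not from the presentation; the toy constraint definitions, candidate strings, and input /da/ are hypothetical): candidates are compared lexicographically on their violation counts under a ranking, so ranking *α above Fα makes α avoided, while the reverse ranking makes it tolerated.

```python
# Minimal sketch of OT evaluation under strict domination (hypothetical toy constraints).

def violations(candidate, ranked_constraints):
    """Violation profile: violation counts per constraint, in ranking order."""
    return tuple(c(candidate) for c in ranked_constraints)

def optimal(candidates, ranked_constraints):
    """Maximize Harmony: the winner has the lexicographically smallest profile,
    i.e., it does best on the highest-ranked constraint where candidates differ."""
    return min(candidates, key=lambda cand: violations(cand, ranked_constraints))

# *alpha: each occurrence of the marked element "a" is one violation.
star_alpha = lambda cand: cand.count("a")
# F-alpha: each segment changed relative to the (hypothetical) input /da/ is one violation.
faith_alpha = lambda cand: sum(x != y for x, y in zip(cand, "da"))

print(optimal(["da", "di"], [star_alpha, faith_alpha]))  # *alpha >> F-alpha: 'di' (alpha avoided)
print(optimal(["da", "di"], [faith_alpha, star_alpha]))  # F-alpha >> *alpha: 'da' (alpha tolerated)
```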
15
Structure: Explanatory goals achieved by OT
– Individual grammars are literally and formally constructed directly from universal markedness principles
– Inherent Typology: within the analysis of phenomenon Φ in language L is inherent a typology of Φ across all languages
16
Program
Structure (OT): constructs formal grammars directly from markedness principles; strongly universalist: inherent typology; allows completely formal markedness-based explanation of highly complex data (Friday).
Acquisition: initial state predictions explored through behavioral experiments with infants.
Neural Realization: construction of a miniature, concrete LAD.
17
Structure: Summary
– OT builds formal grammars directly from markedness: MARKEDNESS, with FAITHFULNESS
– Friday: inventories consistent with markedness relations are formally the result of OT with local conjunction
– Even highly complex patterns can be explained purely with simple markedness constraints: all complexity is in the constraints' interaction through ranking and conjunction: Lango ATR vowel harmony
18
Program
Structure (OT): constructs formal grammars directly from markedness principles; strongly universalist: inherent typology; allows completely formal markedness-based explanation of highly complex data.
Acquisition: initial state predictions explored through behavioral experiments with infants.
Neural Realization: construction of a miniature, concrete LAD.
19
Nativism I: Learnability
Learning algorithm
– Provably correct and efficient (under strong assumptions)
– Sources: Tesar 1995 et seq.; Tesar & Smolensky 1993, …, 2000
– If you hear A when you expected to hear E, increase the Harmony of A above that of E by minimally demoting each constraint violated by A below a constraint violated by E
20
Constraint Demotion Learning: input /in + possible/
Candidate E, inpossible: violates Mark (NPA)
Candidate A, impossible: violates Faith
☹ ☞ E: the learner's current grammar wrongly selects E
☺ ☞ A: after demoting Faith below Mark (NPA), the grammar correctly selects A
If you hear A when you expected to hear E, increase the Harmony of A above that of E by minimally demoting each constraint violated by A below a constraint violated by E.
Correctly handles a difficult case: multiple violations in E.
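The demotion step described on this slide can be sketched as follows (a minimal sketch, not the authors' code; the stratified-ranking representation, the mark-cancellation detail, and the constraint names are assumptions made for the illustration):

```python
# Minimal sketch of one Constraint Demotion step (stratum 0 = highest ranked).

def constraint_demotion(ranking, winner_viols, loser_viols):
    """A = winner (the form actually heard), E = loser (the form the current grammar
    expected). Demote each constraint violated by A below the highest-ranked
    constraint violated by E, after cancelling marks shared by both candidates."""
    winner_marks = {c for c in winner_viols if winner_viols[c] > loser_viols.get(c, 0)}
    loser_marks = {c for c in loser_viols if loser_viols[c] > winner_viols.get(c, 0)}
    if not loser_marks:
        return ranking
    target = min(ranking[c] for c in loser_marks)  # highest-ranked loser-violated constraint
    for c in winner_marks:
        if ranking[c] <= target:
            ranking[c] = target + 1                # minimal demotion: just below the target
    return ranking

# The /in + possible/ case: A = "impossible" (violates Faith), E = "inpossible" (violates Mark).
ranking = {"Faith": 0, "Mark(NPA)": 0}
print(constraint_demotion(ranking, {"Faith": 1}, {"Mark(NPA)": 1}))
# -> {'Faith': 1, 'Mark(NPA)': 0}, i.e. Mark(NPA) >> Faith
```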
21
Nativism I: Learnability
M ≫ F is learnable with /in+possible/ → impossible
– 'not' = in- except when followed by …
– the 'exception that proves the rule'; M = NPA
M ≫ F is not learnable from data if there are no 'exceptions' (alternations) of this sort, e.g., if the lexicon produces only inputs with mp, never np: then M and F never conflict, and there is no evidence for their ranking.
Thus M ≫ F must hold in the initial state, ℌ₀.
22
The Initial State
OT-general: MARKEDNESS ≫ FAITHFULNESS
– Learnability demands it (Richness of the Base) (Alan Prince, p.c., '93; Smolensky '96a)
– Child production: restricted to the unmarked
– Child comprehension: not so restricted (Smolensky '96b)
23
Nativism II: Experimental Test
Collaborators: Peter Jusczyk, Theresa Allocco
Language Acquisition (2002)
24
Nativism II: Experimental Test
Linking hypothesis: more harmonic phonological stimuli ⇒ longer listening time
More harmonic:
– satisfying M ≻ violating M (*M), when candidates are equal on F
– satisfying F ≻ violating F (*F), when candidates are equal on M
– when one must choose which to satisfy, it is more harmonic to satisfy M: M ≫ F
M = Nasal Place Assimilation (NPA)
25
Experimental Paradigm
Headturn Preference Procedure (Kemler Nelson et al. '95; Jusczyk '97); X / Y / XY paradigm (P. Jusczyk): a highly general paradigm.
Stimuli of the form X…Y…XY, e.g. um…b…umb, un…b…umb, um…b…iŋgu, iŋ…gu…iŋgu vs. iŋ…gu…umb.
Main result: p = .006.
26
4.5 Months (NPA)
Higher Harmony: um…ber…umber; Lower Harmony: um…ber…iŋgu
p = .006 (11/16)
27
4.5 Months (NPA)
Higher Harmony: um…ber…umber; Lower Harmony: un…ber…unber
p = .044 (11/16)
28
un…ber…umber: satisfies Markedness, violates Faithfulness (*Faithfulness)
un…ber…unber: violates Markedness (*Markedness), satisfies Faithfulness
Which has higher Harmony? ???
29
4.5 Months (NPA)
Higher Harmony: un…ber…umber; Lower Harmony: un…ber…unber
p = .001 (12/16)
30
Program
Structure (OT): constructs formal grammars directly from markedness principles; strongly universalist: inherent typology; allows completely formal markedness-based explanation of highly complex data.
Acquisition: initial state predictions explored through behavioral experiments with infants.
Neural Realization: construction of a miniature, concrete LAD.
31
The question The nativist hypothesis, central to generative linguistic theory: Grammatical principles respected by all human languages are encoded in the genome. Questions: –Evolutionary theory: How could this happen? –Empirical question: Did this happen? –Today: What — concretely — could it mean for a genome to encode innate knowledge of universal grammar?
32
UGenomics The game: Take a first shot at a concrete example of a genetic encoding of UG in a Language Acquisition Device ¿ Proteins ⇝ Universal grammatical principles ? Time to willingly suspend disbelief …
33
UGenomics The game: Take a first shot at a concrete example of a genetic encoding of UG in a Language Acquisition Device ¿ Proteins ⇝ Universal grammatical principles ? Case study: Basic CV Syllable Theory (Prince & Smolensky ’93) Innovation: Introduce a new level, an ‘abstract genome’ notion parallel to [and encoding] ‘abstract neural network’
34
Approach: Multiple Levels of Encoding
[Diagram relating Grammar (Innate Constraints), Abstract Neural Network, Abstract Genome, Biological Neural Network, and Biological Genome; one type of link = 'A instantiates B', the other = 'A encodes B'.]
35
UGenome for CV Theory
Three levels
– Abstract symbolic: Basic CV Theory
– Abstract neural: CVNet
– Abstract genomic: CVGenome
36
UGenomics: Symbolic Level
Three levels
– Abstract symbolic: Basic CV Theory
– Abstract neural: CVNet
– Abstract genomic: CVGenome
37
Approach: Multiple Levels of Encoding
[Diagram relating Grammar (Innate Constraints), Abstract Neural Network, Abstract Genome, Biological Neural Network, and Biological Genome; one type of link = 'A instantiates B', the other = 'A encodes B'.]
38
Basic syllabification: Function
Basic CV Syllable Structure Theory
– 'Basic': no more than one segment per syllable position: .(C)V(C).
ƒ: /underlying form/ → [surface form]
/CVCC/ → [.CV.CVC.] (the final syllable's V is epenthetic)
/pæd+d/ → [pædəd]
Correspondence Theory – McCarthy & Prince 1995 ('M&P')
/C₁V₂C₃C₄/ → [.C₁V₂.C₃VC₄] (the inserted V has no input correspondent)
39
Why basic CV syllabification?
ƒ: underlying → surface linguistic forms
– Forms simple but combinatorially productive
– Well-known universals; typical typology
– Mini-component of real natural language grammars
– A (perhaps the) canonical model of universal grammar in OT
40
Syllabification: Constraints (Con)
PARSE: every element in the input corresponds to an element in the output
ONSET: no V without a preceding C
etc.
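As a concrete illustration of how such a constraint system picks a syllabification, here is a minimal sketch (not the presentation's implementation; the extra constraints NOCODA and FILL, the candidate encoding, and the particular ranking are assumptions for the example) of evaluating candidate parses of /CVCC/:

```python
# Minimal sketch of OT evaluation for Basic CV syllable theory (toy representation).

def onset(cand):   return sum(1 for syl in cand["syllables"] if syl.startswith("V"))
def nocoda(cand):  return sum(1 for syl in cand["syllables"] if syl.endswith("C"))
def parse(cand):   return cand["unparsed"]     # input segments left unsyllabified
def fill(cand):    return cand["epenthetic"]   # output positions with no input correspondent

def optimal(candidates, ranking):
    # Strict domination = lexicographic comparison of violation profiles.
    return min(candidates, key=lambda c: tuple(con(c) for con in ranking))

# Candidate parses of /CVCC/ (cf. the mapping /CVCC/ -> [.CV.CVC.] above):
candidates = [
    {"name": ".CV.CVC.", "syllables": ["CV", "CVC"], "unparsed": 0, "epenthetic": 1},
    {"name": ".CVC.<C>", "syllables": ["CVC"],       "unparsed": 1, "epenthetic": 0},
    {"name": ".CV.<CC>", "syllables": ["CV"],        "unparsed": 2, "epenthetic": 0},
]

ranking = [parse, fill, onset, nocoda]        # one possible ranking: PARSE >> FILL >> ONSET >> NOCODA
print(optimal(candidates, ranking)["name"])   # -> ".CV.CVC."
```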
41
UGenomics: Neural Level
Three levels
– Abstract symbolic: Basic CV Theory
– Abstract neural: CVNet
– Abstract genomic: CVGenome
42
Approach: Multiple Levels of Encoding
[Diagram relating Grammar (Innate Constraints), Abstract Neural Network, Abstract Genome, Biological Neural Network, and Biological Genome; one type of link = 'A instantiates B', the other = 'A encodes B'.]
43
CVNet Architecture
[Network diagram for /C₁C₂/ → [C₁VC₂]: layers of C and V units for input and output, with correspondence indices '1', '2'.]
44
Connection substructure
– Local (fixed, genetically determined): the content of constraint i, i.e. its pattern of connection coefficients
– Global (variable during learning): the strength sᵢ of constraint i
Network weight: W = Σᵢ sᵢ Wᵢ
Network input: ι_Ψ = Σ_Φ W_ΨΦ a_Φ
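A minimal numerical sketch of this substructure (the matrices, strengths, and network size below are hypothetical; only the superposition W = Σᵢ sᵢ Wᵢ and the net input ι = W a come from the slide):

```python
import numpy as np

def network_weights(coefficient_matrices, strengths):
    """W = sum_i s_i * W_i: each constraint contributes its fixed coefficient
    pattern W_i (genetically determined) scaled by a learned strength s_i."""
    return sum(s * W_i for s, W_i in zip(strengths, coefficient_matrices))

def net_input(W, a):
    """iota = W a: input delivered to each unit given the current activations a."""
    return W @ a

# Two toy constraints over a 3-unit network (hypothetical coefficient patterns):
W_1 = np.array([[0, 2, 2], [2, 0, 2], [2, 2, 0]], dtype=float)
W_2 = np.array([[0, -1, 0], [-1, 0, 0], [0, 0, 0]], dtype=float)
s = [1.5, 3.0]                      # constraint strengths, adjusted during learning

W = network_weights([W_1, W_2], s)
print(net_input(W, np.array([1.0, 0.0, 1.0])))
```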
45
PARSE
[Network diagram: connection coefficients among C/V input, output, and correspondence units.] All connection coefficients are +2.
46
ONSET
[Network diagram: connections between C and V units.] All connection coefficients are 1.
47
Crucial Open Question (Truth in Advertising) Relation between strict domination and neural networks?
48
CVNet Dynamics
Boltzmann machine / Harmony network
– Hinton & Sejnowski '83 et seq.; Smolensky '83 et seq.
– Stochastic activation-spreading algorithm: higher Harmony → more probable
– CVNet innovation: connections realize fixed symbol-level constraints with variable strengths
– Learning: modification of the Boltzmann machine algorithm to the new architecture
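For concreteness, a minimal sketch of stochastic Boltzmann/Harmony dynamics of this general kind (this is generic Boltzmann-machine updating, not CVNet itself; the weight matrix, temperature, and unit count are hypothetical):

```python
import numpy as np

def harmony(W, a):
    """Harmony of a binary activation vector: H(a) = 1/2 a^T W a."""
    return 0.5 * a @ W @ a

def stochastic_sweep(W, a, rng, T=1.0):
    """One sweep of stochastic updates: each unit turns on with probability
    sigmoid(net_input / T), so higher-Harmony configurations are more probable."""
    a = a.copy()
    for k in range(len(a)):
        net = W[k] @ a - W[k, k] * a[k]          # input from the other units
        p_on = 1.0 / (1.0 + np.exp(-net / T))
        a[k] = 1.0 if rng.random() < p_on else 0.0
    return a

rng = np.random.default_rng(0)
W = np.array([[0, 2, -1], [2, 0, 1], [-1, 1, 0]], dtype=float)   # toy symmetric weights
a = np.zeros(3)
for _ in range(10):
    a = stochastic_sweep(W, a, rng)
print(a, harmony(W, a))
```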
49
Learning Behavior
A simplified system can be solved analytically.
The learning algorithm turns out to approximate Δsᵢ ∝ [# violations of constraint i], accumulated with opposite signs in the two training phases (P⁺ and P⁻; see the learning slide below).
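A minimal sketch of what such a strength update could look like (an assumption: this follows the standard two-phase Boltzmann-machine pattern, with violation counts playing the role of co-activation statistics; the counts and the learning rate are hypothetical):

```python
def update_strengths(strengths, viols_clamped, viols_free, lr=0.1):
    """Two-phase update sketch: a constraint violated more often when the network
    runs free (P-) than when the correct output is clamped (P+) gets strengthened,
    making those free-running outputs less probable."""
    return [s + lr * (v_free - v_clamp)
            for s, v_clamp, v_free in zip(strengths, viols_clamped, viols_free)]

# Toy counts: constraint 0 is violated 3 times in the free phase, 0 times when clamped.
print(update_strengths([1.0, 1.0], viols_clamped=[0, 2], viols_free=[3, 2]))
# -> [1.3, 1.0]
```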
50
UGenomics: Genome Level
Three levels
– Abstract symbolic: Basic CV Theory
– Abstract neural: CVNet
– Abstract genomic: CVGenome
51
Approach: Multiple Levels of Encoding
[Diagram relating Grammar (Innate Constraints), Abstract Neural Network, Abstract Genome, Biological Neural Network, and Biological Genome; one type of link = 'A instantiates B', the other = 'A encodes B'.]
52
Connectivity geometry
Assume 3-d grid geometry. [Diagram: C and V unit layers on the grid, with axes labeled 'E', 'N', 'back'.]
53
ONSET connectivity
[Diagram: projections among C, V, and x₀ units, each specified by direction, extent, and target. x₀ segment: S_S V_O; N_S x₀. V_O segment: N&S_S V_O.]
54
Connectivity: PARSE
– Input units grow south and connect
– Output units grow east and connect
– Correspondence units grow north & west and connect with input & output units
55
To be encoded
– How many different kinds of units are there?
– What information is necessary (from the source unit's point of view) to identify the location of a target unit, and the strength of the connection with it?
– How are constraints initially specified?
– How are they maintained through the learning process?
56
Unit types
– Input units: C, V
– Output units: C, V, x
– Correspondence units: C, V
7 distinct unit types, each represented in a distinct sub-region of the abstract genome.
We 'help ourselves' to implicit machinery to spell out these sub-regions as distinct cell types, located in the grid as illustrated.
57
Direction of projection growth
– Topographic organizations widely attested throughout neural structures (activity-dependent growth a possible alternative)
– Orientation information (axes): chemical gradients during development (cell age a possible alternative)
58
Projection parameters
– Direction
– Extent: local or non-local
– Target unit type
Strength of connections encoded separately
59
Connectivity Genome
Contributions from ONSET and PARSE: for each source unit type (C_I, V_I, C_O, V_O, C_C, V_C, x₀), the genome lists that unit's projections, each specified as a direction–extent–target triple (e.g. S_L C_C, S_L V_C, E_L C_C, E_L V_C, N&S_S V_O, N_S x₀, N_L C_I, W_L C_O, N_L V_I, W_L V_O, S_S V_O).
Key: Direction = N(orth), S(outh), E(ast), W(est), F(ront), B(ack); Extent = L(ong), S(hort); Target = Input: C_I, V_I; Output: C_O, V_O, x(₀); Corr: C_C, V_C.
60
CVGenome: Connectivity
61
Encoding connection strength
For each constraint i, we need to 'embody':
– the constraint strength sᵢ
– the connection coefficients (per Φ→Ψ cell-type pair)
The product of these is the contribution of constraint i to the Φ→Ψ connection weight; network-level specification: W_ΦΨ = Σᵢ sᵢ Wᵢ^ΦΨ.
62
Φ→Ψ Processing: [P₁] ∝ s₁
63
Φ→Ψ Development
64
Φ→Ψ Learning (during phase P⁺; reverse during P⁻)
65
CVGenome: Connection Coefficients
66
Abstract Gene Map
[Diagram. Regions: General Developmental Machinery | Connectivity | Constraint Coefficients. Connectivity entries are direction–extent–target triples for each unit type (C-I, V-I, C-C, …); constraint-coefficient entries cover the CORRESPOND and RESPOND constraints.]
67
UGenomics Realization of processing and learning algorithms in ‘abstract molecular biology’, using the types of interactions known to be biologically possible and genetically encodable
68
UGenomics
Host of questions to address:
– Will this really work?
– Can it be generalized to distributed nets?
– Is the number of genes [77 = 0.26%] plausible?
– Are the mechanisms truly biologically plausible?
– Is it evolvable?
– How is strict domination to be handled?
69
Hopeful Conclusion
Progress is possible toward a Grand Unified Theory of the cognitive science of language
– addressing the structure, acquisition, use, and neural realization of knowledge of language
– strongly governed by universal grammar
– with markedness as the unifying principle
– as formalized in Optimality Theory at the symbolic level
– and realized via Harmony Theory in abstract neural nets which are potentially encodable genetically
70
Hopeful Conclusion
Progress is possible toward a Grand Unified Theory of the cognitive science of language. Still lots of promissory notes, but all in a common currency (€): Harmony ≈ unmarkedness; hopefully this will promote further progress by facilitating integration of the sub-disciplines of cognitive science.
Thank you for your attention (and indulgence).