9.012 Brain and Cognitive Sciences II, Part VIII: Intro to Language & Psycholinguistics - Dr. Ted Gibson
Presented by the Liu Lab: Fighting for Freedom with Cultured Neurons
Nathan Wilson, presenting: Jeffrey L. Elman (1991), "Distributed Representations, Simple Recurrent Networks, and Grammatical Structure," Machine Learning.
Distributed representations / neural networks are meant to capture the essence of neural computation: many small, independent units computing very simple functions in parallel.
EXPLICIT RULES? Distributed representations / neural networks: EMERGENCE!
Feedforward neural network (from Sebastian's teaching). Don't forget the nonlinearity!
Recurrent network (also from Sebastian).
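The nonlinearity warning above can be made concrete. A minimal sketch (the layer sizes, sigmoid choice, and random weights are illustrative assumptions, not taken from the slides): without a nonlinearity between layers, two stacked weight matrices collapse into a single linear map.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# Two stacked layers (sizes are illustrative assumptions)
W1 = rng.normal(size=(4, 3))   # input -> hidden weights
W2 = rng.normal(size=(3, 2))   # hidden -> output weights
x = rng.normal(size=4)         # one input vector

# Without a nonlinearity, two layers are equivalent to one matrix:
two_linear_layers = (x @ W1) @ W2
one_collapsed_layer = x @ (W1 @ W2)

# With a sigmoid between the layers, the map is no longer linear,
# so the extra layer actually adds representational power.
with_nonlinearity = sigmoid(x @ W1) @ W2
```

This is why the hidden layer matters: a purely linear stack can never learn more than a single matrix multiply, no matter how many layers it has.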
Why apply network / connectionist modeling to language processing? Connectionist modeling is good at what it does, and language is a HARD problem.
What we are going to do:
- Build a network
- Let it learn how to "read"
- Then test it! Give it some words in a reasonably grammatical sentence, and let it try to predict the next word based on what it knows about grammar
- BUT: we're not going to tell it any of the rules
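The test in this plan is next-word prediction: the target for each input word is simply the word that follows it. A minimal sketch of forming such input/target pairs (the token list and the "<end>" marker are hypothetical illustrations):

```python
# One training sentence, ending with an end-of-sentence marker.
sentence = ["boy", "chases", "boys", "<end>"]

# Each word is paired with the word that follows it; the network
# is trained to predict the second element from the first.
pairs = list(zip(sentence[:-1], sentence[1:]))
```

No grammatical rules appear anywhere in this setup: the only supervision is "the next word was X."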
What we are going to do: build a network.
Methods > Network Implementation > Structure: [network diagram: INPUT layer → HIDDEN layer → OUTPUT layer, shown with example activation patterns]
What we are going to do: let it learn how to "read".
Methods > Network Implementation > Training. Words we're going to teach it:
- Nouns: boy | girl | cat | dog | boys | girls | cats | dogs
- Proper nouns: John | Mary
- "Who"
- Verbs: chase | feed | see | hear | walk | live | chases | feeds | sees | hears | walks | lives
- "End sentence"
Methods > Network Implementation > Training:
1. Encode each word with a unique activation pattern:
- boy => 000000000000000000000001
- girl => 000000000000000000000010
- feed => 000000000000000000000100
- sees => 000000000000000000001000
- ...
- who => 010000000000000000000000
- End sentence => 100000000000000000000000
2. Feed these words sequentially to the network (only feed words in sequences that make good grammatical sense!)
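Step 1 is a localist (one-hot) code: exactly one unit active per word. A sketch (the ordering of the vocabulary is an assumption, chosen only so the vector length matches the 24-bit patterns shown; only the one-unit-per-word property matters):

```python
import numpy as np

# 24-word vocabulary from the training slide; ordering is an assumption.
vocab = ["boy", "girl", "cat", "dog", "boys", "girls", "cats", "dogs",
         "John", "Mary", "who",
         "chase", "feed", "see", "hear", "walk", "live",
         "chases", "feeds", "sees", "hears", "walks", "lives",
         "<end>"]

def one_hot(word):
    # Unique activation pattern: all zeros except the word's own unit.
    vec = np.zeros(len(vocab))
    vec[vocab.index(word)] = 1.0
    return vec
```

Because each word gets its own unit, the input carries no built-in similarity between words; any structure the network finds must be learned.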
Methods > Network Implementation > Structure: [diagram, built up step by step: a word's activation pattern enters the INPUT layer, projects to the HIDDEN layer, which projects to the OUTPUT layer]
Methods > Network Implementation > Structure: if the network is to learn relations between words, it needs some sort of memory from word to word!
Solution: a recurrent network. [diagram: INPUT → HIDDEN → OUTPUT as before, plus a CONTEXT layer that receives a copy of the HIDDEN activations and feeds them back to HIDDEN on the next word]
Train it with BACKPROP!
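The CONTEXT layer idea can be sketched as an Elman-style simple recurrent network forward pass. This is an illustration only: layer sizes, the sigmoid activation, and random weights are assumptions, and backprop training is omitted. The point is just that the hidden state is copied into a context layer and fed back alongside the next word.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

n_in, n_hid, n_out = 24, 12, 24          # illustrative sizes
W_in = rng.normal(scale=0.1, size=(n_in, n_hid))    # input -> hidden
W_ctx = rng.normal(scale=0.1, size=(n_hid, n_hid))  # context -> hidden
W_out = rng.normal(scale=0.1, size=(n_hid, n_out))  # hidden -> output

def srn_step(word_vec, context):
    # The hidden layer sees the current word AND a copy of the previous
    # hidden state (the context layer): the word-to-word memory.
    hidden = sigmoid(word_vec @ W_in + context @ W_ctx)
    output = sigmoid(hidden @ W_out)     # activation over next-word units
    return output, hidden                # hidden becomes the next context

context = np.zeros(n_hid)
for word_vec in np.eye(n_in)[:3]:        # feed three one-hot "words"
    output, context = srn_step(word_vec, context)
```

In training, backprop would adjust all three weight matrices so that `output` matches the one-hot code of the word that actually came next.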
What we are going to do: then test it! Give it some words in a reasonably grammatical sentence, let it try to predict the next word based on what it knows about grammar; BUT we're not going to tell it any of the rules.
Results > Emergent Properties of Network > Noun-Verb Agreement: after hearing "boy…", the network SHOULD predict that the next word is "chases", NOT "chase". Subject and verb should agree!
[Bar chart: activation for each predicted next-word class (single noun, plural noun, "who", single/plural verbs by direct-object requirement, end of sentence) after hearing "boy….."]
Likewise, after hearing "boys….." (or "boyz"!), the network SHOULD predict that the next word is "chase", NOT "chases". Again, subject and verb should agree!
[Bar chart: the same prediction profile after hearing "boys….."]
There's a difference between nouns and verbs. There are even different kinds of nouns that require different kinds of verbs.
Results > Emergent Properties of Network > Verb-Argument Agreement: after hearing "chase", the network SHOULD predict some direct object next (like "boys"), NOT ".". Hey, if a verb needs an argument, it only makes sense to give it one!
Likewise, after hearing the verb "lives", the network SHOULD predict ".", NOT "dog". If the verb doesn't make sense with an argument, it falls upon us to withhold one from it.
[Bar chart: predicted next-word activation after hearing "boy chases….."]
[Bar chart: predicted next-word activation after hearing "boy lives….."]
There are different kinds of verbs that require different kinds of nouns.
Results > Emergent Properties of Network > Longer-Range Dependence: after hearing "boy who mary chases…", the network might predict that the next word is "boys", since it learned that "boys" follows "mary chases". But if it's smart, it may realize that "chases" is linked to "boy", not "mary", in which case you need a verb next, not a noun! A good litmus test for some intermediate understanding?
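The bar charts in these results group the network's output units into word classes and plot the summed activation per class. A sketch of that grouping (the class-to-unit mapping below is hypothetical, for illustration only):

```python
import numpy as np

# Hypothetical assignment of output-unit indices to word classes.
classes = {
    "single noun": [0, 1, 2, 3],
    "plural noun": [4, 5, 6, 7],
    "single verb": [8, 9, 10],
    "plural verb": [11, 12, 13],
}

def class_activation(output, classes):
    # Sum the activations of all output units belonging to each class,
    # as the bar charts do.
    return {name: float(output[idx].sum()) for name, idx in classes.items()}

output = np.zeros(14)
output[[8, 9]] = 0.4      # pretend the network favours singular verbs
profile = class_activation(output, classes)
```

Reading the charts then amounts to asking which class's summed activation dominates after a given word sequence.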
Results > Emergent Properties of Network > Longer-Range Dependence:
[Bar chart: predicted next-word activation after "boys who Mary….."]
[Bar chart: after "boys who mary chases…..": a verb agreeing with "boys" is needed next, not a noun]
[Bar chart: after "boys who mary chases feed….."]
[Bar chart: after "boys who mary chases feed cats….."]
Did the network learn about grammar?
- It learned there are different classes of nouns that need singular and plural verbs.
- It learned there are different classes of verbs that have different requirements in terms of direct objects.
- It learned that sometimes there are long-distance dependencies that don't follow from the immediately preceding words => relative clauses and the constituent structure of sentences.
Once you have a successful network, you can examine its properties with controlled I/O relationships:
- Boys hear boys.
- Boy hears boys.
- Boy who boys chase chases boys.
- Boys who boys chase chase boys.
Recap: [the simple recurrent network: INPUT → HIDDEN (plus CONTEXT) → OUTPUT, trained with BACKPROP]
EXPLICIT RULES? Distributed representations / neural networks:
What does it mean, "no explicit rules"? Does it just mean the mapping is "too complicated"? "Too difficult to formulate"? "Unknown"? Possibly it is just our own failure to understand the mechanism, rather than a description of the mechanism itself.
General advantages of distributed models: the representation is distributed, which, while not limitless, is less rigid than models with a strict mapping from concept to node. Generalizations are captured at a higher level than the input, i.e. abstractly, so generalization to new input is possible.
FOUND / ISOLATED 4-CELL NEURAL NETWORKS
“If you have built castles in the air, your work need not be lost; that is where they should be. Now put the foundations under them.” -- Henry David Thoreau