The Neural Basis of Thought and Language Week 15 The End is near...

Schedule
–Final review: Sunday, May 8th (?)
–Final paper due: Tuesday, May 10th, 11:59pm
–Final exam: Tuesday, May 10th, in class
Last Week
–Psychological model of sentence processing
–Applications
This Week
–Wrap-up

Bayesian Model of Sentence Processing
–What is it calculating?
–What computational components is it composed of?
–What is it used to predict?
–What phenomena does it explain?

Bayesian Model of Sentence Processing
Situation
–You're in a conversation. Do you wait for sentence boundaries to interpret the meaning of a sentence? No!
–After only the first half of a sentence, the meanings of the words can still be ambiguous, but you already have an expectation.
Model
–Probability of each interpretation given the words seen so far
–Components: stochastic CFGs, lexical valence probabilities, N-grams
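The ranking idea above can be sketched in a few lines of Python. All probabilities here are invented for illustration; in the model each interpretation's score would combine real SCFG rule probabilities, valence probabilities, and N-gram probabilities:

```python
# Toy sketch of ranking candidate interpretations of a sentence prefix.
# Each interpretation's probability is the product of the probabilities
# of the independent choices (rules, frames, bigrams) it commits to.
from math import prod

def interpretation_prob(choice_probs):
    """Probability of one interpretation: product of its component choices."""
    return prod(choice_probs)

def rank(interpretations):
    """Sort interpretations from most to least probable."""
    return sorted(interpretations.items(),
                  key=lambda kv: interpretation_prob(kv[1]),
                  reverse=True)

# Prefix "The cop arrested ..." with two competing analyses (toy numbers).
candidates = {
    "main-verb":        [0.996, 0.67, 0.2],   # S -> NP VP, preterite bias, bigram
    "reduced-relative": [0.996, 0.33, 0.01],  # same S rule, participle bias, rarer continuation
}
best, _ = rank(candidates)[0]
print(best)  # main-verb
```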

Lexical Valence Probability
Syntactic Category:
–S-bias verbs (e.g. suspect) / NP-bias verbs (e.g. remember)
–Transitive (e.g. walk the dog) / Intransitive (e.g. walk to school)
–Participle-bias (VBN; past participle, e.g. selected) / Preterite-bias (VBD; simple past, e.g. searched)
Semantic Fit (Thematic Fit):
–cop, witness: good agents
–crook, evidence: good patients
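A minimal sketch of how valence biases could be stored and consulted. The numbers are made up; real values would be estimated from a parsed corpus:

```python
# Illustrative (invented) verb valence probabilities: for each verb, how
# often it takes a sentential complement (S) vs. a direct object (NP).
VALENCE = {
    "suspect":  {"S": 0.80, "NP": 0.20},   # S-bias verb
    "remember": {"S": 0.25, "NP": 0.75},   # NP-bias verb
}

def preferred_frame(verb):
    """Return the frame the verb is biased toward."""
    frames = VALENCE[verb]
    return max(frames, key=frames.get)

print(preferred_frame("suspect"))   # S
print(preferred_frame("remember"))  # NP
```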

SCFG
"that" as a COMP (complementizer):
–[OK] The lawyer insisted that experienced diplomats would be very helpful.
–That experienced diplomats would be very helpful made the lawyer confident.
"that" as a DET (determiner):
–The lawyer insisted that experienced diplomat would be very helpful.
–[OK] That experienced diplomat would be very helpful to the lawyer.
Sentence-initial "that" interpreted as a complementizer is infrequent; post-verbal "that" interpreted as a determiner is infrequent.
P(S → SBAR VP) ≪ P(S → NP …) = .996
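A stochastic CFG assigns each rule expansion a probability, and a parse's probability is the product of the rules it uses. A minimal sketch, using the .996 figure from the slide and assuming (for illustration only) that the SBAR expansion gets the remaining mass:

```python
# Minimal stochastic CFG sketch. P(S -> NP VP) = .996 is from the slide;
# the S -> SBAR VP probability is an assumption (remainder of S's mass).
RULES = {
    ("S", ("NP", "VP")):   0.996,
    ("S", ("SBAR", "VP")): 0.004,  # assumed value
}

def parse_prob(rules_used):
    """Probability of a parse: product of the probabilities of its rules."""
    p = 1.0
    for rule in rules_used:
        p *= RULES[rule]
    return p

# A sentence-initial "that"-clause forces the rare S -> SBAR VP expansion,
# so that parse starts out far less probable than the S -> NP VP parse.
print(parse_prob([("S", ("SBAR", "VP"))]) < parse_prob([("S", ("NP", "VP"))]))  # True
```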

N-gram
P(wi | wi−1, wi−2, …, wi−n): the probability of one word appearing given the preceding n words
–"take advantage" (high probability)
–"take celebration" (low probability)
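The bigram case (n = 1 preceding word) can be estimated directly from counts, P(wi | wi−1) = count(wi−1 wi) / count(wi−1). A sketch over a tiny invented corpus:

```python
# Bigram probabilities from counts over a toy corpus (invented text).
from collections import Counter

corpus = "take advantage of the offer take advantage of it take a break".split()
unigrams = Counter(corpus)
bigrams = Counter(zip(corpus, corpus[1:]))

def bigram_prob(prev, word):
    """P(word | prev) = count(prev word) / count(prev)."""
    return bigrams[(prev, word)] / unigrams[prev]

print(bigram_prob("take", "advantage"))    # relatively high
print(bigram_prob("take", "celebration"))  # 0.0
```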

SCFG + N-gram
Two analyses of "The cop arrested …":
Main Verb: [S [NP [D The] [N cop]] [VP [VBD arrested] [NP [D the] [N detective]]]]
Reduced Relative: [S [NP [NP [D The] [N cop]] [VP [VBN arrested] [PP by …]]] [VP …]]

Predicting effects on reading time
Probability predicts human disambiguation. Increases in reading time arise from:
–Limited Parallelism: memory limitations cause the correct interpretation to be pruned ("The horse raced past the barn fell")
–Attention: demotion of an interpretation in attentional focus
–Expectation: unexpected words
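Limited parallelism is often modeled as beam pruning: readings whose probability falls too far below the best one are discarded. A sketch with an invented beam ratio and toy probabilities:

```python
# Beam-pruning sketch (threshold and probabilities invented): readings
# far less probable than the best one are dropped, which is how
# "The horse raced past the barn fell" loses its (ultimately correct)
# reduced-relative reading early on and forces reanalysis at "fell".
BEAM_RATIO = 0.01  # assumption: keep readings within 1/100 of the best

def prune(readings):
    """readings: dict name -> probability. Keep those inside the beam."""
    best = max(readings.values())
    return {name: p for name, p in readings.items() if p >= best * BEAM_RATIO}

after_raced = {"main-verb": 0.92, "reduced-relative": 0.0008}  # toy numbers
print(prune(after_raced))  # only the main-verb reading survives
```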

A good agent (e.g. the cop, the witness) makes the main verb (MV) reading more likely initially, and the reduced relative (RR) reading less likely.
–As one hears the word by, the RR reading becomes the more likely one: a shift in attention.
–Shift in attention → slower reading time.
–Lexical valence probability (semantic fit) predicts the slower reading time.
The witness examined by the lawyer

A good patient (e.g. the crook, the evidence) makes the RR reading more likely initially, and the MV reading less likely.
–As one hears the word by, the ranking of the two readings does not change → no effect on reading time.
–Lexical valence probability (semantic fit) agrees with the RR reading.
The evidence examined by the lawyer
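The agent/patient contrast comes down to whether hearing "by" flips which reading is ranked first. A toy comparison, with all probabilities invented:

```python
# Toy model of the contrast: a reading-time slowdown is predicted only
# when the word "by" changes which reading is top-ranked.
def top(readings):
    """Name of the most probable reading."""
    return max(readings, key=readings.get)

def predicts_slowdown(before_by, after_by):
    """True if the top-ranked reading flips at 'by' (attention shift)."""
    return top(before_by) != top(after_by)

# "The witness examined ...": good agent, MV reading on top initially.
witness = ({"MV": 0.7, "RR": 0.3}, {"MV": 0.1, "RR": 0.9})
# "The evidence examined ...": good patient, RR on top throughout.
evidence = ({"MV": 0.2, "RR": 0.8}, {"MV": 0.05, "RR": 0.95})

print(predicts_slowdown(*witness))   # True  -> slower reading at "by"
print(predicts_slowdown(*evidence))  # False -> no effect
```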

Expectation
Direct Object / Sentential Complement ambiguity; delay from expectation:
The athlete realized her (exercises | potential) one day might make her a world...
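The expectation effect is commonly quantified as surprisal, −log2 P(word | context): the less expected the continuation, the higher its surprisal and the longer the predicted delay. A sketch with invented conditional probabilities:

```python
# Expectation as surprisal, -log2 P(word | context). A low-probability
# continuation yields high surprisal, predicting a reading delay.
from math import log2

def surprisal(p):
    """Surprisal in bits of an event with probability p."""
    return -log2(p)

# After "The athlete realized her ...", a direct-object noun like
# "exercises" is expected; a continuation signaling the rarer sentential
# complement reading is not. Probabilities below are illustrative.
p_expected, p_unexpected = 0.20, 0.01
print(surprisal(p_unexpected) > surprisal(p_expected))  # True
```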

GOOD LUCK!