The Neural Basis of Thought and Language Final Review Session.


Administrivia
Final in class next Tuesday, May 9th. Be there on time!
Format:
– closed books, closed notes
– short answers, no blue books
And then you're done with the course!

The Second Half
[Diagram: levels of abstraction, from Cognition and Language down through Computation, Structured Connectionism, and Computational Neurobiology to Biology; the midterm covered the lower levels, the final covers the upper ones. Second-half topics: Motor Control, Metaphor, Grammar, Bailey Model, KARMA, ECG, SHRUTI, Bayesian Model of HSP, Bayes Nets]

Overview
Bailey Model
– feature structures
– Bayesian model merging
– recruitment learning
KARMA
– X-schema, frames
– aspect
– event-structure metaphor
– inference
Grammar Learning
– parsing
– construction grammar
– learning algorithm
SHRUTI
FrameNet
Bayesian Model of Human Sentence Processing

Full Circle
[Diagram: the course topics arranged in a circle: Neural System & Development, Motor Control & Visual System, Spatial Relation, Verbs & Spatial Relation, Metaphor, Grammar, Psycholinguistics Experiments; around the recurring themes: Embodied Representation, Structured Connectionism, Probabilistic algorithms, Converging Constraints]

Q & A

How can we capture the difference between
"Harry walked into the café."
"Harry is walking into the café."
"Harry walked into the wall."

Analysis Process
"Harry walked into the café."
[Diagram: Utterance → Analysis (drawing on Constructions and General Knowledge) → Semantic Specification → Simulation → updated Belief State]

The INTO construction

construction INTO
    subcase of Spatial-Relation
    form
        self_f.orth ← “into”
    meaning: Trajector-Landmark
        evokes Container as cont
        evokes Source-Path-Goal as spg
        trajector ↔ spg.trajector
        landmark ↔ cont
        cont.interior ↔ spg.goal
        cont.exterior ↔ spg.source

The Spatial-Phrase construction

construction SPATIAL-PHRASE
    constructional
        constituents
            sr : Spatial-Relation
            lm : Ref-Expr
    form
        sr_f before lm_f
    meaning
        sr_m.landmark ↔ lm_m

The Directed-Motion construction

construction DIRECTED-MOTION
    constructional
        constituents
            a : Ref-Expr
            m : Motion-Verb
            p : Spatial-Phrase
    form
        a_f before m_f
        m_f before p_f
    meaning
        evokes Directed-Motion as dm
        self_m.scene ↔ dm
        dm.agent ↔ a_m
        dm.motion ↔ m_m
        dm.path ↔ p_m

schema Directed-Motion
    roles
        agent : Entity
        motion : Motion
        path : SPG
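
Taken together, these three constructions license an analysis of "Harry walked into the café". Here is a minimal Python sketch of the role unification implied by the "↔" constraints; the Role class and all names are hypothetical illustrations, not the actual ECG analyzer:

    class Role:
        """A unifiable schema role (a tiny union-find)."""
        def __init__(self, name, filler=None):
            self.name, self.filler, self.parent = name, filler, None

        def rep(self):
            r = self
            while r.parent is not None:
                r = r.parent
            return r

        def bind(self, other):
            a, b = self.rep(), other.rep()
            if a is b:
                return
            if a.filler and b.filler and a.filler != b.filler:
                raise ValueError("binding conflict")
            b.filler = b.filler or a.filler
            a.parent = b

    harry = Role("Ref-Expr.Harry", filler="Harry")
    cafe = Role("Ref-Expr.cafe", filler="cafe")
    tl_trajector = Role("tl.trajector")
    tl_landmark = Role("tl.landmark")
    dm_agent = Role("dm.agent")

    dm_agent.bind(harry)         # DIRECTED-MOTION: dm.agent <-> a_m
    tl_landmark.bind(cafe)       # SPATIAL-PHRASE: sr_m.landmark <-> lm_m
    tl_trajector.bind(dm_agent)  # the mover fills INTO's trajector role

    print(tl_trajector.rep().filler, tl_landmark.rep().filler)  # Harry cafe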

What exactly is simulation?
Belief update plus X-schema execution
[Diagram: belief-net nodes (hungry, meeting, cafe, time of day) linked to the WALK X-schema: ready → start → ongoing → finish → done, with an iterate loop and an "at goal" condition]
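
A minimal sketch of the controller part in Python. This is illustrative only: the hypothetical TRANSITIONS table stands in for what is really a Petri net with durations, parameters, and concurrent markings.

    # Hypothetical transition table: (state, event) -> next state
    TRANSITIONS = {
        ("ready", "start"): "ongoing",
        ("ongoing", "iterate"): "ongoing",     # e.g. take another step
        ("ongoing", "finish"): "done",
        ("ongoing", "interrupt"): "suspended",
        ("suspended", "resume"): "ongoing",
        ("ongoing", "abort"): "cancelled",
    }

    def execute(schema, events, bindings):
        state = "ready"
        for event in events:
            state = TRANSITIONS.get((state, event), state)
            print(schema, bindings, event, "->", state)
        return state

    # "Harry walked into the café": runs to completion (perfective)
    execute("WALK", ["start", "iterate", "finish"],
            {"walker": "Harry", "goal": "cafe"})

    # "Harry is walking to the café": left in ongoing (progressive)
    execute("WALK", ["start", "iterate"],
            {"walker": "Harry", "goal": "cafe"})

The aspectual contrast between the sentences is just a difference in which controller state the simulation ends up in.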

“Harry walked into the café.”
[Diagram: WALK X-schema executes ready → walk → done, with bindings walker = Harry, goal = cafe]

Analysis Process
"Harry is walking to the café."
[Diagram: Utterance → Analysis (drawing on Constructions and General Knowledge) → Semantic Specification → Simulation → updated Belief State]

[Diagram: full WALK X-schema controller: ready → start → ongoing → finish → done; iterate loops back to ongoing; interrupt/resume move between ongoing and suspended; abort leads to cancelled. Bindings: walker = Harry, goal = cafe]

Analysis Process
"Harry has walked into the wall."
[Diagram: Utterance → Analysis (drawing on Constructions and General Knowledge) → Semantic Specification → Simulation → updated Belief State]

Perhaps a different sense of INTO?

Container sense:

construction INTO
    subcase of spatial-prep
    form
        self_f.orth ← “into”
    meaning
        evokes Trajector-Landmark as tl
        evokes Container as cont
        evokes Source-Path-Goal as spg
        tl.trajector ↔ spg.trajector
        tl.landmark ↔ cont
        cont.interior ↔ spg.goal
        cont.exterior ↔ spg.source

Impact sense:

construction INTO
    subcase of spatial-prep
    form
        self_f.orth ← “into”
    meaning
        evokes Trajector-Landmark as tl
        evokes Impact as im
        evokes Source-Path-Goal as spg
        tl.trajector ↔ spg.trajector
        tl.landmark ↔ spg.goal
        im.obj1 ↔ tl.trajector
        im.obj2 ↔ tl.landmark

“Harry has walked into the wall.”
[Diagram: full WALK X-schema controller: ready → start → ongoing → finish → done, with iterate, interrupt/resume (suspended), and abort (cancelled). Bindings: walker = Harry, goal = wall]

Map down to timeline
[Diagram: Reichenbach-style timeline with speech time S, reference time R, and event time E mapped onto the controller states ready → start → ongoing → finish → done, plus the consequence phase]

further questions?

What about… “Harry walked into trouble” or for stronger emphasis, “Harry walked into trouble, eyes wide open.”

Metaphors
– metaphors are mappings from a source domain to a target domain
– metaphor maps specify the correlation between source-domain entities/relations and target-domain entities/relations
– they also allow inference to transfer from the source domain to the target domain (possibly, but less frequently, vice versa)

Event Structure Metaphor
Target Domain: event structure
Source Domain: physical space
– States are Locations
– Changes are Movements
– Causes are Forces
– Causation is Forced Movement
– Actions are Self-propelled Movements
– Purposes are Destinations
– Means are Paths
– Difficulties are Impediments to Motion
– External Events are Large, Moving Objects
– Long-term Purposeful Activities are Journeys

KARMA
– DBN to represent target-domain knowledge
– Metaphor maps link target and source domains
– X-schemas to represent source-domain knowledge

Metaphor Maps
1. map entities and objects between embodied and abstract domains
2. invariantly map the aspect of the embodied-domain event onto the target domain by setting the evidence for the status variable based on controller state (event structure metaphor)
3. project x-schema parameters onto the target domain
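
A minimal Python sketch of these three mapping operations, assuming hypothetical role names and a made-up evidence format for the target DBN:

    # 1. entity map: source-domain roles -> target-domain roles
    ENTITY_MAP = {"walker": "experiencer", "obstacle": "difficulty"}

    def project_aspect(controller_state):
        # 2. aspect map: controller state -> target event-status evidence
        return {"ready": "not-yet-begun", "ongoing": "in-progress",
                "done": "completed", "cancelled": "failed"}[controller_state]

    def project_params(params):
        # 3. parameter map: x-schema parameters -> target quantities
        return {"rate-of-progress": params.get("rate", "unknown")}

    # "Harry walked into trouble": source-domain WALK ran to completion
    evidence = {ENTITY_MAP["walker"]: "Harry",
                "event-status": project_aspect("done"),
                **project_params({"rate": "slow"})}
    print(evidence)  # set as evidence on the target-domain DBN nodes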

further questions?

How do you learn… the meanings of spatial relations, the meanings of verbs, the metaphors, and the constructions?

How do you learn… the meanings of spatial relations, the meanings of verbs, the metaphors, and the constructions? That’s the Regier model. (first half of semester)

How do you learn… the meanings of spatial relations, the meanings of verbs, the metaphors, and the constructions? VerbLearn

Model merging example:

data #1: [schema: slide | elbow jnt: extend | posture: palm | accel: 6]
data #2: [schema: slide | elbow jnt: extend | posture: palm | accel: 8]
data #3: [schema: depress | elbow jnt: fixed | posture: index | accel: 2]
data #4: [schema: slide | elbow jnt: extend | posture: grasp | accel: 2]

merging #1 and #2: [schema: slide 0.9 | elbow jnt: extend 0.9 | posture: palm 0.9 | accel: [6–8]]
merging in #4: [schema: slide 0.9 | elbow jnt: extend 0.9 | posture: palm 0.7, grasp 0.3]
#3 stays a separate sense: [schema: depress 0.9 | elbow jnt: fixed 0.9 | posture: index 0.9 | accel: [2]]

Computational Details
maximum a posteriori (MAP) hypothesis: wants the best model given the data, M* = argmax_M P(M) · P(D|M)
– prior P(M): complexity of the model; penalize complex models, i.e. those with too many word senses
– likelihood P(D|M): ability to explain the data; how likely is the data given this model?
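
A minimal Python sketch of this trade-off, with made-up feature probabilities and a hypothetical per-sense prior penalty (not the actual VerbLearn scoring):

    import math

    def log_prior(model, alpha=2.0):
        # penalize complex models: those with too many word senses
        return -alpha * len(model["senses"])

    def sense_prob(sense, example):
        p = 1.0
        for feature, value in example.items():
            p *= sense[feature].get(value, 1e-6)  # per-feature multinomials
        return p

    def log_likelihood(model, data):
        # how likely is the data given this model? (best sense per example)
        return sum(math.log(max(sense_prob(s, ex) for s in model["senses"]))
                   for ex in data)

    def map_score(model, data):
        return log_prior(model) + log_likelihood(model, data)

    data = [{"schema": "slide", "posture": "palm"},
            {"schema": "slide", "posture": "palm"},
            {"schema": "slide", "posture": "grasp"}]

    two_senses = {"senses": [{"schema": {"slide": 0.9}, "posture": {"palm": 0.9}},
                             {"schema": {"slide": 0.9}, "posture": {"grasp": 0.9}}]}
    merged = {"senses": [{"schema": {"slide": 0.9},
                          "posture": {"palm": 0.7, "grasp": 0.3}}]}

    print(map_score(two_senses, data))  # lower: the prior penalizes 2 senses
    print(map_score(merged, data))      # higher: merging wins on this data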

How do you learn… the meanings of spatial relations, the meanings of verbs, the metaphors, and the constructions? conflation hypothesis (primary metaphors)

How do you learn… the meanings of spatial relations, the meanings of verbs, the metaphors, and the constructions? construction learning

Usage-based Language Learning
[Diagram: the acquisition loop; (Utterance, Situation) enters Comprehension (Analyze), yielding a partial analysis; Acquisition (Hypothesize, Reorganize) updates the Constructions; Production (Generate) maps (Comm. Intent, Situation) back to an Utterance]

Main Learning Loop

while available and cost > stoppingCriterion
    analysis = analyzeAndResolve(utterance, situation, currentGrammar);
    newCxns = hypothesize(analysis);
    if cost(currentGrammar + newCxns) < cost(currentGrammar)
        addNewCxns(newCxns);
    if (reorganize == true)    // frequency depends on learning parameter
        reorganizeCxns();

Three ways to get new constructions
– Relational mapping
    – throw the ball → THROW < BALL
– Merging
    – throw the block
    – throwing the ball → THROW < OBJECT
– Composing
    – throw the ball
    – ball off
    – you throw the ball off → THROW < BALL < OFF

Minimum Description Length
Choose grammar G to minimize cost(G|D):
– cost(G|D) = α · size(G) + β · complexity(D|G)
– approximates Bayesian learning: minimizing cost(G|D) ≈ maximizing posterior probability P(G|D)
Size of grammar: size(G) ≈ 1 / prior P(G)
– favor fewer/smaller constructions/roles; isomorphic mappings
Complexity of data given grammar: complexity(D|G) ≈ 1 / likelihood P(D|G)
– favor simpler analyses (fewer, more likely constructions)
– based on derivation length + score of derivation
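
A minimal Python sketch of this cost function; the size and complexity measures here are hypothetical stand-ins for the real ones:

    def size(grammar):
        # description length of the grammar itself: constructions,
        # roles, and constraints all cost bits
        return sum(1 + len(cxn["roles"]) + len(cxn["constraints"])
                   for cxn in grammar)

    def complexity(data, grammar, analyze):
        # description length of the data: encode each utterance's
        # derivation (its length stands in for length + score here)
        return sum(len(analyze(utterance, grammar)) for utterance in data)

    def cost(grammar, data, analyze, alpha=1.0, beta=1.0):
        return alpha * size(grammar) + beta * complexity(data, grammar, analyze)

    # e.g. with a stub analyzer returning the constructions used:
    toy_grammar = [{"roles": ["thrower", "throwee"], "constraints": ["t < b"]}]
    print(cost(toy_grammar, ["you throw the ball"],
               analyze=lambda u, g: ["THROW", "BALL"]))  # 4 + 2 = 6

This is the cost the main learning loop above compares before and after adding candidate constructions.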

further questions?

Connectionist Representation
How can entities and relations be represented at the structured connectionist level?
or
How can we represent "Harry walked to the café" in a connectionist model?

SHRUTI
– entity, type, and predicate focal clusters
– An "entity" is a phase in the rhythmic activity.
– Bindings are synchronous firings of role and entity cells.
– Rules are interconnection patterns mediated by coincidence detector circuits that allow selective propagation of activity.
– An episode of reflexive processing is a transient propagation of rhythmic activity.

asserting that walk(Harry, café)
– Harry fires in phase with the agent role
– cafe fires in phase with the goal role
[Diagram: SHRUTI focal clusters (+, −, ?) for the predicate walk with roles agt and goal; entity clusters (+e, +v, ?e, ?v) for Harry (entity) and cafe (type)]
"Harry walked to the café."


Activation Trace for walk(Harry, café)
[Diagram: firing traces over phases 1–4 for +: Harry, +e: cafe, +: walk, walk-agt, and walk-goal; Harry fires in the same phase as walk-agt, cafe in the same phase as walk-goal]
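
A minimal Python sketch of binding by temporal synchrony; phase numbers and names are illustrative, and real SHRUTI nodes are spiking cells with coincidence detectors:

    N_PHASES = 4  # phase slots per oscillation cycle, as in the trace

    # each active entity claims a phase; a role cell fires in the
    # phase of the entity it is dynamically bound to
    phase_of_entity = {"Harry": 1, "cafe": 2}
    phase_of_role = {"walk-agt": 1,    # fires with Harry
                     "walk-goal": 2}   # fires with cafe
    assert max(phase_of_entity.values()) <= N_PHASES

    def bound(role, entity):
        # a coincidence detector: in-phase firing == binding
        return phase_of_role[role] == phase_of_entity[entity]

    print(bound("walk-agt", "Harry"))   # True:  walk(agent: Harry, ...)
    print(bound("walk-goal", "Harry"))  # False: Harry is not the goal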

further questions?

Human Sentence Processing
Can we use any of the mechanisms we just discussed to predict reaction time / behavior when human subjects read sentences?

Good and Bad News
Bad news:
– No, not as is.
– ECG, the analysis process, and the simulation process are represented at a higher computational level of abstraction than human sentence processing (they lack timing information, cognitive-capacity requirements, etc.)
Good news:
– we can construct a Bayesian model of human sentence processing behavior borrowing the same insights

Bayesian Model of Sentence Processing
Do you wait for sentence boundaries to interpret the meaning of a sentence? No!
As words come in, we construct
– a partial meaning representation
– candidate interpretations, if ambiguous
– expectations for the next words
Model
– probability of each interpretation given the words seen
– stochastic CFGs, n-grams, lexical valence probabilities

SCFG + N-gram: Stochastic CFG
[Diagram: two parse trees for "The cop arrested …": the Main Verb reading (S → NP VP; "The cop" as D N; "arrested the detective" as the VP) and the Reduced Relative reading (NP → NP VP, with "arrested by …" modifying "the cop"); the stochastic CFG rule probabilities are highlighted]

SCFG + N-gram: N-Gram
[Diagram: the same two parse trees, Main Verb vs. Reduced Relative readings of "The cop arrested …", with the n-gram (lexical) probabilities highlighted]

SCFG + N-gram: Different Interpretations
[Diagram: the same two parse trees, highlighting that they are competing interpretations of the same prefix "The cop arrested …"]
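
A minimal Python sketch of scoring the two interpretations by combining SCFG rule probabilities with n-gram probabilities; every probability below is made up for illustration:

    import math

    SCFG = {  # made-up rule probabilities
        "S -> NP VP": 1.0,
        "NP -> D N": 0.6,
        "NP -> NP VP": 0.1,   # the reduced relative needs this rare rule
        "VP -> V NP": 0.4,
        "VP -> V PP": 0.1,
    }

    def interpretation_prob(rules, ngram_probs):
        logp = sum(math.log(SCFG[r]) for r in rules)
        logp += sum(math.log(p) for p in ngram_probs)  # lexical/n-gram terms
        return math.exp(logp)

    main_verb = interpretation_prob(
        ["S -> NP VP", "NP -> D N", "VP -> V NP"],
        ngram_probs=[0.8])  # e.g. "arrested" as a past-tense main verb
    reduced_rel = interpretation_prob(
        ["S -> NP VP", "NP -> D N", "NP -> NP VP", "VP -> V PP"],
        ngram_probs=[0.2])  # e.g. "arrested" as a passive participle

    total = main_verb + reduced_rel  # normalize over the candidates
    print(main_verb / total, reduced_rel / total)  # ~0.99 vs ~0.01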

Predicting effects on reading time
Probability predicts human disambiguation.
Increase in reading time because of...
– Limited Parallelism: memory limitations cause the correct interpretation to be pruned ("The horse raced past the barn fell")
– Attention: demotion of an interpretation in attentional focus
– Expectation: unexpected words
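
A minimal Python sketch of the limited-parallelism account as beam pruning; the beam width is an illustrative parameter:

    def prune(interpretations, beam=0.01):
        # keep only interpretations within a beam of the best one
        best = max(interpretations.values())
        return {name: p for name, p in interpretations.items()
                if p >= beam * best}

    # after "The horse raced past the barn ...":
    probs = {"main-verb": 0.995, "reduced-relative": 0.0008}
    print(prune(probs))  # reduced-relative is pruned, so a following
                         # "fell" forces costly reanalysis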

Open for questions