CS 182 Sections 103 & 104
slides derived from those of Eva Mok and Joe Makin
March 21, 2007

The Last Stretch
[Course-map figure: Cognition and Language / Computation. Psycholinguistics experiments (spatial relations, motor control, metaphor, grammar); models: Regier, Bailey, Chang, Narayanan; Structured Connectionism (neural nets & learning, SHRUTI, triangle nodes, abstraction); Computational Neurobiology; Biology (visual system, neural development); Quiz, Midterm, Finals.]

Questions
- What is Constrained Best Fit?
- How does Bailey use multiple levels of representation to learn different senses of verbs?
- What is Minimum Description Length?
- How do we use Minimum Description Length to merge a grammar?
- What does the prefix affix tree look like for the following sentences?
  - eat them here or there
  - eat them anywhere
- What does the affix tree look like after the best prefix merge?


Constrained Best Fit
- Physical example?
- Chemical example?
- Biological example?

Questions (cont’d)
- How does Bailey use multiple levels of representation to learn different senses of verbs?

Bailey’s VerbLearn Model
Three levels of representation:
- cognitive: words, concepts
- computational: f-structs, x-schemas
- connectionist: structured models, learning rules
Input: labeled hand motions (f-structs)
Learning:
- the correct number of senses for each verb
- the relevant features in each sense, and the probability distributions on each included feature
Execution: perform a hand motion based on a label
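A minimal Python sketch of the computational level (an illustration, not Bailey’s actual code): labeled f-structs are plain feature-value maps, and a hypothetical merge_fstructs helper merges examples of one sense into per-feature relative-frequency distributions, standing in for the model’s learned probabilities. Feature names follow the figure on the next slide.

    from collections import Counter

    def merge_fstructs(fstructs):
        """Merge feature structures into one verb sense: for each feature,
        estimate a probability distribution over its observed values."""
        sense = {}
        features = {f for fs in fstructs for f in fs}
        for feat in sorted(features):
            counts = Counter(fs[feat] for fs in fstructs if feat in fs)
            total = sum(counts.values())
            sense[feat] = {val: n / total for val, n in counts.items()}
        return sense

    # Two hand-motion examples with the same label (values from the figure):
    data = [
        {"posture": "palm", "elbow jnt": "extend", "schema": "slide", "accel": 6},
        {"posture": "palm", "elbow jnt": "extend", "schema": "slide", "accel": 8},
    ]
    print(merge_fstructs(data))
    # {'accel': {6: 0.5, 8: 0.5}, 'elbow jnt': {'extend': 1.0},
    #  'posture': {'palm': 1.0}, 'schema': {'slide': 1.0}}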

[Figure: sense merging in VerbLearn. Four data f-structs, each with features posture, elbow jnt, accel, and schema — e.g. data #1: posture palm, elbow jnt extend, schema slide, accel 6; data #3: posture index, elbow jnt fixed, schema depress, accel 2 — are merged into verb senses whose features carry probabilities, e.g. one sense with posture palm 0.7 / grasp 0.3, elbow jnt extend 0.9, schema slide 0.9, accel [6]; another with posture palm 0.9, accel [6–8]; another with posture index 0.9, elbow jnt fixed 0.9, schema depress 0.9, accel [2].]

Limitations of Bailey’s model
- an instance of recruitment learning (one-shot)
- embodied (motor-control schemas)
- learns words and carries out actions
- the label contains just the verb
- assumes that the labels are mostly correct
- no grammar

Questions (cont’d)
- What is Minimum Description Length?

Minimum Description Length
- Occam’s razor: constrain the set of hypotheses
- like a “prior distribution” over hypotheses, from Tom Griffiths’s lecture
- a prior distribution is more flexible
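In standard two-part MDL notation (an addition, not from the slides), the preferred hypothesis minimizes the cost of encoding the hypothesis plus the cost of encoding the data given it:

    h^{*} = \arg\min_{h \in \mathcal{H}} \left[ L(h) + L(D \mid h) \right]

Here L(h) is the code length of the hypothesis, playing the role of the prior, and L(D | h) is the code length of the data under that hypothesis.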

Questions (cont’d)
- How do we use Minimum Description Length to merge a grammar?

Grammar merging
How can we measure description length?
- complicated rules are bad
- lots of rules are bad
- also measure “derivation length”
- alpha * size(rules) + derivationCost(rules, sentences)
How can we merge rules?
- extract a common prefix
- extract a common suffix
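A minimal Python sketch of that cost, applied to the example rules from the slides below; the rule representation and the stand-in derivation cost (number of rule applications per sentence) are assumptions, not the course code.

    def description_length(rules, derivations, alpha=1.0):
        """alpha * size(rules) + derivationCost, where size counts
        right-hand-side symbols and a derivation's cost is the number
        of rule applications it uses."""
        rule_size = sum(len(rhs) for _, rhs in rules)
        deriv_cost = sum(len(d) for d in derivations)
        return alpha * rule_size + deriv_cost

    # Before the merge: one flat rule per sentence.
    before = [("S", ["eat", "them", "here", "or", "there"]),   # r1
              ("S", ["eat", "them", "anywhere"])]              # r2
    print(description_length(before, [["r1"], ["r2"]]))        # 1.0*8 + 2 = 10.0

    # After extracting the common prefix "eat them":
    after = [("S",  ["eat", "them", "X1"]),                    # r3
             ("X1", ["here", "or", "there"]),                  # r4
             ("X1", ["anywhere"])]                             # r5
    print(description_length(after, [["r3", "r4"], ["r3", "r5"]]))  # 1.0*7 + 4 = 11.0

With alpha = 1 the merge does not pay off on just these two sentences (10 vs. 11); weighting grammar size more heavily (alpha = 3 gives 26 vs. 25) or adding more sentences that reuse X1 tips the balance toward the merged grammar.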

Questions (cont’d)
- What does the prefix affix tree look like for the sentences “eat them here or there” and “eat them anywhere”?
- What does the affix tree look like after the best prefix merge?

Affix Trees
- data structures that help you figure out what merges are possible
- each node in the tree represents a symbol, either terminal or non-terminal (we call that the “affix” in the code)
- two kinds: a prefix tree and a suffix tree
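A minimal sketch of the prefix-tree construction, assuming Python and a dict-of-dicts node representation (hypothetical, not the actual section code); a suffix tree is the same construction run over reversed right-hand sides.

    def build_prefix_tree(rhs_list):
        """Each node maps an affix (a terminal or non-terminal symbol)
        to its subtree; rules with a shared prefix share a path."""
        root = {}
        for rhs in rhs_list:
            node = root
            for symbol in rhs:
                node = node.setdefault(symbol, {})
        return root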

Prefix Tree
r1: S → eat them here or there
r2: S → eat them anywhere
[Tree figure, rooted at S: the path eat → them is shared, then it branches into here → or → there (r1) and anywhere (r2).]
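Applying the sketch above to r1 and r2 makes the shared prefix explicit:

    tree = build_prefix_tree([
        ["eat", "them", "here", "or", "there"],   # r1
        ["eat", "them", "anywhere"],              # r2
    ])
    print(tree)
    # {'eat': {'them': {'here': {'or': {'there': {}}}, 'anywhere': {}}}}

The single path eat → them before the branch point is exactly the common prefix that the merge extracts into a new non-terminal.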

Prefix Merge
r3: S → eat them X1
r4: X1 → here or there
r5: X1 → anywhere
[Tree figure: the merged grammar’s prefix tree, with eat → them labeled r3 and leading to the new non-terminal X1, whose subtree branches into here → or → there (r4) and anywhere (r5).]
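Under the simple symbol-count metric from the cost sketch earlier (an assumption, not the course’s exact numbers): the merge shrinks the grammar from 8 right-hand-side symbols (r1, r2) to 7 (r3, r4, r5), but each sentence now takes two rule applications instead of one, so with alpha = 1 the total cost rises from 10 to 11. The merge pays off once grammar size is weighted more heavily or more sentences reuse X1.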

Have a good Spring Break!!!