slides derived from those of Eva Mok and Joe Makin March 21, 2007


CS 182 Sections 103 - 104
slides derived from those of Eva Mok and Joe Makin
March 21, 2007

The Last Stretch
[Course map: Cognition and Language; Computation; Psycholinguistics Experiments (Spatial Relation, Motor Control, Metaphor, Grammar); models: Chang Model, Bailey Model, Narayanan Model, Regier Model, SHRUTI; Structured Connectionism, Neural Net & Learning, Triangle Nodes, abstraction; Computational Neurobiology, Biology, Visual System, Neural Development; Quiz, Midterm, Finals]

Questions
- What is Constrained Best Fit?
- How does Bailey use multiple levels of representation to learn different senses of verbs?
- What is Minimum Description Length?
- How do we use Minimum Description Length to merge a grammar?
- What does the prefix affix tree look like for the following sentences?
  - eat them here or there
  - eat them anywhere
- What does the affix tree look like after the best prefix merge?


Constrained Best Fit
- Physical example?
- Chemical example?
- Biological example?

Questions
- What is Constrained Best Fit?
- How does Bailey use multiple levels of representation to learn different senses of verbs?
- What is Minimum Description Length?
- How do we use Minimum Description Length to merge a grammar?
- What does the prefix affix tree look like for the following sentences?
  - eat them here or there
  - eat them anywhere
- What does the affix tree look like after the best prefix merge?

Bailey’s VerbLearn Model
3 levels of representation:
- cognitive: words, concepts
- computational: f-structs, x-schemas
- connectionist: structured models, learning rules
Input: labeled hand motions (f-structs)
Learning:
- the correct number of senses for each verb
- the relevant features in each sense, and the probability distributions on each included feature
Execution: perform a hand motion based on a label

[Figure: f-struct merging. Four labeled data f-structs with features posture, elbow jnt, accel, and schema — e.g. data #1 {posture: palm, extend, slide, schema 6}, data #2 {palm, extend, slide, schema 8}, data #3 {index, fixed, depress, schema 2}, data #4 {grasp, extend, slide, schema 2} — are merged into verb-sense f-structs with per-feature probabilities, e.g. {palm 0.7, grasp 0.3, extend 0.9, slide 0.9, schema [6 - 8]}.]

Limitations of Bailey’s model
- an instance of recruitment learning (1-shot)
- embodied (motor-control schemas)
- learns words and carries out actions
- the label contains just the verb
- assumes that the labels are mostly correct
- no grammar

Questions
- What is Constrained Best Fit?
- How does Bailey use multiple levels of representation to learn different senses of verbs?
- What is Minimum Description Length?
- How do we use Minimum Description Length to merge a grammar?
- What does the prefix affix tree look like for the following sentences?
  - eat them here or there
  - eat them anywhere
- What does the affix tree look like after the best prefix merge?

Minimum Description Length
- Like Occam’s Razor: prefer the “simplest explanation”
- Constrains the set of hypotheses
- Statistics: the likeliest explanation
- Information theory: the explanation that is easiest to represent

Questions
- What is Constrained Best Fit?
- How does Bailey use multiple levels of representation to learn different senses of verbs?
- What is Minimum Description Length?
- How do we use Minimum Description Length to merge a grammar?
- What does the prefix affix tree look like for the following sentences?
  - eat them here or there
  - eat them anywhere
- What does the affix tree look like after the best prefix merge?

Grammar merging
How can we measure description length?
- complicated rules are bad
- lots of rules are bad
- measure “derivation length”: alpha * size(rules) + derivationCost(rules, sentences)
How can we merge rules?
- extract a common prefix
- extract a common suffix
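The cost measure above can be sketched in a few lines of Python. The encoding (one unit per right-hand-side symbol, a derivation given as the list of rules used) and the value of alpha are assumptions for illustration; the rule names follow the later slides.

```python
# Sketch of the MDL score: alpha * size(rules) + derivationCost(rules, sentences).
# A grammar is a dict of rule name -> right-hand side (list of symbols);
# a derivation is the list of rules used to produce one sentence.

def description_length(rules, derivations, alpha=1.0):
    rule_size = sum(len(rhs) for rhs in rules.values())              # size(rules)
    deriv_cost = sum(len(rules[r]) for d in derivations for r in d)  # derivation length
    return alpha * rule_size + deriv_cost

# Before the merge: one flat rule per sentence.
before = {
    "r1": ["eat", "them", "here", "or", "there"],
    "r2": ["eat", "them", "anywhere"],
}
# After the prefix merge (rules r3-r5 from the later slide).
after = {
    "r3": ["eat", "them", "X1"],
    "r4": ["here", "or", "there"],
    "r5": ["anywhere"],
}
print(description_length(before, [["r1"], ["r2"]], alpha=3.0))             # 32.0
print(description_length(after, [["r3", "r4"], ["r3", "r5"]], alpha=3.0))  # 31.0
```

On this two-sentence corpus the merge only pays off when rules are weighted heavily (larger alpha) or when more sentences share the prefix, which is exactly the trade-off the formula expresses: merging shrinks the grammar but lengthens each derivation.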

Questions
- What is Constrained Best Fit?
- How does Bailey use multiple levels of representation to learn different senses of verbs?
- What is Minimum Description Length?
- How do we use Minimum Description Length to merge a grammar?
- What does the prefix affix tree look like for the following sentences?
  - eat them here or there
  - eat them anywhere
- What does the affix tree look like after the best prefix merge?

Affix Trees
- Data structures that help you figure out which merges are possible
- Each node in the tree represents a symbol, either terminal or non-terminal (called the “affix” in the code)
- Two kinds: prefix trees and suffix trees
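A prefix tree over rule right-hand sides can be sketched as a nested dict: each node keys on one symbol (the “affix”) and records which rules pass through it, so any node touched by two or more rules marks a candidate prefix merge. The data layout and function names here are illustrative assumptions, not the original code.

```python
# Build a prefix tree over the right-hand sides of grammar rules.
def build_prefix_tree(rules):
    root = {"children": {}, "rules": set()}
    for name, rhs in rules.items():
        node = root
        for sym in rhs:
            node = node["children"].setdefault(sym, {"children": {}, "rules": set()})
            node["rules"].add(name)  # this rule's RHS passes through this node
    return root

def shared_prefixes(node, prefix=()):
    """Yield every (prefix, rules) pair shared by at least two rules."""
    for sym, child in node["children"].items():
        if len(child["rules"]) >= 2:
            yield prefix + (sym,), child["rules"]
        yield from shared_prefixes(child, prefix + (sym,))

tree = build_prefix_tree({
    "r1": ["eat", "them", "here", "or", "there"],
    "r2": ["eat", "them", "anywhere"],
})
for prefix, names in shared_prefixes(tree):
    print(prefix, sorted(names))
# ('eat',) ['r1', 'r2']
# ('eat', 'them') ['r1', 'r2']
```

The longest shared prefix, ("eat", "them"), is the one the best prefix merge factors out on the next slide.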

Prefix Tree
r1: S → eat them here or there
r2: S → eat them anywhere
[Figure: prefix tree rooted at S, branching after “eat them”]

Prefix Merge
r3: S → eat them X1
r4: X1 → here or there
r5: X1 → anywhere
[Figure: prefix tree after the merge, with nodes eat, them, X1 labeled r3, the branch here/or/there labeled r4, and anywhere labeled r5]
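The merge itself can be sketched as: factor the longest common prefix of the two rules into one rule and push the differing suffixes into a fresh non-terminal (X1, as on the slide). The helper below is an illustrative assumption; the rule names follow the slide.

```python
# Merge two rules sharing a prefix: S -> <prefix> X1, plus one X1 rule
# for each differing suffix.
def prefix_merge(rhs_a, rhs_b, new_nt="X1"):
    k = 0  # length of the longest common prefix
    while k < min(len(rhs_a), len(rhs_b)) and rhs_a[k] == rhs_b[k]:
        k += 1
    return {
        "r3": rhs_a[:k] + [new_nt],  # S  -> eat them X1
        "r4": rhs_a[k:],             # X1 -> suffix of the first rule
        "r5": rhs_b[k:],             # X1 -> suffix of the second rule
    }

merged = prefix_merge(
    ["eat", "them", "here", "or", "there"],
    ["eat", "them", "anywhere"],
)
print(merged["r3"])  # ['eat', 'them', 'X1']
```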

Have a good Spring Break!!!