Conceptual Puzzles & Theoretical Elegance (principledness, naturalness)

Need to add cognition: not just numbers, experiments, and algorithms.

- Multiple representations are tapped differentially by different tasks/conditions (Boersma, 3:19 pm); especially important in the study of gradience
- Probability distribution family* (*but see distribution-free methods, e.g. PAC learning)
- Rankings: natural for (partial) orders? (MaxEnt: 1)
- Outputs: natural for induction/generalization? (MaxEnt maximizes entropy)
- Bias: Bayesian inference
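The parenthetical "MaxEnt maximizes entropy" can be checked numerically with a toy example (not from the lecture): among all distributions over three outcomes whose mean is pinned at 1, the maximum-entropy distribution is the uniform one.

```python
import math

def entropy(ps):
    """Shannon entropy in nats; terms with p = 0 contribute 0."""
    return -sum(p * math.log(p) for p in ps if p > 0)

# Every distribution over outcomes {0, 1, 2} with mean exactly 1
# has the form (p0, 1 - 2*p0, p0); sweep p0 over a fine grid.
grid = [i / 1000 for i in range(1, 500)]
best = max(((p0, 1 - 2 * p0, p0) for p0 in grid), key=entropy)

print(best)  # close to (1/3, 1/3, 1/3): uniform maximizes entropy under the mean constraint
```

The same logic underlies MaxEnt grammar: subject to matching the expected constraint violations, the least committal (highest-entropy) distribution is chosen.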

Constraint interaction through numbers

The Counting Catastrophe
- WSP [σH main stress]; M-PARSE
- Numerical/HG typology: *σH σH ⋯ σH iff #σH’s > n, for all n
- OT typology: *σH σH ⋯ σH iff #σH’s > n, for n ∈ {1, ∞}

Variation through Losers
- Relative Harmony predicts relative frequency
- Harmony Theory/MaxEnt and HG; also Pater’s relativized H in HG (yesterday)

The Epenthesis Catastrophe
- MAX ≫ DEP ⇒ /batak/ → bataka, but /batak/ → batakatatata ≻ /batak/ → bata
- Fate of harmonically bounded forms?
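The claim "Relative Harmony predicts relative frequency," and the fate of harmonically bounded forms, can be sketched in MaxEnt terms. The weights and violation profiles below are illustrative assumptions, not figures from the lecture: each candidate's probability is proportional to the exponential of its harmony (the negative weighted violation sum), so a candidate like batakata, harmonically bounded by bataka, still receives nonzero probability even though it can never win under any strict OT ranking.

```python
import math

# Hypothetical constraint weights and violation counts for input /batak/
weights = {"MAX": 3.0, "DEP": 1.0}
violations = {
    "bataka":   {"MAX": 0, "DEP": 1},  # one epenthetic vowel
    "bata":     {"MAX": 1, "DEP": 0},  # final consonant deleted
    "batakata": {"MAX": 0, "DEP": 3},  # harmonically bounded by 'bataka'
}

def harmony(cand):
    """Harmony = negative weighted sum of constraint violations."""
    return -sum(weights[c] * v for c, v in violations[cand].items())

# MaxEnt: P(candidate) proportional to exp(Harmony)
Z = sum(math.exp(harmony(c)) for c in violations)
probs = {c: math.exp(harmony(c)) / Z for c in violations}
print(probs)  # higher harmony -> higher relative frequency; bounded forms get p > 0
```

Under these toy weights, bataka (harmony -1) is most frequent, while bata and batakata (harmony -3 each) split the remaining probability mass.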

Variation through winners of multiple rankings
- Number of rankings producing an output predicts relative frequency
- All other approaches (?)

Bogus Rankings Catastrophe
- A, B conflict: proportion of A-obeying outputs (A ≫ B)
  - {A; B}: 1/2
- X bogus; no (relevant) violations
  - {X ≫ A; B}: 1/3; {A ≫ X; B}: 2/3; {A ≫ X, X ≫ B}: 1
- Ordinal, not just quantitative, effects
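The bogus-ranking proportions above can be reproduced by brute force. Assuming two conflicting constraints A and B (the A-obeying candidate wins exactly when A outranks B) plus a bogus constraint X that assigns no relevant violations, count the total rankings consistent with each stipulated partial order:

```python
from itertools import permutations
from fractions import Fraction

def a_wins_fraction(required):
    """Fraction of total rankings of {A, X, B}, consistent with the
    required (higher, lower) pairs, in which A outranks B, i.e. in
    which the A-obeying candidate wins (bogus X decides nothing)."""
    consistent = [r for r in permutations("AXB")
                  if all(r.index(hi) < r.index(lo) for hi, lo in required)]
    wins = [r for r in consistent if r.index("A") < r.index("B")]
    return Fraction(len(wins), len(consistent))

print(a_wins_fraction([]))                        # 1/2
print(a_wins_fraction([("X", "A")]))              # 1/3
print(a_wins_fraction([("A", "X")]))              # 2/3
print(a_wins_fraction([("A", "X"), ("X", "B")]))  # 1
```

Adding the inert constraint X changes the predicted frequency of the A-obeying output, which is the catastrophe: a constraint with no violations should not matter.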

Analogical Combinatorial Catastrophe

Theorem. Lexical Similarity Theory cannot explain Basic Combinatorial Generalization: Lex = {ta, ki} ⇒ ti/ka ≻ pa/tu

Smolensky, P. 2006. On theoretical facts and empirical abstractions. In Wondering at the natural fecundity of things: Essays in honor of Alan Prince, eds. E. Bakovic, J. Ito, and J. McCarthy. Linguistics Research Center [http://repositories.cdlib.org/lrc/prince/13] & ROA.

Role of puzzles, thought experiments & generality of proposed solutions