Ch 6. Language Change: Multiple Languages, 6.3 ~ 6.4
The Computational Nature of Language Learning and Evolution
Summarized by Kim, S.-J.
Biointelligence Laboratory, Seoul National University
http://bi.snu.ac.kr/

Contents
6.3 Example 2: Syntactic change in French
  6.3.1 The parametric subspace and data
  6.3.2 The case of diachronic syntactic change in French
  6.3.3 Some dynamical system simulations
6.4 Conclusion

The parametric subspace and data

We consider a different parametric linguistic system, studied by Clark and Roberts (1993). The historical context in which Clark and Roberts advanced their linguistic proposal is the evolution of Modern French from Old French.
Consider a syntactic space requiring five parameters; five binary parameters define a space of 32 grammars. Each grammar in this parameterized system can be represented by a string of five bits giving the values of p1, ..., p5.
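As a quick illustration (ours, not the book's), the sketch below enumerates this space; indexing grammars by their bit strings is the only convention carried over from the text.

```python
# A minimal sketch of the five-parameter space: each grammar is a 5-bit
# string over p1..p5, giving 2**5 = 32 grammars in lexicographic order.
from itertools import product

grammars = [''.join(map(str, bits)) for bits in product((0, 1), repeat=5)]

print(len(grammars))       # 32
print(grammars[0b11011])   # '11011', the Old French setting used later
```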

The parametric subspace and data

For the purpose of explaining how Old French evolved into Modern French, Clark and Roberts consider a list of key sentences, provided in Table 6.4. The parameter settings required to generate each sentence are given in brackets; an asterisk is a "doesn't matter" value, and an "X" stands for any phrase.

The parametric subspace and data

For example, the sentence form "adv V S" is generated by all grammars that have case assignment under government and verb-second movement; the other parameters can be set to any value, so there are eight different grammars that can generate this sentence. As a result, the grammars do not all have unique extensional properties, i.e., some pairs of grammars generate the same set of sentences and are weakly equivalent in this setting.
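A short sketch of the asterisk convention (the pattern string '1*1**' is a hypothetical stand-in; Table 6.4 pairs each sentence type with its actual setting):

```python
# The "doesn't matter" (*) convention: a sentence type is tied to a
# parameter pattern, and any grammar agreeing with the fixed bits can
# generate it. Two fixed bits leave 2**3 = 8 compatible grammars, as
# with "adv V S" in the text.
from itertools import product

grammars = [''.join(map(str, bits)) for bits in product((0, 1), repeat=5)]

def matches(grammar: str, pattern: str) -> bool:
    """True if the grammar agrees with every fixed (non-'*') bit."""
    return all(p == '*' or g == p for g, p in zip(grammar, pattern))

print(len([g for g in grammars if matches(g, '1*1**')]))  # 8
```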

Some dynamical system simulations

Using a TLA (Triggering Learning Algorithm)-like algorithm as a starting point, we can obtain a dynamical-system characterization of the linguistic population in this parametric space. For illustrative purposes, we show the results of two simulations conducted with such dynamical systems:
Homogeneous populations (initial state: Old French)
Heterogeneous populations (mixtures)
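Before the results, here is a deliberately simplified sketch of the generational update. We assume each child learns from a single adult drawn from the current population, so that a matrix Q with entries Q[i][j] = P(child acquires grammar i | adult speaks grammar j) makes the update linear; Niyogi's actual learner-derived dynamics need not take this form.

```python
# Simplified population dynamics (an assumption, not the book's exact
# update): with one adult teacher per child, the distribution s over
# grammars evolves as s <- Q @ s each generation.
import numpy as np

def evolve(s0: np.ndarray, Q: np.ndarray, generations: int) -> np.ndarray:
    """Iterate the generational update and return the final state."""
    s = s0.copy()
    for _ in range(generations):
        s = Q @ s
    return s

# Toy 2-grammar example with made-up misconvergence rates: each column
# gives the acquisition probabilities for children of one speaker type.
Q = np.array([[0.9, 0.2],
              [0.1, 0.8]])
s0 = np.array([1.0, 0.0])   # homogeneous start
print(evolve(s0, Q, 20))    # approaches the fixed point [2/3, 1/3]
```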

Homogeneous populations

Our goal is to see whether misconvergence by learners can, by itself, be a sufficient reason to drive a population from speakers of Old French to speakers of Modern French. We start the simulation with a homogeneous population speaking Old French (parameter setting 11011).

Observations: It is worth noting the weak-equivalence and subset relations between the different grammars to which the population converges. The grammar 11011 is in a subset relation with 11111; this explains why, after a few generations, most of the population switches to either 11111 or 11110 (the superset grammars). An interesting feature of the simulation is that 15% of the population eventually acquires grammar 11110, i.e., loses the V2 parameter setting. Recall that for such algorithms the V2 parameter was very stable in the previously conducted simulations using a three-parameter system.

Homogeneous populations

It is observed that in one generation:
15% of the children converge to grammar 01011
18% to grammar 01111
33% to grammar 11011
26% to grammar 11111
Notice that within a few generations, the speakers of 11011 and 01011 have dropped out altogether; most of the population speaks language 11111 (46%) or 01111 (27%).
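In the sketch's terms (same single-teacher assumption), the homogeneous start is a point mass on Old French:

```python
# Homogeneous initial condition in the 32-grammar space: all mass on
# Old French, '11011' (index 0b11011 == 27 in lexicographic order).
import numpy as np

s0 = np.zeros(32)
s0[0b11011] = 1.0
# Given a learner-derived 32x32 transition matrix Q, evolve(s0, Q, t)
# from the earlier sketch would trace the trajectory reported above.
```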

Heterogeneous populations

In order to more closely duplicate the historically observed trajectory, we start the simulations with an initial condition that mixes two sources: data from Old French and data from Modern French.

Observations: Even small amounts of data from a Modern French source cause a disproportionate change in the V2 parameter setting. Suppose the learner is exposed to sentences of which 90% are generated by the grammar of Old French (11011) and 10% by the grammar of Modern French (10100). The result shows that, within one generation, 22% of the learners converge to grammar 11110 and 78% to grammar 11111; the new population then remains stable forever. If the dynamical system is allowed to evolve, it ends up in one of the two states 11111 or 11110, essentially because of the subset relations these languages have with other languages in the system.
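Under the same simplified update, only the initial condition changes (the 90/10 split comes from the text; everything else is our assumption):

```python
# Heterogeneous (contact) initial condition: input mixes Old French
# ('11011') and Modern French ('10100') sources 90/10.
import numpy as np

s0 = np.zeros(32)
s0[0b11011] = 0.9   # Old French source
s0[0b10100] = 0.1   # Modern French source
# One update, Q @ s0, would show the kind of one-generation shift
# toward 11111 and 11110 reported above.
```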

Heterogeneous populations

Figure 6.11 shows the proportion of speakers who have lost V2 after one generation, as a function of the proportion of sentences from the Modern French source. For small values of this proportion, the slope of the curve is greater than 1: there is a greater tendency for speakers to lose V2 than to retain it, so a small change in the input provided to children can drive the system toward a larger change.
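A curve like Figure 6.11's could be traced in the sketch as below. We assume, for illustration only, that the last bit encodes the V2 parameter (consistent with 11110 vs. 11111 above and with Modern French 10100); the book derives the actual dynamics from the learner.

```python
# Sweep the Modern French share of the input mixture and record the
# one-generation proportion of V2 loss, assuming the last bit is V2.
import numpy as np

def v2_loss_after_one_generation(Q: np.ndarray, alpha: float) -> float:
    """Share of learners acquiring a non-V2 grammar from a mixture of
    Old French (1 - alpha) and Modern French (alpha) input."""
    s0 = np.zeros(32)
    s0[0b11011] = 1.0 - alpha
    s0[0b10100] = alpha
    s1 = Q @ s0
    non_v2 = [i for i in range(32) if i % 2 == 0]  # last bit == 0
    return float(s1[non_v2].sum())

# curve = [v2_loss_after_one_generation(Q, a) for a in np.linspace(0, 0.2, 21)]
```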

General insights and future directions

We have considered two different scenarios for a plausible change of a population of Old French speakers into one of Modern French speakers. The first scenario embodies the hypothesis that the change is internally driven, with misconverging learners changing the linguistic composition over generational time. The second follows the hypothesis that the change is driven by language contact, with two different linguistic types in contact with each other and the population drifting gradually from one to the other. Neither simulation is able to replicate the complete change from Old to Modern French, but both scenarios compel the historical linguist to articulate explanations for change in a precise manner.

Conclusions

In this chapter, we considered the general n-language setting, where n different linguistic types are potentially present in the population at all times. To demonstrate concretely that grammatical dynamical systems need not be impossibly difficult to simulate, we showed explicitly how to transform parameterized theories and memoryless learning algorithms into dynamical systems.