Summarized by In-Hee Lee


The Computational Nature of Language Learning and Evolution
6. Language Change: Multiple Languages
6.2 Example 1: A Three-Parameter System
Summarized by In-Hee Lee, 2009-07-31

[Figure: Overview of the learning system. The population at generation t consists of speakers of various languages (e.g., L1, L4, L2, L7) with sentence distributions (P1, P4, P2, P7). Sampled sentences are presented to the child population, each child running the TLA; the languages the children acquire (e.g., L4, L2, L2, L5) form the population at generation t + 1.]

A Learning System Triple
- G: a three-parameter system, yielding 8 possible languages.
- A: a memoryless algorithm, the TLA, with or without the single-value constraint and the greediness constraint.
- {Pi}: the sentence distribution of language Li, uniform over degree-0 sentences.
A finite sample of sentences (128) is presented to each learner.
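The triple can be sketched as a small simulation. Everything below is a hypothetical stand-in: the grammars are 3-bit parameter vectors, and a fixed random table replaces the real degree-0 parsability relation, which this sketch does not model.

```python
import itertools
import random

random.seed(0)
GRAMMARS = list(itertools.product([0, 1], repeat=3))  # L1..L8 as 3-bit parameter vectors
SENTENCES = range(20)  # toy sentence inventory

# Toy stand-in for the parsability relation: PARSABLE[g] is the set of
# sentences grammar g can parse (random but fixed; NOT the real degree-0 sets).
PARSABLE = {g: {s for s in SENTENCES if random.random() < 0.4} for g in GRAMMARS}

def tla_step(hypothesis, sentence):
    """One memoryless TLA update with +Single Value and +Greedy:
    on a parsing failure, flip one randomly chosen parameter and keep
    the new grammar only if it parses the offending sentence."""
    if sentence in PARSABLE[hypothesis]:
        return hypothesis  # sentence analyzed: no change
    i = random.randrange(3)  # single-value constraint: flip one parameter
    candidate = tuple(p ^ (j == i) for j, p in enumerate(hypothesis))
    # greediness constraint: adopt the candidate only if it succeeds
    return candidate if sentence in PARSABLE[candidate] else hypothesis

def learn(target, n_samples=128):
    """A learner hears n_samples sentences drawn uniformly from the
    target language (the {Pi} of the triple), updating after each one."""
    h = random.choice(GRAMMARS)  # arbitrary initial hypothesis
    examples = sorted(PARSABLE[target])
    for _ in range(n_samples):
        h = tla_step(h, random.choice(examples))
    return h
```

Running `learn` over a whole population of children, with sentences drawn from the current population mix rather than a single target, gives one generation of the dynamical system sketched in the figure above.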

6.2.1 Homogeneous Initial Populations
Four variations on the learning algorithm:
1. TLA(+Single Value, +Greedy)
2. TLA(–Single Value, +Greedy)
3, 4. TLA(±Single Value, –Greedy)

6.2.1 Homogeneous Initial Populations
Variation 1: TLA(+Single Value, +Greedy)
- All +V2 languages are relatively stable.
- Populations speaking a –V2 language all drift to +V2 languages.
- The rates at which the linguistic composition changes vary from language to language.
- A homogeneous population can split into different linguistic groups.
- The observed instability and drifts arise, at least in part, from the learning algorithm.

[Figure: Comparing the rates of change (drift) for L1 → L2 and for L5 → L2.]

6.2.1 Homogeneous Initial Populations
Variation 2: TLA(–Single Value, +Greedy)
- All populations drift to a mixture of only +V2 languages (L2, L4, L6, L8).
- All populations eventually drift to a similar population mixture. (Why this happens is left as an open question.)

6.2.1 Homogeneous Initial Populations
Variations 3, 4: TLA(±Single Value, –Greedy)
Dropping the greediness constraint:
3. Single-value constraint without greediness: choose any new state that is at most one parameter away from the current one.
4. Without both the single-value and greediness constraints: choose any new state at each step.
Both yield dynamical systems that arrive at the same population mixture after 30 generations. The final mixture contains all languages in significant proportion; no language has disappeared.
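The four variants differ only in which candidate grammars a learner may move to after a parsing failure. A minimal sketch of that candidate set, again with grammars as 3-bit tuples:

```python
import itertools

GRAMMARS = list(itertools.product([0, 1], repeat=3))  # the 8 grammars

def candidate_moves(hypothesis, single_value):
    """Grammars a learner may consider after a parsing failure.
    With the single-value constraint, only grammars differing in exactly
    one parameter are reachable; without it, any other grammar is."""
    def hamming(a, b):
        return sum(x != y for x, y in zip(a, b))
    if single_value:
        return [g for g in GRAMMARS if hamming(g, hypothesis) == 1]
    return [g for g in GRAMMARS if g != hypothesis]
```

Greediness is then a filter on these candidates: a greedy learner adopts a candidate only if it parses the offending sentence, while a non-greedy learner may adopt any candidate regardless.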

[Figure: Trajectories with the single-step constraint vs. without it.]
Both yield dynamical systems that arrive at the same population mixture after 30 generations, but the paths are slightly different.

6.2.2 Modeling Diachronic Trajectories
Diachronic population trajectories often take an "S-shape." From the point of view of this book, the trajectories might or might not be S-shaped, and might have varying rates of change.
Other factors that affect evolutionary trajectories:
- Maturation time: the number of sentences available to the learner before it internalizes its adult grammar.
- Probability distribution: the distribution according to which sentences are presented to the learner.

6.2.2 Modeling Diachronic Trajectories
The effect of maturation time, or sample size: TLA(–Single Value, +Greedy), with sample sizes 8, 16, 32, 64, 128, 256.
- The initial rate of change is highest when the maturation time is smallest.
- The stable population composition depends on the maturation time.
- The trajectories do not have an S-shaped curve.
[Figure: Proportion of L2 speakers over time as the sample size increases.]
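The dependence on maturation time can be illustrated with a toy two-language memoryless learner: the probability of having switched grammars when learning stops depends on how many sentences were heard, so the resulting population composition varies with sample size. The transition probabilities below are illustrative values, not from the book.

```python
def acquisition_prob(a, b, n):
    """Probability a two-state memoryless learner ends in L2 after n
    sentences, starting in L1, when each sentence moves it L1 -> L2
    with probability a and L2 -> L1 with probability b."""
    p = 0.0  # P(current hypothesis is L2)
    for _ in range(n):
        p = p * (1.0 - b) + (1.0 - p) * a
    return p

# Maturation time = number of sentences heard before the grammar is fixed.
# Short maturation leaves the learner far from the chain's limit a/(a+b),
# so different sample sizes yield different stable compositions.
for n in (8, 16, 32, 64, 128, 256):
    print(n, round(acquisition_prob(0.05, 0.01, n), 3))
```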

6.2.2 Modeling Diachronic Trajectories
The effect of sentence distributions: TLA(+Single Value, +Greedy).
An example focusing on the interaction between L1 and L2.
[Figure: Sentence distributions over L1 and L2, parameterized by p and 1 – p.]
- We can drive the population in a certain direction by changing the sentence distributions.
- We can examine the conditions needed to generate such change.

6.2.3 Nonhomogeneous Populations: Phase-Space Plots
A phase-space plot is based on the relationship between the state of the population in one generation and the next.
π(t): an 8-dimensional vector giving the proportion of users of each language.
[Figure: Phase-space plot for TLA(–Single Value, +Greedy), projected onto the (L1, L2) plane.]
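Such a trajectory can be sketched by iterating a population update map and recording the (L1, L2) projection each generation. The acquisition matrix below is a toy stand-in with made-up values; the book's actual map, derived from the TLA with finite samples, is generally nonlinear in the population state.

```python
import random

random.seed(1)
K = 8  # languages L1..L8

# Toy row-stochastic acquisition matrix: A[i][j] stands in for the
# probability that a child of an Li speaker acquires Lj.
A = []
for i in range(K):
    row = [random.random() for _ in range(K)]
    total = sum(row)
    A.append([x / total for x in row])

def step(pi):
    """One generation of the update map: pi(t+1)[j] = sum_i pi(t)[i] * A[i][j]."""
    return [sum(pi[i] * A[i][j] for i in range(K)) for j in range(K)]

def trajectory(pi0, generations=30):
    """Record the (L1, L2) projection of the population state each
    generation, as plotted in the phase-space figures."""
    points, pi = [], pi0
    for _ in range(generations):
        points.append((pi[0], pi[1]))
        pi = step(pi)
    return points

# Homogeneous initial population of L1 speakers:
traj = trajectory([1.0] + [0.0] * 7)
```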

6.2.3 Nonhomogeneous Populations: Phase-Space Plots
Stability issues:
- Many initial conditions yield trajectories that seem to converge to a single point in state space.
- What are the conditions for stability?
- How many fixed points are there? How can we solve for them?
- We need equations that characterize the stable population mix.

6.2.3 Nonhomogeneous Populations: Phase-Space Plots
The equations that characterize the stable population mix (equations (1)–(3) from Section 6.1; not reproduced in this transcript).

6.2.3 Nonhomogeneous Populations: Phase-Space Plots
By letting π(t + 1) = π(t), we obtain the equations for the fixed points, i.e., the stable population mixes.
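Imposing π(t + 1) = π(t) can be illustrated numerically. As a minimal sketch, assume a linear update π(t + 1) = π(t)A with a row-stochastic acquisition matrix A (illustrative values, not the book's equations); the fixed point is then found by power iteration.

```python
def fixed_point(A, tol=1e-12, max_iter=10000):
    """Solve pi = pi * A by power iteration, i.e., impose
    pi(t + 1) = pi(t) on the linearized population update."""
    k = len(A)
    pi = [1.0 / k] * k  # start from the uniform mixture
    for _ in range(max_iter):
        nxt = [sum(pi[i] * A[i][j] for i in range(k)) for j in range(k)]
        if max(abs(x - y) for x, y in zip(pi, nxt)) < tol:
            return nxt
        pi = nxt
    return pi

# Two-language illustration: children of Li speakers acquire Lj with
# probability A[i][j] (made-up values).
A = [[0.9, 0.1],
     [0.3, 0.7]]
pi_star = fixed_point(A)  # converges to approximately [0.75, 0.25]
```

For this 2x2 example the fixed point can be checked by hand: π1·0.9 + π2·0.3 = π1 together with π1 + π2 = 1 gives π1 = 3·π2, hence the mixture (0.75, 0.25).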