Modelling Language Evolution, Lecture 5: Iterated Learning. Simon Kirby, University of Edinburgh, Language Evolution & Computation Research Unit.



Models so far…
- Models of learning language
- Models of evolving the ability to learn language
- Models of differing abilities to learn differing languages
What do these have in common? The language comes from "outside" the linguistic agent.

Two kinds of models:
- Neural network: training sentences in, weight settings out
- Language Acquisition Device (LAD): Primary Linguistic Data (PLD) in, Grammatical Competence (GC) out
What can be learned? What can evolve?

[Diagram: a chain of generations, each LAD turning PLD into GC]

A new kind of model: Iterated Learning

[Diagram: a chain of generations in which each agent's LAD turns PLD into GC, and that GC in turn produces the PLD for the next agent]

What kind of language evolves?

What can Iterated Learning explain? My hypothesis: some functional linguistic structure emerges inevitably from the process of iterated learning, without the need for natural selection or explicit functional pressure. First target structure: recursive compositionality: the meaning of an utterance is some function of the meanings of its parts and the way they are put together.

  Compositional                  Holistic
  walked                         went
  I greet you                    Hi
  I thought I saw a pussy cat    chutter

The agent

[Diagram: an agent (simulated individual) takes in meaning-signal pairs (utterances from its parent); a learning algorithm builds an internal linguistic representation; a production algorithm, given meanings generated by the environment, sends meaning-signal pairs out to the next generation]

What will the agents talk about? We need some simple but structured "world". Meanings are simple predicate logic expressions:
- loves(mary, john)
- admires(gavin, heather)
- thinks(mary, likes(john, heather))
- knows(heather, thinks(mary, likes(john, heather)))
Agents string random characters together to form utterances.
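These nested predicate-logic meanings can be represented as ordinary nested tuples. A minimal sketch of a random meaning generator; the predicate and individual inventories and the recursion probability here are illustrative choices, not the ones used in the original simulation:

```python
import random

PREDICATES = ["loves", "admires", "likes"]   # take two individuals
EMBEDDING = ["thinks", "knows"]              # take an individual and another meaning
INDIVIDUALS = ["mary", "john", "gavin", "heather"]

def random_meaning(depth=0, max_depth=3):
    """Generate a random meaning as a nested tuple, e.g.
    ('knows', 'heather', ('thinks', 'mary', ('likes', 'john', 'heather')))."""
    if depth < max_depth and random.random() < 0.3:
        # Recursive case: an attitude predicate embedding another meaning.
        pred = random.choice(EMBEDDING)
        return (pred, random.choice(INDIVIDUALS), random_meaning(depth + 1, max_depth))
    # Base case: a simple two-place predicate over distinct individuals.
    pred = random.choice(PREDICATES)
    a, b = random.sample(INDIVIDUALS, 2)
    return (pred, a, b)
```

Capping the depth keeps the meaning space finite enough to sample from while still exercising recursion.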

How do agents learn? Not with neural networks: in this model we are interested in more traditional, symbolic grammars. Learners try to form a grammar that is consistent with the primary linguistic data they hear. Fundamental principle: learning is compression. Learners try to fit the data they hear, but also generalise; learning is a trade-off between the two.

Two steps to learning:
1. INCORPORATION (for each sentence heard)
2. GENERALISATION (whenever possible)
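The generalisation step can be sketched concretely. This is a toy, not the grammar induction used in the lecture's simulation: it assumes meanings are flat predicate tuples and signals are space-separated token strings, and it merges any two remembered pairs that differ in exactly one meaning slot and exactly one signal token into a rule with a variable slot:

```python
import itertools

VAR = "?x"  # placeholder for the variable slot in a generalised rule

def generalise(pairs):
    """Return (rules, lexicon). Each rule is a (meaning template, signal
    template) pair containing VAR; the lexicon maps the varying meaning
    atoms to the signal tokens that expressed them."""
    rules, lexicon = [], {}
    for (m1, s1), (m2, s2) in itertools.combinations(pairs, 2):
        t1, t2 = s1.split(), s2.split()
        if len(m1) != len(m2) or len(t1) != len(t2):
            continue
        m_diff = [i for i in range(len(m1)) if m1[i] != m2[i]]
        s_diff = [i for i in range(len(t1)) if t1[i] != t2[i]]
        if len(m_diff) == 1 and len(s_diff) == 1:
            mi, si = m_diff[0], s_diff[0]
            rules.append((m1[:mi] + (VAR,) + m1[mi + 1:],
                          tuple(t1[:si] + [VAR] + t1[si + 1:])))
            lexicon[m1[mi]] = t1[si]
            lexicon[m2[mi]] = t2[si]
    return rules, lexicon

# Two incorporated pairs that share everything except the loved one:
pairs = [(("loves", "mary", "john"), "wp tej gj"),
         (("loves", "mary", "gavin"), "wp tej qp")]
rules, lexicon = generalise(pairs)
```

One rule plus two lexical entries is smaller than the two holistic pairs it replaces, which is the "learning is compression" trade-off in miniature.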

A simulation run
1. Start with one learner and one adult speaker, neither of which has a grammar.
2. Choose a meaning at random.
3. Get the speaker to produce a signal for that meaning (it may need to "invent" a random string).
4. Give the meaning-signal pair to the learner.
5. Repeat steps 2-4 150 times.
6. Delete the speaker.
7. Make the learner the new speaker.
8. Introduce a new learner (with no initial grammar).
9. Repeat steps 2-8 thousands of times.
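The population turnover in these steps can be sketched as follows. The agents here are bare meaning-to-signal dictionaries with no grammar induction at all, so this shows only the loop structure, not the learning model:

```python
import random
import string

def invent():
    """Invent a random signal string for an unknown meaning (step 3)."""
    return "".join(random.choice(string.ascii_lowercase)
                   for _ in range(random.randint(1, 5)))

def iterated_learning(meanings, generations=100, exposures=150):
    """Run the speaker/learner chain and return the final adult's language."""
    speaker = {}                                   # step 1: no grammar
    for _ in range(generations):                   # step 9
        learner = {}                               # step 8: fresh learner
        for _ in range(exposures):                 # step 5
            meaning = random.choice(meanings)      # step 2
            signal = speaker.get(meaning) or invent()  # step 3
            speaker[meaning] = signal
            learner[meaning] = signal              # step 4
        speaker = learner                          # steps 6-7: turnover
    return speaker
```

Because each learner only ever hears a random sample of meanings, any meaning not sampled in a generation is lost: the bottleneck is already visible even in this stripped-down version.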

Results 1a: initial stages. Initially, speakers have no language, so they "invent" random strings of characters. A protolanguage emerges for some meanings, but with no structure. These are holistic expressions:
1. ldg "Mary admires John"
2. xkq "Mary loves John"
3. gj "Mary admires Gavin"
4. axk "John admires Gavin"
5. gb "John knows that Mary knows that John admires Gavin"

Results 1b: many generations later…
6. gj h f tej m (John Mary admires: "Mary admires John")
7. gj h f tej wp (John Mary loves: "Mary loves John")
8. gj qp f tej m (Gavin Mary admires: "Mary admires Gavin")
9. gj qp f h m (Gavin John admires: "John admires Gavin")
10. i h u i tej u gj qp f h m (John knows Mary knows Gavin John admires: "John knows that Mary knows that John admires Gavin")

Quantitative results: languages evolve

What’s going on? There is no biological evolution in the ILM. There isn’t even any communication; there is no notion of function in the model at all. So why are structured languages evolving? Hypothesis: the languages themselves are adapting to the conditions of the ILM so that they are learnable. The agents never see all the meanings, so only rules that are generalisable from limited exposure are stable.

Language has to fit through a narrow bottleneck. This has profound implications for the structure of language.

[Diagram: linguistic competence becomes linguistic performance via production, and performance becomes the next competence via learning]

A nice and simple model…
Language:
- Meanings: 8-bit binary numbers
- Signals: 8-bit binary numbers
Agents:
- 8x8x8 neural network (not an SRN)
- Learns to associate signals with meanings
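A minimal sketch of such an 8x8x8 mapping in plain Python. Training is omitted and the weights are random, so this only illustrates the architecture (8 inputs, 8 hidden units, 8 outputs), not the learning rule used in the lecture's model:

```python
import math
import random

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def make_layer(n_in, n_out):
    """A layer is one weight row per output unit (biases omitted for brevity)."""
    return [[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)]

def forward(layer, inputs):
    return [sigmoid(sum(w * x for w, x in zip(row, inputs))) for row in layer]

def produce(weights, meaning):
    """Map an 8-bit meaning vector to an 8-bit signal, thresholding at 0.5."""
    hidden = forward(weights["hidden"], meaning)
    output = forward(weights["output"], hidden)
    return [1 if o > 0.5 else 0 for o in output]

net = {"hidden": make_layer(8, 8), "output": make_layer(8, 8)}
signal = produce(net, [1, 0, 1, 1, 0, 0, 1, 0])
```

A trained agent would adjust the two weight matrices so that observed meaning-signal pairs are reproduced; any standard supervised rule would do for the sketch.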

Bottleneck. There is only one parameter in this model, the bottleneck: the number of meaning-signal pairs (randomly chosen) given to the next generation. In each simulation, we can measure two things:
- Expressivity: the proportion of the meaning space to which an adult agent can give a unique signal
- Instability: how different each generation's language is from that of the previous generation
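Both measures are easy to state as code, assuming for this sketch that a language is simply a dictionary from meanings to signals (a simplification of the network model):

```python
def expressivity(language, meanings):
    """Proportion of the meaning space receiving a unique signal."""
    signals = [language[m] for m in meanings if m in language]
    unique = [s for s in signals if signals.count(s) == 1]
    return len(unique) / len(meanings)

def instability(parent, child, meanings):
    """Proportion of meanings the two generations express differently."""
    differ = sum(1 for m in meanings if parent.get(m) != child.get(m))
    return differ / len(meanings)
```

Meanings that get no signal, or that share a signal with another meaning, both count against expressivity; meanings unexpressed by either generation count as agreement for instability. Other conventions are possible, and the exact ones used in the original simulations may differ.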

Results Bottleneck too tight: unstable and inexpressive language

Results Bottleneck too wide: fairly stable and expressive eventually

Results Medium bottleneck: maximal stability and expressivity

Adaptation. Language is evolving to be learnable. Structure emerges in the meaning-signal mapping: meanings and signals come to be related by simple rules of bit flipping and re-ordering, and these rules can be learned from a subset of the pairs. Despite the hugely different model, this is a very similar result to the earlier simulation.
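A mapping built from bit flipping and re-ordering can be written down directly. The particular permutation and flip pattern below are invented for illustration; the point is that the whole mapping is specified by two short rule vectors, which is what makes it learnable from a subset:

```python
def apply_rules(meaning, order, flips):
    """Signal bit i is meaning bit order[i], flipped where flips[i] is 1."""
    return [meaning[order[i]] ^ flips[i] for i in range(len(meaning))]

order = [7, 6, 5, 4, 3, 2, 1, 0]   # re-ordering rule: reverse the bits
flips = [1, 0, 0, 0, 0, 0, 0, 1]   # flipping rule: invert first and last output bits
signal = apply_rules([1, 0, 1, 1, 0, 0, 1, 0], order, flips)
# signal == [1, 1, 0, 0, 1, 1, 0, 0]
```

Sixteen rule entries determine the signal for all 256 meanings, whereas a holistic code would need all 256 pairs stored explicitly.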

Summary
- Language is learned by individuals with innate learning biases.
- The language data an individual hears is itself the result of learning.
- Languages adapt through iterated learning in response to our innate biases.
There’s more! Our learning biases adapt through biological evolution in response to the language we use.
Tomorrow… use a simulation package to look at “grounding” models in an environment.

[Diagram: the interaction of individual learning, cultural evolution, and biological evolution]