Modelling Language Evolution Lecture 3: Evolving Syntax

Modelling Language Evolution Lecture 3: Evolving Syntax
Simon Kirby, University of Edinburgh, Language Evolution & Computation Research Unit

Evolving the ability to learn syntax (Batali 1994)
- A “standard” recurrent network does not seem to be able to learn syntax without some help.
- Elman provides this “help” via incremental memory: the network comes pre-set up to help it learn syntax, i.e. our model of an individual is born with a working memory that grows over time.
- Does this correspond to an innate prespecification for language learning?
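The incremental-memory idea can be sketched in code. The following is a minimal, illustrative numpy version, not Elman's or Batali's actual implementation; the class name `SRN`, the weight scales, and the periodic context-blanking schedule are assumptions made for the sketch.

```python
import numpy as np

class SRN:
    """Minimal Elman-style simple recurrent network (illustrative sketch)."""

    def __init__(self, n_in, n_hidden, n_out, rng=None):
        rng = np.random.default_rng(0) if rng is None else rng
        self.W_ih = rng.normal(0, 0.1, (n_hidden, n_in))      # input -> hidden
        self.W_ch = rng.normal(0, 0.1, (n_hidden, n_hidden))  # context -> hidden
        self.W_ho = rng.normal(0, 0.1, (n_out, n_hidden))     # hidden -> output
        self.context = np.zeros(n_hidden)

    def step(self, x):
        """One time step: the hidden state is copied into the context units."""
        h = np.tanh(self.W_ih @ x + self.W_ch @ self.context)
        self.context = h.copy()
        return 1.0 / (1.0 + np.exp(-(self.W_ho @ h)))         # sigmoid output


def run(net, sequence, reset_interval):
    """Process a sequence, blanking the context units every `reset_interval`
    steps. Elman's incremental memory amounts to starting training with a
    small interval and letting it grow as training proceeds."""
    outputs = []
    for t, x in enumerate(sequence):
        if reset_interval and t % reset_interval == 0:
            net.context[:] = 0.0        # wipe the "working memory"
        outputs.append(net.step(x))
    return outputs
```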

Where do innate abilities come from?
- If an organism has some innate predisposition, and that predisposition is functional, how do we explain it?
- Darwinian natural selection seems appropriate.
- Could we model natural selection? Can we evolve a syntax learner (as opposed to building one by hand)?

What things about a network could be innate?
Many features of networks could be thought of as innately determined:
- The length of time before context units are blanked
- The shape of the activation function
- The number of nodes in the hidden layer
- …
Batali suggests: the initial connection weights. Normally these are random, but what if they were specified by genes?

How to model an organism
[Figure: a genotype (a vector of real-valued genes, e.g. 0.2, -1.3, 0.05, 0.9, -0.5, 0.001, 0.1) develops into a phenotype]
The model has genes, which are expressed as a phenotype. The phenotype is simply the initial state of a network (before learning).
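A minimal sketch of this genotype-to-phenotype mapping, assuming the genome is simply a flat list of real numbers reshaped into the network's initial weight matrices (the function name `develop` and the example shapes are illustrative):

```python
import numpy as np

def develop(genotype, shapes):
    """Decode a flat genome of real-valued genes into initial weight
    matrices; the phenotype is just the network before any learning.
    `shapes` lists the weight-matrix shapes, e.g. [(2, 2), (1, 3)]."""
    weights, i = [], 0
    for shape in shapes:
        n = int(np.prod(shape))
        weights.append(np.asarray(genotype[i:i + n], dtype=float).reshape(shape))
        i += n
    return weights

# Example: a 7-gene genome decoded into two small weight matrices.
genes = [0.2, -1.3, 0.05, 0.9, -0.5, 0.001, 0.1]
initial_weights = develop(genes, [(2, 2), (1, 3)])
```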

How to evolve organisms
Crucial aspects of evolution:
- A population of organisms (with varying phenotypes)
- A task which they are trying to succeed at
- A measure of how fit they are at this task
- A way of selecting the fittest
- A way of allowing the genes of the fittest to survive
- A mechanism for introducing variation into the gene pool
Various techniques exist to model all of this (e.g. genetic algorithms, artificial life).
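These ingredients map directly onto a generic genetic-algorithm loop. The sketch below is illustrative: the population size, mutation noise, and `fitness` argument are placeholders rather than any particular published model.

```python
import random

def evolve(fitness, genome_length, pop_size=24, generations=150,
           keep_fraction=1 / 3, mutation_sd=0.1):
    """Generic GA loop: evaluate the population, keep the fittest,
    and refill the population with mutated copies of the survivors."""
    population = [[random.gauss(0.0, 1.0) for _ in range(genome_length)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        survivors = ranked[:max(1, int(pop_size * keep_fraction))]
        population = [list(g) for g in survivors]
        while len(population) < pop_size:
            parent = random.choice(survivors)
            child = [g + random.gauss(0.0, mutation_sd) for g in parent]
            population.append(child)
    return max(population, key=fitness)

# Example usage with a toy fitness function (maximise -||genome||^2):
# best = evolve(lambda g: -sum(x * x for x in g), genome_length=10)
```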

Batali’s model of evolution
1. Each organism (or agent) has its weights set by its genes.
2. The agents are then trained on some language.
3. The agents’ error is used to assign fitness.
4. Only the top third of the population is kept.
5. The top third have their weights reset to what their genes specify.
6. Each surviving agent “gives birth” to two new agents with approximately the same genes (i.e. the genes are mutated).
7. Go to step 1.
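A sketch of just the selection-and-reproduction step described above. The `Agent` class, the mutation scale, and the assumption that lower post-training error means higher fitness are illustrative, not Batali's actual code.

```python
import random

class Agent:
    """Toy agent whose genes specify its initial network weights."""
    def __init__(self, genes):
        self.genes = list(genes)
        self.weights = list(genes)   # phenotype: weights before learning

def next_generation(agents, errors, mutation_sd=0.05):
    """Keep the third of the population with the lowest post-training error,
    reset the survivors' weights to what their genes specify, and let each
    survivor produce two mutated offspring (e.g. 8 survivors + 16 offspring
    restores a population of 24)."""
    order = sorted(range(len(agents)), key=lambda i: errors[i])
    survivors = [agents[i] for i in order[:len(agents) // 3]]
    new_population = []
    for parent in survivors:
        parent.weights = list(parent.genes)          # undo learning
        new_population.append(parent)
        for _ in range(2):
            child_genes = [g + random.gauss(0.0, mutation_sd)
                           for g in parent.genes]
            new_population.append(Agent(child_genes))
    return new_population
```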

The language task
One of the simplest languages that involves embedding is a^n b^n:
- ab
- aabb
- aaaaaaaabbbbbbbb
- *aaaaaaaaabbbbbbbb (ungrammatical: the counts don’t match)
What machinery would you need to recognise strings from this language? Minimally, a simple counter. Can an SRN with random initial weights learn this language?
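The language and the minimal counting machinery can be written down directly; this is an illustrative sketch, with made-up function names:

```python
def anbn(n):
    """Generate a string of the language, e.g. anbn(2) == 'aabb'."""
    return "a" * n + "b" * n

def recognise(s):
    """Accept a^n b^n (n >= 1) using only a single counter: count up on
    each 'a', down on each 'b', reject an 'a' after a 'b', and reject any
    string whose counts don't balance."""
    count, seen_b = 0, False
    for ch in s:
        if ch == "a":
            if seen_b:
                return False
            count += 1
        elif ch == "b":
            seen_b = True
            count -= 1
            if count < 0:
                return False
        else:
            return False
    return seen_b and count == 0

assert recognise("ab") and recognise("aabb") and recognise("a" * 8 + "b" * 8)
assert not recognise("a" * 9 + "b" * 8)
```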

Performance of a trained (but non-evolved) net
Networks fail to learn to count (although some aspects of the language are learnt).

Evolving a better network
- Batali used a population of 24 nets (initially with genes specifying random weights).
- The nets were evolved using a fitness function based on their ability at the task after training.
- After 150 generations, the networks were better at learning the task: they had evolved initial weight settings that made learning syntax possible.

Evolved network performance
[Figure: performance of the evolved network on the language task]

Issues that remain…
- What is learning doing? If language is always the same, the networks could eventually end up with the whole thing innate (and not need learning at all!). What would happen if the networks were trained on a class of languages?
- Initial weights are a different type of innateness than Elman’s. Can Batali also explain the critical period?

Is evolution just the same as learning?
- We can think of a fitness landscape just like an error surface.
- What are the differences? Does evolution do gradient descent?