March 1, 2016
Introduction to Artificial Intelligence
Lecture 11: Machine Evolution

Let's look at… Machine Evolution

Machine Evolution

As you will see later in this course, neural networks can "learn", that is, adapt to given constraints. For example, NNs can approximate a given function. In biology, such learning corresponds to learning by an individual organism. However, in nature there is a different type of adaptation, which is achieved by evolution. Can we use evolutionary mechanisms to create learning programs?

Fortunately, on a computer we can simulate evolutionary processes faster than in real time. We simulate the two main aspects of evolution:
- Generation of descendants that are similar to but slightly different from their parents,
- Selective survival of the "fittest" descendants, i.e., those that perform best at a given task.
Iterating this procedure leads to individuals that are better and better at the given task.
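This vary-and-select loop can be sketched in a few lines of Python. The toy task, population size, and mutation step below are illustrative assumptions, not part of the lecture:

```python
import random

def evolve(fitness, init, mutate, pop_size=20, generations=100):
    """Minimal evolutionary loop: vary, then keep the fittest."""
    population = [init() for _ in range(pop_size)]
    for _ in range(generations):
        # Generation of descendants: each individual produces a
        # similar but slightly different copy of itself.
        offspring = [mutate(ind) for ind in population]
        # Selective survival: only the fittest pop_size individuals
        # make it into the next generation.
        population = sorted(population + offspring, key=fitness,
                            reverse=True)[:pop_size]
    return max(population, key=fitness)

# Toy task (assumed for illustration): maximize f(x) = -(x - 3)^2,
# whose optimum is x = 3.
best = evolve(fitness=lambda x: -(x - 3) ** 2,
              init=lambda: random.uniform(-10.0, 10.0),
              mutate=lambda x: x + random.gauss(0.0, 0.1))
```

Because parents and offspring compete together for survival, the best individual found so far is never lost, so the population can only improve over the generations.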

Let us say that we wrote a computer vision algorithm that has two free parameters x and y. We want the program to "learn" the optimal values for these parameters, that is, those values that allow the program to recognize objects with maximum probability p. To visualize this, we can imagine a 3D "landscape" defined by p as a function of x and y. Our goal is to find the highest peak in this landscape, which is the maximum of p.

We can solve this problem with an evolutionary approach. Any variant of the program is completely defined by its values of x and y and can thus be found somewhere in the landscape. We start with a random population of programs. Now those individuals at higher elevations, which perform better, get a higher chance of reproduction than those in the valleys.

Reproduction can proceed in two different ways:
- Production of descendants near the most successful individuals ("single parents"),
- Production of new individuals by pairs of successful parents. Here, the descendants are placed somewhere between the parents.
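The two reproduction modes can be sketched as operators on (x, y) parameter pairs; the Gaussian noise scale is an illustrative assumption:

```python
import random

def mutate(parent, sigma=0.1):
    """Single-parent reproduction: place the child near its parent in
    (x, y) parameter space by adding small Gaussian noise."""
    x, y = parent
    return (x + random.gauss(0.0, sigma), y + random.gauss(0.0, sigma))

def recombine(p1, p2):
    """Two-parent reproduction: place the child somewhere on the line
    segment between the two parents."""
    t = random.random()  # random position between the parents
    return (p1[0] + t * (p2[0] - p1[0]),
            p1[1] + t * (p2[1] - p1[1]))
```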

The fitness (or performance) of a program is then a function of its parameters x and y.

The initial random population of programs could look like this:

Only the most successful programs survive…

… and generate children that are similar to themselves, i.e., close to them in parameter space.

Again, only the best ones survive and generate offspring…

… and so on, until the population approaches maximum fitness.

Genetic Programming

Instead of just varying a number of parameters, we can evolve complete programs (genetic programming). Let us evolve a wall-following robot in a grid-space world. The robot's behavior is determined by a LISP function. We use four primitive functions:
- AND(x, y) = 0 if x = 0; else y
- OR(x, y) = 1 if x = 1; else y
- NOT(x) = 0 if x = 1; else 1
- IF(x, y, z) = y if x = 1; else z
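These four primitives translate directly into Python. Note that each one is total — it returns a value for any 0/1 inputs — which is what keeps randomly generated expressions from crashing:

```python
def AND(x, y):
    """AND(x, y) = 0 if x = 0; else y."""
    return 0 if x == 0 else y

def OR(x, y):
    """OR(x, y) = 1 if x = 1; else y."""
    return 1 if x == 1 else y

def NOT(x):
    """NOT(x) = 0 if x = 1; else 1."""
    return 0 if x == 1 else 1

def IF(x, y, z):
    """IF(x, y, z) = y if x = 1; else z."""
    return y if x == 1 else z
```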

The robot receives sensory inputs n, ne, e, se, s, sw, w, and nw. These inputs are 0 whenever the corresponding cell is free; otherwise they are 1. The robot can move either north, east, south, or west. In genetic programming, we must make sure that all syntactically possible expressions in a program are actually defined and do not crash our system.

We start with a population of 5000 randomly created programs and let them perform. We let the robot start ten times, each time from a different position. Each time, we let the robot perform 60 steps and count the number of different cells adjacent to a wall that the robot visits. There are 32 such cells, so with ten runs our fitness measure ranges from 0 (lowest fitness) to 320 (highest fitness).

Example of a perfect wall-following robot program in LISP:

The best-performing program among the 5000 randomly generated ones:

In generation i + 1,
- 500 individuals are directly copied from generation i,
- 4500 are created by crossover operations between two parents chosen from the 500 winners.
In about 50 cases, mutation is performed.
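This breakdown can be sketched as a population-builder; `crossover` and `mutate` are assumed helper functions (the slides do not define them here), and the mutation rate of roughly 50 in 4500 children is applied probabilistically:

```python
import random

def breed(winners, crossover, mutate, pop_size=5000, copy_count=500,
          mutation_rate=50 / 4500):
    """Build generation i + 1: copy the winners directly, then fill the
    remaining slots with crossover children of randomly paired winners,
    mutating roughly 50 of the 4500 children."""
    new_generation = list(winners[:copy_count])   # 500 direct copies
    while len(new_generation) < pop_size:         # 4500 crossover children
        child = crossover(random.choice(winners), random.choice(winners))
        if random.random() < mutation_rate:       # ~50 mutations in total
            child = mutate(child)
        new_generation.append(child)
    return new_generation
```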

Example of a crossover operation:

Mutation is performed by replacing a subtree of a program with a randomly created subtree.
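Both operators can be sketched on program trees represented as nested tuples, e.g. ('IF', 'n', ('AND', 'e', 's'), 'w'); this representation and the helper names are illustrative assumptions:

```python
import random

def subtrees(tree, path=()):
    """Yield (path, subtree) for every node of a nested-tuple program."""
    yield path, tree
    if isinstance(tree, tuple):
        for i, child in enumerate(tree[1:], start=1):
            yield from subtrees(child, path + (i,))

def replace(tree, path, new_subtree):
    """Return a copy of tree with the node at `path` replaced."""
    if not path:
        return new_subtree
    i = path[0]
    return tree[:i] + (replace(tree[i], path[1:], new_subtree),) + tree[i + 1:]

def crossover(a, b):
    """Swap a randomly chosen subtree of a with one of b."""
    path_a, sub_a = random.choice(list(subtrees(a)))
    path_b, sub_b = random.choice(list(subtrees(b)))
    return replace(a, path_a, sub_b), replace(b, path_b, sub_a)

def mutate(tree, random_subtree):
    """Replace a randomly chosen subtree with a freshly created one."""
    path, _ = random.choice(list(subtrees(tree)))
    return replace(tree, path, random_subtree())
```

Swapping subtrees conserves the combined node count of the two parents, so the children are always syntactically valid programs built from the same primitives.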

After six generations, the best program behaves like this:

And after ten generations, we already have a perfect program (fitness 320):

Here is a diagram showing the maximum fitness as a function of the generation number:

Game Player Evolution

You could simulate an evolutionary process to improve your Isola-playing algorithm. The easiest way to do this would be to use evolutionary learning to find the optimal weight vector in your static evaluation function, i.e., the optimal weighting for each evaluation feature that you compute. For example, assume that you are using the features f1 (number of neighboring squares) and f2 (number of reachable squares). In each case, you actually use the difference between the value for yourself and the value for your opponent.

Then you could use weights w1 and w2 to compute your evaluation function:

e(p) = w1 · f1 + w2 · f2

So the performance of your algorithm will depend on the weights w1 and w2. This corresponds to the example of the computer vision algorithm with two free parameters. Thus you could use an evolutionary process to find the best values for w1 and w2, just like in that example.
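A minimal sketch of this evaluation function, generalized to any number of weights; computing the feature values themselves (neighboring squares, reachable squares) is assumed to happen elsewhere:

```python
def evaluate(weights, own_features, opp_features):
    """e(p) = sum_i w_i * (f_i_own - f_i_opp): each feature enters as
    the difference between your value and your opponent's."""
    return sum(w * (own - opp)
               for w, own, opp in zip(weights, own_features, opp_features))
```

For example, with weights [1.0, 2.0], own feature values [5, 7], and opponent values [3, 4], the evaluation is 1.0·(5−3) + 2.0·(7−4) = 8.0.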

But how can you determine which individuals survive and procreate? One possibility would be to hold a tournament in which all individuals compete (or many smaller tournaments), and only the best n individuals reach the next generation, i.e., the next tournament. The other individuals are deleted and replaced with new individuals that use weights similar to those of the winners. This way you will evolve algorithms of better and better performance; in other words, you will approach the best values for w1 and w2.
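The survive-and-refill step can be sketched as follows; the tournament scores are assumed to come from actual games played between the weight vectors, and the mutation scale is an illustrative choice:

```python
import random

def next_tournament(population, scores, n_survivors, sigma=0.05):
    """Keep the n best weight vectors from the tournament and replace
    the rest with slightly mutated copies of the winners."""
    ranked = sorted(zip(scores, population), key=lambda sp: sp[0],
                    reverse=True)
    survivors = [weights for _, weights in ranked[:n_survivors]]
    new_population = list(survivors)
    while len(new_population) < len(population):
        parent = random.choice(survivors)
        # Child uses weights similar to those of a winner.
        new_population.append([w + random.gauss(0.0, sigma) for w in parent])
    return new_population
```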

You could slightly modify the game code to implement this principle of evolution. When you have obtained the best values for w1 and w2 (or in your case maybe w1, w2, …, w37), just transfer these values into your original program. Your program should now play significantly better than it did prior to its evolutionary improvement. Try it out!