Ec1818 Economics of Discontinuous Change Section 2 [Lectures 5-7]

Ec1818 Economics of Discontinuous Change, Section 2 [Lectures 5-7]
Wei Huang, Harvard University
(Preliminary and subject to revisions; previous TA section PPTs are used.)

Lec. 5. Landscape and Kauffman's N-K Model
Some concepts:
A fitness landscape is a geometry mapping strategies to profitability. A fitness landscape model represents a theory/view of how to boil a set of outcomes down into a single metric, of the factors affecting that metric, and of how easy or hard those factors are to change.
A landscape in the world is not necessarily concave or single-peaked.
Search algorithms explore landscapes. On a rugged landscape the path taken matters, there is uncertainty, and no single algorithm is best for all spaces.

Lec. 5. Landscape and Kauffman's N-K Model (cont.)
Strategies are N-dimensional vectors. N = # of factors that define your policy. The profitability contribution of each element depends on itself and on K other elements. K = # of other elements affecting the profitability of any policy element; it is the parameter that tunes the ruggedness of the landscape.
When K = 0, the landscape is concave and single-peaked.
When K = 1, ..., N-2, the landscape is correlated.
When K = N-1, the landscape is random, and many local maxima are possible.
Propositions and applications: see the lecture notes (understand them).
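The N-K setup above can be sketched in a few lines. This is a minimal illustration, not Kauffman's exact construction: the `nk_landscape` helper, the choice of random neighbors, and the uniform random contribution tables are all assumptions made here for concreteness.

```python
import itertools
import random

def nk_landscape(N, K, seed=0):
    """Sketch of an N-K fitness landscape: each of N binary attributes
    contributes a random fitness that depends on its own value and the
    values of K other attributes; total fitness is the average."""
    rng = random.Random(seed)
    # each attribute i depends on itself plus K randomly chosen others
    neighbors = [[i] + rng.sample([j for j in range(N) if j != i], K)
                 for i in range(N)]
    # a random contribution for every (attribute, local 0/1 pattern) pair
    tables = [{p: rng.random()
               for p in itertools.product((0, 1), repeat=K + 1)}
              for _ in range(N)]
    def fitness(s):
        return sum(t[tuple(s[j] for j in nb)]
                   for t, nb in zip(tables, neighbors)) / N
    return fitness

# K tunes ruggedness: K = 0 is single-peaked, K = N-1 is fully random
f = nk_landscape(N=6, K=2)
print(f((0, 1, 0, 1, 1, 0)))   # a fitness value in [0, 1]
```

Because the contribution of each element depends on K others, flipping one bit changes up to K+1 contributions at once, which is what makes the landscape rugged for large K.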

Lec 6. Genetic Algorithms
A GA seeks to discover/create/evolve new strategies that produce better solutions by mimicking evolutionary processes. It is a search algorithm for correlated but not single-peaked landscapes that imitates the evolutionary process. To run a GA: each strategy is defined by a 0/1 value for each attribute, and the profit is determined by the strategy.

Lec 6. How does a GA work?
1. Prepare a present population of N strategies.
2. Compute the profit earned by each strategy in the present population.
3. Create new strategies by the following methods:
Copy: just copy one of the strategies.
Cross-over: take two strategies, split them, and cross over. (1001 and 0111 → 1011 and 0101)
Mutation: take one strategy and randomly change some of its attributes. (0101 → 0100)
Inversion: take one strategy and switch two attributes. (1001 → 1010)
4. Continue step 3 until the next population is filled with N strategies.
5. Replace the present population with the next population.
Repeat steps 2-5 G times and output the strategy with the highest profit as the solution.
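The steps above can be sketched as a short Python loop. The profit function, population size, and number of generations here are hypothetical choices for illustration, not from the lecture.

```python
import random

rng = random.Random(1)
N_POP, N_BITS, GENS = 20, 8, 50

def profit(s):
    # hypothetical profit function (assumed here): base 1, plus 1 per
    # 1-bit, plus a bonus of 3 if the strategy starts with 1, 0
    return 1 + sum(s) + (3 if s[0] == 1 and s[1] == 0 else 0)

def crossover(a, b):
    cut = rng.randrange(1, N_BITS)               # split the two strategies
    return a[:cut] + b[cut:], b[:cut] + a[cut:]  # ...and cross over

def mutate(s):
    i = rng.randrange(N_BITS)                    # flip one random attribute
    return s[:i] + [1 - s[i]] + s[i + 1:]

def invert(s):
    s = s[:]
    i, j = rng.sample(range(N_BITS), 2)          # switch two attributes
    s[i], s[j] = s[j], s[i]
    return s

pop = [[rng.randint(0, 1) for _ in range(N_BITS)] for _ in range(N_POP)]
for _ in range(GENS):
    weights = [profit(s) for s in pop]           # fitness-proportionate
    nxt = []
    while len(nxt) < N_POP:
        a, b = rng.choices(pop, weights=weights, k=2)
        a, b = crossover(a, b)
        nxt += [mutate(a), invert(b)]
    pop = nxt[:N_POP]                            # replace the population

best = max(pop, key=profit)
print(best, profit(best))
```

Selection here is fitness-proportionate (profitable strategies are drawn as parents more often), which is the mechanism the Fundamental Theorem below refers to.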

Lec 6. Schemata and the Fundamental Theorem of GA
A schema is a generalized strategy in which some positions hold the wildcard ∗ (which can be 0 or 1). Consider 3 attributes. Then there are 2^3 = 8 strategies and 3^3 = 27 schemata. For example, ∗10 is a schema for 010 and 110; 1∗∗ is a schema for 100, 101, 110, and 111. The average fitness of a schema tells us which combinations of 0/1 do well in general. Fundamental Theorem of GA: a GA using fitness-proportionate reproduction, cross-over, and mutation produces a population in which representatives of a schema S grow exponentially in proportion to S's fitness relative to the average fitness.
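The schema counts and examples above are easy to verify directly (the `matches` helper is just an illustrative name):

```python
from itertools import product

def matches(schema, strategy):
    """A strategy matches a schema if they agree on every non-* position."""
    return all(c == '*' or c == b for c, b in zip(schema, strategy))

strategies = [''.join(bits) for bits in product('01', repeat=3)]
schemata = [''.join(cs) for cs in product('01*', repeat=3)]
print(len(strategies), len(schemata))                # 8 27
print([s for s in strategies if matches('*10', s)])  # ['010', '110']
print([s for s in strategies if matches('1**', s)])  # ['100', '101', '110', '111']
```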

Lec 6. Merits of and Problems with GA
Merits: a GA can be used to optimize non-single-peaked and non-differentiable functions; it is easy to program, and there are already many applications.
Problems: it does not necessarily find the global max; a GA gets stuck at a local max if it starts from a small group of good (but not the best!) strategies; one may need to adjust the initial population and the probability of mutation. Royal Road Clunker.

Lec 6. Genetic Algorithms and Simulated Annealing
Simulated annealing is an algorithm for going in the wrong direction in order to get off a local peak and find something better. The probability of going in the wrong direction is P(s) = exp(−δ/t), where δ is the cost (big δ ⇒ small P(s)) and t is the temperature, i.e., the willingness to go the wrong way (big t ⇒ P(s) close to 1).
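The acceptance rule P(s) = exp(−δ/t) can be written out directly; the `accept` helper name is illustrative, and the numbers below just demonstrate the two limiting cases in the text.

```python
import math
import random

def accept(delta, t, rng=random):
    """Accept a move that worsens fitness by delta with P = exp(-delta/t);
    improving moves (delta <= 0) are always accepted."""
    return delta <= 0 or rng.random() < math.exp(-delta / t)

# big delta (a costly wrong move) => small P; big t => P close to 1
print(math.exp(-5 / 1.0))    # low temperature: P ≈ 0.0067
print(math.exp(-5 / 100.0))  # high temperature: P ≈ 0.95
```

In practice t is lowered over time (the "annealing schedule"), so wrong-way moves become rarer as the search settles onto a peak.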

Lec 7. Search Models and Stopping Rules
There are many reasons why it may be impossible to search the entire landscape and find the best choice: search is costly, and you cannot go back to a choice you have rejected. We analyze the following models: the reservation wage model and optimal stopping problems.

Lec 7. Reservation Wage Model
Assumptions: you can choose the best among what you have searched, but each search is costly; the marginal cost of an additional search is given.
Procedure: compute the marginal benefit of each search and find the optimal number of searches n such that the MB of the nth search ≥ the MC of the nth search, and the MB of the (n+1)th search ≤ the MC of the (n+1)th search. If MC is constant (or increasing) and MB is decreasing in n, there is a unique optimal number n∗.
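A small numerical sketch of the stopping rule. The specific numbers are assumptions made here, not from the lecture: wage offers uniform on [0, 100] (so the expected best of n draws is 100·n/(n+1)) and a constant marginal cost of 2 per search.

```python
# Marginal benefit of the n-th search: the expected gain of the best of
# n draws over the best of n - 1 draws, for offers uniform on [0, 100].
def marginal_benefit(n):
    return 100 * n / (n + 1) - 100 * (n - 1) / n   # = 100 / (n * (n + 1))

MC = 2.0   # assumed constant marginal cost per search
n = 1
while marginal_benefit(n + 1) > MC:   # MB decreasing, MC constant => unique n*
    n += 1
print(n)   # optimal n: MB(6) ≈ 2.38 >= MC and MB(7) ≈ 1.79 <= MC, so n* = 6
```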

Lec 7. Secretary Problem
Assumptions: there is no explicit search cost, but you cannot go back to choices you have rejected. You know the number of choices n, but not the distribution of their values.
Question: What algorithm maximizes the probability of choosing the option with the highest value?
Answer: skip the first n/e ≈ 0.37n choices, then select the first one whose value exceeds the maximum value among those skipped choices (for large n).
Why? The probability of winning with R discovery searches is 1/n [1 + R/(R+1) + R/(R+2) + … + R/(n-1)]; maximizing it with respect to R gives R∗ ≈ (n-1)/e.
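A quick Monte Carlo check of the n/e rule (the simulation setup, with values drawn uniformly at random, is an illustrative assumption; the rule itself does not depend on the distribution):

```python
import math
import random

def trial(n, rng):
    values = [rng.random() for _ in range(n)]
    skip = round(n / math.e)                # skip the first n/e ≈ 0.37n
    threshold = max(values[:skip])
    for v in values[skip:]:
        if v > threshold:                   # first one beating the threshold
            return v == max(values)         # did we pick the overall best?
    return False                            # never beat it: we lose

rng = random.Random(0)
trials = 20000
wins = sum(trial(100, rng) for _ in range(trials))
print(wins / trials)                        # close to 1/e ≈ 0.37
```

The winning frequency hovers near 1/e, matching the theoretical maximum probability of picking the best candidate.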

Lec. 7. Thomas Bruss's Odds Algorithm for the Last-Success Problem
Throw a die 12 times; at some throw of a 4 you must declare "this is the last 4." You win if it is indeed the last 4 and lose if it is not. What is the best strategy for maximizing the probability of winning? The odds algorithm provides a solution.
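A sketch of Bruss's odds algorithm applied to the die example: sum the odds p/(1−p) from the last throw backwards until the sum reaches 1, and declare on the first 4 from that throw onwards. Exact fractions are used here to avoid floating-point issues at the threshold; the `odds_algorithm` helper is an illustrative name.

```python
from fractions import Fraction

def odds_algorithm(ps):
    """Given independent success probabilities ps[0..n-1], return
    (s, win): accept the first success from index s on, which wins
    with probability win = (prod of q_j for j >= s) * (sum of odds)."""
    n = len(ps)
    total_odds, s = Fraction(0), 0
    for j in range(n - 1, -1, -1):          # sum odds r_j = p_j / q_j
        total_odds += ps[j] / (1 - ps[j])   # ...from the end backwards
        if total_odds >= 1:
            s = j                           # stop where the sum reaches 1
            break
    q_prod = Fraction(1)
    for j in range(s, n):                   # probability of no success at all
        q_prod *= 1 - ps[j]                 # ...from s onwards
    return s, q_prod * total_odds

# 12 die throws; a "4" appears with probability 1/6 on each throw
s, win = odds_algorithm([Fraction(1, 6)] * 12)
print(s + 1, float(win))   # declare from throw 8 on; P(win) = 3125/7776 ≈ 0.402
```

With p = 1/6 the odds are 1/5 per throw, so the backward sum reaches 1 after exactly five throws: the optimal rule is to declare on the first 4 among the last five throws.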