Calibration/Optimisation Dr Andy Evans

Preparing to model
Verification
Calibration/Optimisation
Validation
Sensitivity testing and dealing with error

Parameters
Ideally we’d have rules that determined behaviour:
    if AGENT in CROWD move AWAY
But in most of these situations, we need numbers:
    if DENSITY > 0.9 move 2 SQUARES NORTH
Indeed, in some cases, we’ll always need numbers:
    if COST < CASH buy CAR
Some you can get from data, some you can guess at, and some you can’t.

Calibration
Models rarely work perfectly:
Aggregate representations of individual objects
Missing model elements
Error in data
If we want the model to match reality, we may need to adjust variables/model parameters to improve the fit. This process is calibration. First we need to decide how we want to get to a realistic picture.

Model runs
Initialisation: do you want your model to:
evolve to a current situation?
start at the current situation and stay there?
What data should it be started with?
You then run it to some condition:
some length of time?
some closeness to reality?
Then compare it with reality (we’ll talk about this in a bit).

Calibration methodologies
If you need to pick better parameters, this is tricky. What combination of values best models reality?
Using expert knowledge: can be helpful, but experts often don’t understand the inter-relationships between variables well.
Experimenting with lots of different values: rarely possible with more than two or three variables because of the combinatoric solution space that must be explored.
Deriving them from data automatically.

Solution spaces
A landscape of possible variable combinations. Usually we want to find the minimum value of some optimisation function – usually the error between the model and reality.
[Figure: the solution space as a landscape of potential solutions plotted against the optimisation function, with several local minima and one global minimum (the lowest point).]
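As a minimal sketch of such an optimisation function (assuming the model output and the observations are plain lists of numbers), the root-mean-square error works:

```python
import math

def rmse(model_output, observed):
    """Root-mean-square error between the model's output and reality:
    the optimisation function we want to minimise."""
    return math.sqrt(sum((m - o) ** 2 for m, o in zip(model_output, observed))
                     / len(observed))
```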

Calibration
Automatic calibration means sacrificing some of your data to generate the optimisation function scores. We need a clear separation between the calibration data and the data used to check the model is correct, or we could just be modelling the calibration data, not the underlying system dynamics (“over-fitting”). To know we’ve modelled the dynamics, we need independent data to test against. This will prove the model can represent similar system states without re-calibration.
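One minimal way to sketch that separation is a random hold-out split (the 70/30 fraction here is purely illustrative):

```python
import random

def split(data, calibration_fraction=0.7):
    """Shuffle and split the observations: calibrate on one portion,
    validate on the untouched remainder to guard against over-fitting."""
    shuffled = data[:]
    random.shuffle(shuffled)
    cut = int(len(shuffled) * calibration_fraction)
    return shuffled[:cut], shuffled[cut:]
```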

Heuristics (rule-based)
Given we can’t explore the whole space, how do we navigate? Use rules of thumb. A good example is the “greedy” algorithm: “alter solutions slightly, but only keep those which improve the optimisation” (the steepest-gradient/descent method), sketched below.
[Figure: a zoning scheme plotted against the optimisation function, with greedy steps moving downhill.]
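A sketch of the greedy rule, assuming a solution is a list of parameter values and `error` is an optimisation function like the `rmse` above:

```python
import random

def greedy(solution, error, step=0.1, iterations=1000):
    """Greedy search: nudge one parameter at a time, and keep the
    change only if it lowers the error (steepest-descent style)."""
    best = error(solution)
    for _ in range(iterations):
        candidate = solution[:]
        i = random.randrange(len(candidate))
        candidate[i] += random.uniform(-step, step)
        e = error(candidate)
        if e < best:                     # only keep improvements
            solution, best = candidate, e
    return solution, best
```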

Example: Microsimulation
Basis for many other techniques; an analysis technique on its own.
Simulates individuals from aggregate data sets.
Allows you to estimate the numbers of people affected by policies.
Could equally be used on tree species or soil types.
Increasingly the starting point for ABM.

How?
Combines anonymised individual-level samples with aggregate population figures.
Take known individuals from small-scale surveys:
British Household Panel Survey
British Crime Survey
Lifestyle databases
Take aggregate statistics where we don’t know about individuals:
UK Census
Combine them on the basis of as many variables as they share.

Microsimulation
Randomly put individuals into an area until the population numbers match.
Swap people out with others while it improves the match between the real aggregate variables and the synthetic population (sketched below).
Use these to model direct effects: if we have distance-to-work data and employment, we can simulate the people who work in factory X in ED Y.
Use these to model multiplier effects: if the factory shuts down, those people are unemployed, and their money is lost from that ED, how many people will the local supermarket sack?
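A rough sketch of that swapping loop; `survey` is a list of survey records, and `fit` is an assumed function scoring the mismatch between the synthetic population and the area’s aggregate counts (lower is better):

```python
import random

def synthesise(survey, area_size, aggregates, fit, swaps=10000):
    """Draw a random synthetic population for one area, then keep
    swapping members for fresh survey records whenever the swap
    improves the fit to the area's aggregate statistics."""
    population = [random.choice(survey) for _ in range(area_size)]
    best = fit(population, aggregates)
    for _ in range(swaps):
        i = random.randrange(area_size)
        old = population[i]
        population[i] = random.choice(survey)
        new = fit(population, aggregates)
        if new < best:
            best = new                   # keep the improving swap
        else:
            population[i] = old          # revert
    return population
```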

Heuristics (rule-based)
“Alter solutions slightly, but only keep those which improve the optimisation” (the steepest-gradient/descent method). This finds a solution, but not necessarily the “best”.
[Figure: greedy search on the zoning-scheme landscape stuck in a local minimum, away from the global minimum (the lowest point).]

Meta-heuristic optimisation
Randomisation
Simulated annealing
Genetic Algorithms/Programming

Typical method: Randomisation
Randomise the starting point.
Randomly change values, but only keep those that optimise our function.
Repeat and keep the best result.
Aims to find the global minimum by randomising starts.
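A sketch, reusing the `greedy` search from earlier; `random_solution` is an assumed function that generates a fresh random starting point:

```python
def randomised(random_solution, error, restarts=20):
    """Random-restart search: run the greedy search from many random
    starting points and keep the best end result, hoping at least one
    start descends into the global minimum."""
    runs = [greedy(random_solution(), error) for _ in range(restarts)]
    return min(runs, key=lambda run: run[1])   # each run = (solution, error)
```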

Simulated Annealing (SA)
Based on the cooling of metals, SA replicates the notion that trying non-optimal solutions can be beneficial. As the temperature drops, the probability of metal atoms freezing where they are increases, but there’s still a chance they’ll move elsewhere. In the same way, the algorithm moves freely around the solution space, but the chance of it following a non-improving path drops with the “temperature” (usually time). There is therefore a chance early on for it to go into less-optimal areas and find the global minimum. But how is the probability determined?

The Metropolis Algorithm
Probability of following a worse path:
P = exp(-(drop in optimisation / temperature))
(This is usually compared with a random number.)
Paths that improve the optimisation are always followed. The “temperature” change varies with implementation, but broadly decreases with time or the area searched. Picking the schedule is the problem: too slow a decrease and it’s computationally expensive; too fast and the solution isn’t good.
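A sketch of the annealing loop with the Metropolis acceptance test; `neighbour` is an assumed function making a small random change to a solution, and the geometric cooling schedule is illustrative:

```python
import math
import random

def anneal(solution, error, neighbour, temp=1.0, cooling=0.995,
           iterations=10000):
    """Simulated annealing: always accept improvements; accept a worse
    neighbour with probability exp(-drop / temperature), which shrinks
    as the temperature cools."""
    current = error(solution)
    best, best_e = solution, current
    for _ in range(iterations):
        candidate = neighbour(solution)
        e = error(candidate)
        drop = e - current                 # positive if candidate is worse
        if drop < 0 or random.random() < math.exp(-drop / temp):
            solution, current = candidate, e
            if current < best_e:
                best, best_e = solution, current
        temp *= cooling                    # the hard-to-pick schedule
    return best, best_e
```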

Genetic Algorithms (GA)
In the 1950s a number of people tried to use evolution to solve problems. The main advances were made by John Holland in the mid-1960s to 1970s. He laid down the algorithms for problem solving with evolution – derivatives of these are known as Genetic Algorithms.

The basic Genetic Algorithm
1) Define the problem/target: usually some function to optimise or target data to model.
2) Characterise the result/parameters you’re looking for as a string of numbers. These are an individual’s genes.
3) Make a population of individuals with random genes.
4) Test each to see how closely it matches the target.
5) Use those closest to the target to make new genes.
6) Repeat until the result is satisfactory.

A GA example
Say we have a valley profile we want to model as an equation. We know the equation is in the form y = a + bx + cx² + dx³. We can model our solution as a string of four numbers, representing a, b, c and d. We randomise this first (e.g. to get “1 6 8 5”), 30 times, to produce a population of thirty different random individuals. We work out the equation for each, and see what the residuals are between the predicted and real valley profiles. We keep the best genes, and use these to make the next set of genes. How do we make the next genes?

Inheritance, cross-over reproduction and mutation
We use the best genes to make the next population. We take some proportion of the best genes and randomly cross over portions of them:
16|85 and 39|37 become 16|37 and 39|85
We allow the new population to inherit these combined best genes (i.e. we copy them to make the new population). We then randomly mutate a few genes in the new population.
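Putting the pieces together for the valley-profile example, here is a toy sketch in which the population size, number of survivors and mutation rate are all just illustrative choices:

```python
import random

def evolve(profile_x, profile_y, pop_size=30, generations=200,
           keep=10, mutation_rate=0.1):
    """Toy GA for the valley profile: genes are [a, b, c, d] in
    y = a + bx + cx^2 + dx^3; fitness is the residual error against
    the real profile; the best genes are crossed over and mutated."""
    def predict(genes, x):
        a, b, c, d = genes
        return a + b * x + c * x ** 2 + d * x ** 3

    def fitness(genes):                  # sum of squared residuals
        return sum((predict(genes, x) - y) ** 2
                   for x, y in zip(profile_x, profile_y))

    population = [[random.uniform(-10, 10) for _ in range(4)]
                  for _ in range(pop_size)]
    for _ in range(generations):
        population.sort(key=fitness)     # best (lowest error) first
        parents = population[:keep]      # keep the fittest genes
        children = []
        while len(children) < pop_size:
            mum, dad = random.sample(parents, 2)
            point = random.randrange(1, 4)        # cross-over point
            child = mum[:point] + dad[point:]     # inherit from both
            for i in range(4):                    # random mutation
                if random.random() < mutation_rate:
                    child[i] += random.uniform(-1, 1)
            children.append(child)
        population = children
    return min(population, key=fitness)
```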

Other details
Often we don’t just take the best – we jump out of local minima by taking worse solutions. Usually this is done by setting the probability of a gene going into the next generation based on how good it is. The solutions can be letters as well (e.g. evolving sentences), or true/false statements. The genes are usually represented as binary figures, and mutation switches bits between one and zero. E.g. 1 | 7 | 3 | 7 would be 0001 | 0111 | 0011 | 0111.
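A small sketch of that binary representation (4 bits per gene, matching the example above):

```python
import random

def to_binary(genes, bits=4):
    """[1, 7, 3, 7] -> '0001 0111 0011 0111'"""
    return ' '.join(format(g, '0{}b'.format(bits)) for g in genes)

def from_binary(bitstring):
    """'0001 0111 0011 0111' -> [1, 7, 3, 7]"""
    return [int(chunk, 2) for chunk in bitstring.split()]

def flip_random_bit(bitstring, bits=4):
    """Mutation at the binary level: flip one randomly chosen bit."""
    flat = list(bitstring.replace(' ', ''))
    i = random.randrange(len(flat))
    flat[i] = '1' if flat[i] == '0' else '0'
    joined = ''.join(flat)
    return ' '.join(joined[j:j + bits] for j in range(0, len(joined), bits))
```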

Can we evolve anything else?
In the late 1980s a number of researchers, most notably John Koza and Tom Ray, came up with ways of evolving equations and computer programs. This has come to be known as Genetic Programming. Genetic Programming aims to free us from the limits of our feeble brains and our poor understanding of the world, and lets something else work out the solutions.

Genetic Programming (GP)
Essentially similar to GAs, only the components aren’t just the parameters of equations – they’re the whole thing. They can even be smaller programs, or the program itself. Instead of numbers, you switch and mutate:
variables, constants and operators in equations;
subroutines, code, parameters and loops in programs.
All you need is some measure of “fitness”.
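A very small sketch of the idea: represent an equation as an expression tree, and let mutation swap whole subtrees rather than just numbers. Fitness scoring would plug in exactly as in the GA sketch above; the operator set and depths here are arbitrary choices:

```python
import random

OPS = {'+': lambda a, b: a + b,
       '-': lambda a, b: a - b,
       '*': lambda a, b: a * b}

def random_tree(depth=3):
    """Grow a random expression tree over the variable 'x' and constants."""
    if depth == 0 or random.random() < 0.3:
        return 'x' if random.random() < 0.5 else random.uniform(-5, 5)
    op = random.choice(list(OPS))
    return (op, random_tree(depth - 1), random_tree(depth - 1))

def evaluate(tree, x):
    """Recursively evaluate an expression tree at a given x."""
    if tree == 'x':
        return x
    if isinstance(tree, tuple):
        op, left, right = tree
        return OPS[op](evaluate(left, x), evaluate(right, x))
    return tree                          # a constant

def mutate(tree, depth=2):
    """GP mutation: descend into the tree, then replace a whole subtree."""
    if isinstance(tree, tuple) and random.random() > 0.2:
        op, left, right = tree
        if random.random() < 0.5:
            return (op, mutate(left, depth), right)
        return (op, left, mutate(right, depth))
    return random_tree(depth)            # replace this subtree entirely
```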

Advantages of GP and GA
Get us away from limited human knowledge.
Find near-optimal solutions quickly.
Relatively simple to program.
Don’t need much setting up.

Disadvantages of GP and GA
The results are good representations of reality, but they’re often impossible to relate to physical/causal systems, e.g.
river level = (2.443 × rain) + rain⁻² + ½ rain
They usually have no explicit memory of event sequences. GPs have to be reassessed entirely to adapt to changes in the target data if it comes from a dynamic system. They tend to be good at finding initial solutions, but slow to become very accurate – they are often used to find initial states for other AI techniques.

Uses in ABM
Behavioural models: evolve intelligent agents that respond to modelled economic and environmental situations realistically. (Most good conflict-based computer games have GAs driving the enemies so they adapt to changing player tactics.) Heppenstall (2004); Kim (2005).
Calibrating models.

Other uses
As well as searches in solution space, we can use these techniques to search in other spaces:
searches for troughs/peaks (clusters) of a variable in geographical space, e.g. cancer incidences;
searches for troughs (clusters) of a variable in variable space, e.g. groups with similar travel times to work.