A Study of Genetic Algorithms for Parameter Optimization
Mac Newbold
Introduction
- Many algorithms have constant values that affect the way they work.
- We often choose these constants arbitrarily, or based on some experimentation.
- Their interactions are often not well understood.
- Idea: use a genetic algorithm to optimize the parameters of an algorithm.
Background
- Utah Network Testbed (www.emulab.net)
- Problem: map a “virtual” topology graph onto the physical topology graph.
- NP-hard, with 30+ degrees of freedom.
- Solved by “assign”, a simulated annealing (AI) algorithm.
- 19 constants control its behaviors: 1 boolean, 4 integers, 15 floating point.
- 1 integer and 3 floats are scaling factors, leaving 15 parameters that need to be optimized.
Genetic Algorithm
- Evolution: “survival of the fittest.”
- The genetic algorithm control program, “tune”, calls methods on a replaceable object:
  - Obj->Random() – returns a random object
  - Obj->Fitness() – calculates the object’s fitness
  - Obj->Cross(obj2) – crossover (returns 2 objects)
  - Obj->Mutate() – mutates the object
  - Obj->Display() – shows the object
- A very flexible framework (a sketch of the interface follows below).
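For concreteness, here is a minimal C++ sketch of the replaceable-object interface that “tune” drives. The method names come from the slide above; the class name, signatures, and return types are assumptions for illustration, not the real framework.

    // Sketch of the replaceable-object interface used by "tune".
    // Method names follow the slides; signatures and types are assumptions.
    #include <utility>

    class Obj {
    public:
        virtual ~Obj() {}

        // Obj->Random(): return a freshly randomized individual.
        virtual Obj *Random() const = 0;

        // Obj->Fitness(): compute (or return a cached) fitness; higher is better.
        virtual double Fitness() = 0;

        // Obj->Cross(obj2): crossover with a mate, returning 2 offspring.
        virtual std::pair<Obj *, Obj *> Cross(Obj *obj2) const = 0;

        // Obj->Mutate(): randomly perturb this individual in place.
        virtual void Mutate() = 0;

        // Obj->Display(): print a human-readable representation.
        virtual void Display() const = 0;
    };

Because the control program only ever calls these five methods, any object that implements them can be plugged in and tuned.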
Parameter Optimization
- “Params” object, specialized for “assign”.
- Contains the 15 variables we want to tune.
- One extra value caches the fitness calculation.
- Ensures that values “make sense” using domain-specific constraints.
- Uniform crossover.
- Random mutation.
- Performance-based fitness measure (see the crossover/mutation sketch below).
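A rough sketch of how the “Params” object’s uniform crossover and random mutation might look. The constraint check and random-value range here are placeholders rather than the real “assign” constraints, and the field names are hypothetical.

    // Sketch of a Params-style individual: 15 tunable values, a cached
    // fitness, uniform crossover, and random mutation. Details are assumed.
    #include <array>
    #include <cstdlib>

    class Params {
    public:
        static const int kNumParams = 15;  // the 15 variables to tune

        // Uniform crossover: each position of each child is taken from
        // one parent or the other, chosen independently at random.
        static void Cross(const Params &a, const Params &b,
                          Params &child1, Params &child2) {
            for (int i = 0; i < kNumParams; ++i) {
                bool fromA = (std::rand() % 2 == 0);
                child1.values_[i] = fromA ? a.values_[i] : b.values_[i];
                child2.values_[i] = fromA ? b.values_[i] : a.values_[i];
            }
            child1.Constrain();
            child2.Constrain();
            child1.fitnessValid_ = child2.fitnessValid_ = false;
        }

        // Random mutation: overwrite one randomly chosen value.
        void Mutate() {
            values_[std::rand() % kNumParams] =
                std::rand() / (double)RAND_MAX;
            Constrain();
            fitnessValid_ = false;  // the cached fitness is now stale
        }

    private:
        // Domain-specific constraints so values "make sense"
        // (placeholder; the real bounds come from assign's parameters).
        void Constrain() {
            for (int i = 0; i < kNumParams; ++i)
                if (values_[i] < 0.0) values_[i] = 0.0;
        }

        std::array<double, kNumParams> values_;
        double cachedFitness_;  // the one extra value: cached fitness
        bool fitnessValid_;
    };

Invalidating the cached fitness after every crossover or mutation keeps the expensive performance measurement from running on stale values.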
Fitness Function
- For “assign”, we care about running time, and the choice of constants has a huge effect.
- Fitness calculation:
  - Run “assign” on a set of N problems, using the object’s parameters.
  - Allow S seconds for each run.
  - X = sum of the execution times.
  - Fitness = (S*N) – X, where S*N is the maximum possible total time.
  - Higher scores are better.
- An evaluation could take a long time, so the result is cached (sketched below).
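A sketch of that scoring, under assumptions: runAssign is a hypothetical, stubbed helper standing in for actually launching and timing “assign”; the Fitness = (S*N) – X formula itself is from the slide.

    #include <algorithm>
    #include <string>
    #include <vector>

    // Hypothetical helper: run "assign" on one problem with the given
    // parameters, kill it after limitSecs, and return the elapsed time.
    // Stubbed here so the sketch is self-contained.
    double runAssign(const std::string &problem,
                     const std::vector<double> &params,
                     double limitSecs) {
        (void)problem; (void)params;
        return limitSecs;  // stub: pretend every run hits the time limit
    }

    // Fitness = (S*N) - X, where X is the total time used over the N
    // problems and S*N is the maximum possible total; higher is better.
    double fitness(const std::vector<std::string> &problems,  // N problems
                   const std::vector<double> &params,
                   double S) {                                // cap per run
        double X = 0.0;
        for (size_t i = 0; i < problems.size(); ++i)
            X += std::min(runAssign(problems[i], params, S), S);
        return S * problems.size() - X;
    }

Capping each run at S seconds bounds both the total evaluation time and the worst-case contribution of any single problem to X.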
G.A. Results
- Tested the genetic algorithm with “random” objects:
  - Same as the “Params” object, except for fitness.
  - Fitness is a random value, updated after each crossover/mutation.
- 1000-member population.
- Crossover rate of 0.50.
- Mutation rate of 0.30.
- Threshold = 999.995/1000.
- Took 7 generations, about 5 seconds (a toy version of this test is sketched below).
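A toy reproduction of that random-fitness smoke test, under assumptions: selection is omitted, both operators simply re-draw an individual’s fitness, and the generation count will vary from run to run. Only the population size, rates, and threshold are taken from the slide.

    #include <algorithm>
    #include <cstdio>
    #include <cstdlib>
    #include <vector>

    int main() {
        const int    kPopulation = 1000;
        const double kCrossRate  = 0.50;
        const double kMutateRate = 0.30;
        const double kThreshold  = 999.995 / 1000.0;

        // Each "random" individual is reduced to its fitness value, since
        // crossing or mutating it just re-draws that value at random.
        std::vector<double> pop(kPopulation);
        for (size_t i = 0; i < pop.size(); ++i)
            pop[i] = std::rand() / (double)RAND_MAX;

        for (int gen = 1; ; ++gen) {
            for (size_t i = 0; i < pop.size(); ++i) {
                double r = std::rand() / (double)RAND_MAX;
                if (r < kCrossRate + kMutateRate)  // hit by either operator
                    pop[i] = std::rand() / (double)RAND_MAX;
            }
            double best = *std::max_element(pop.begin(), pop.end());
            if (best >= kThreshold) {
                std::printf("reached threshold at generation %d\n", gen);
                break;
            }
        }
        return 0;
    }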
Results [figure not reproduced]
What’s Next
- Finish setting up actual scoring using “assign” runtimes…