Mental Health and Wellness Resources


Mental Health and Wellness Resources
Health and Wellness (416-978-8030) or Koffler Student Services.
Good 2 Talk (1-866-925-5454): 24-hour helpline.
Other 24/7 lines: Gerstein Centre and Toronto Distress Centre at 416-408-HELP (4357).
CAMH, the Centre for Addiction and Mental Health, at 250 College Street.

M4 Iterative Algorithms, Continued

Solution Representation: What About Depots?
Given N pickups / dropoffs and M courier truck depots (intersections):
Don't have to store depots in the order.
Could fill in the closest depot to the start/end later.
deliveryOrder = {0, 0, 0, 1, 3, 3, 1, 2, 2, 2}
(Figure: the corresponding route through the pickup/dropoff locations.)
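Not from the slides: a minimal C++ sketch of filling in the closest depot afterwards, assuming a precomputed travelTime[from][to] lookup (e.g. from earlier path-finding work); the function and variable names are illustrative only.

#include <limits>
#include <vector>

// Sketch: choose the depot closest to a given intersection, so the route's
// start (or end) depot can be filled in after deliveryOrder is optimized.
// travelTime[from][to] is an assumed precomputed lookup.
int closestDepotTo(int intersection,
                   const std::vector<int>& depots,
                   const std::vector<std::vector<double>>& travelTime) {
    int bestDepot = depots.front();
    double bestTime = std::numeric_limits<double>::max();
    for (int depot : depots) {
        if (travelTime[depot][intersection] < bestTime) {
            bestTime = travelTime[depot][intersection];
            bestDepot = depot;
        }
    }
    return bestDepot;
}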

Local Permutations: Representation
deliveryOrder = {1, 3, 2, 1, 3, 0, 0, 2}
(Figure: the corresponding route, with a travel time on each leg.)
Total travel time = 81
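As a hedged illustration (not the course's required API): evaluating the total travel time of a candidate order, assuming it has already been expanded into a sequence of intersection IDs and that a precomputed travelTime lookup exists.

#include <vector>

// Sketch: sum the leg travel times along the route. For the example above
// this would return 81.
double routeTravelTime(const std::vector<int>& intersections,
                       const std::vector<std::vector<double>>& travelTime) {
    double total = 0.0;
    for (size_t i = 0; i + 1 < intersections.size(); ++i)
        total += travelTime[intersections[i]][intersections[i + 1]];
    return total;
}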

Local Permutations
Swap the order of two deliveries?
deliveryOrder = {1, 3, 2, 1, 3, 0, 0, 2}
deliveryOrder = {1, 3, 2, 1, 3, 0, 2, 0}
(Figure: the re-routed path with updated leg travel times.)
Total travel time = 72
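One possible way to code this swap perturbation, assuming a cost() callback such as routeTravelTime() above; this is a sketch, not the official milestone code.

#include <algorithm>
#include <functional>
#include <vector>

// Sketch: swap two positions in deliveryOrder and keep the change only if it
// reduces the total travel time; otherwise undo it.
bool trySwap(std::vector<int>& deliveryOrder, size_t i, size_t j,
             double& bestCost,
             const std::function<double(const std::vector<int>&)>& cost) {
    std::swap(deliveryOrder[i], deliveryOrder[j]);
    double newCost = cost(deliveryOrder);
    if (newCost < bestCost) {      // improvement: keep it
        bestCost = newCost;
        return true;
    }
    std::swap(deliveryOrder[i], deliveryOrder[j]);   // no improvement: undo
    return false;
}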

2-opt
deliveryOrder = {1, 3, 2, 1, 3, 0, 0, 2}
Delete two connections in the path.
(Figure: the path with two edges cut, splitting it into sub-paths.)

2-opt
Reconnect the connections differently.
deliveryOrder = {1, 3, 2, 1, 3, 0, 0, 2}
deliveryOrder = {1, 1, 2, 3, 3, 0, 0, 2}
In general:
Pick an order for the 3 sub-paths.
Reverse any or all sub-paths.
Check if the result is legal.
(Figure: the sub-paths reconnected into a new route.)

2-opt: Complexity
2N-1 edges/connections between delivery locations.
How many different 2-opts? (2N-1) choose 2 edges we could cut.
N = 100 → 199 * 198 / 2 = ~20,000 cut choices.
Each cut gives 3 sub-paths: they can be ordered in 6 ways, and each can be reversed or not (2^3 = 8 options), so ~48 ways to reverse or recombine.
Total: ~1,000,000 2-opts.
Fast code can try them all.
If you find an improvement, you can try all 2-opts again: the path has changed, so a 2-opt that didn't work before might work now.
You could also make your own perturbation algorithm. Maybe 2-opts are not the best fit for the traveling courier → many illegal 3-opts?
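One way to implement the simplest 2-opt reconnection (reverse the segment between the two cuts) is with std::reverse; this sketch ignores the other recombinations and any legality check, and the names are illustrative only.

#include <algorithm>
#include <vector>

// Sketch: a single 2-opt move that reverses deliveryOrder[first..last]
// (inclusive). Enumerating all cut pairs gives the ~(2N-1 choose 2) options
// counted above.
void applyTwoOpt(std::vector<int>& deliveryOrder, size_t first, size_t last) {
    std::reverse(deliveryOrder.begin() + first,
                 deliveryOrder.begin() + last + 1);
}

To try them all, loop over every pair first < last, evaluate the cost after the reversal, and apply the same reversal again to undo it if the new order is not an improvement.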

Heuristics Overview

Exploring the Space
(Figure: cost landscape — travel time on the vertical axis, solutions on the horizontal; worse travel time is up, better solutions are down; local minima and the global minimum are marked.)
Really a 200-dimensional space for N = 100. Huge!

Pick a Random Order?
(Figure: a random solution lands high on the cost landscape.)
Solution very poor → many more bad solutions than good in this space.

Multi-Start?
(Figure: best of 4 random orders on the cost landscape.)
A bit better, but still not very good.
The space is very big, with mostly bad solution points.

Greedy Algorithm?
(Figure: the greedy solution lands much lower on the cost landscape.)
Much better! Still not locally optimal.

Local Permutation (Iterative Improvement)?
(Figure: searching the local space around the current solution.)
Finds a local minimum!

Combine with Multi-Start?
Yes: find many local minima and take the best one.
(Figure: several local searches launched from different starting points.)
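A compilable sketch of combining multi-start with iterative improvement; the three callbacks (random legal order, local improvement, cost) are assumptions standing in for whatever you have implemented.

#include <functional>
#include <limits>
#include <vector>

// Sketch: generate several random starting orders, locally improve each one
// to a local minimum, and keep the best local minimum found.
std::vector<int> multiStart(
        int numStarts,
        const std::function<std::vector<int>()>& randomLegalOrder,
        const std::function<void(std::vector<int>&)>& localImprove,
        const std::function<double(const std::vector<int>&)>& cost) {
    std::vector<int> best;
    double bestCost = std::numeric_limits<double>::max();
    for (int s = 0; s < numStarts; ++s) {
        std::vector<int> order = randomLegalOrder();
        localImprove(order);                  // descend to a local minimum
        double c = cost(order);
        if (c < bestCost) {                   // remember the best minimum seen
            bestCost = c;
            best = order;
        }
    }
    return best;
}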

Local Permutations: Can Get Stuck
(Figure: the search trapped in a valley of the cost landscape.)
Can't get out of a local minimum.

More Powerful Local Permutations
N = 100: about 1M 2-opts, but 500M 3-opts.
Explore a larger part of the space → less prone to getting stuck.
But more exploration means more CPU time → need balance.

Hill Climbing?
(Figure: the search climbs out of a local minimum before descending again.)
Can get you unstuck from local minima.
But watch CPU time → you need time to improve the solution after it gets worse.

Heuristic 4: Hill Climbing
One way to get out of a rut: change something! Even if it looks like a bad idea at first.
Hill climbing: change the delivery order even though it increases travel time.
Good idea to save the best solution first, then try local perturbations around the new solution.
Maybe you find something better! If not, you can eventually go back to the saved solution.
Lets you explore more options.
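A minimal sketch of the save-the-best-then-go-uphill idea, assuming hypothetical perturb()/cost() helpers like those above; it is illustrative, not the required approach.

#include <functional>
#include <vector>

// Sketch: take one perturbation even if it makes the solution worse, but
// always keep a copy of the best solution seen so it can be restored later.
void hillClimbStep(std::vector<int>& current, double& currentCost,
                   std::vector<int>& bestSoFar, double& bestCost,
                   const std::function<void(std::vector<int>&)>& perturb,
                   const std::function<double(const std::vector<int>&)>& cost) {
    perturb(current);                 // may increase travel time
    currentCost = cost(current);
    if (currentCost < bestCost) {     // found something better: save it
        bestCost = currentCost;
        bestSoFar = current;
    }
    // If the uphill excursion never pays off, the caller can eventually
    // restore: current = bestSoFar; currentCost = bestCost;
}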

Simulated Annealing
Annealing metals: the material is heated and cooled slowly; atoms gradually move to a low-energy state → strong metal.
Simulated annealing: an optimization framework that mimics annealing.
Start with a poor initial solution and a high temperature (T).
Perturb the solution (move atoms).
Most perturbations are accepted when T is high; only good ones (those that reduce cost/energy) are accepted as T approaches 0.

Simulated Annealing
S = InitialSolution;
C = Cost(S);              // e.g. travel time
T = high temperature;     // big number
while (solution changing) {
   Snew = perturb(S);                 // How to perturb? Smaller perturbations as T drops?
   Cnew = Cost(Snew);                 // Fast update methods will let you evaluate more perturbations
   deltaC = Cnew - C;
   if (deltaC < 0 || random(0,1) < e^(-deltaC/T)) {
      S = Snew;                       // Update solution
      C = Cnew;
   }
   T = reduceTemp(T);                 // How quickly to reduce T?
}
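For reference, a compilable C++ version of the same loop, with a geometric cooling schedule (T multiplied by a constant each iteration) standing in for reduceTemp(); the starting temperature, cooling rate, iteration budget, and the perturb()/cost() callbacks are all assumptions, not values from the course.

#include <cmath>
#include <functional>
#include <random>
#include <vector>

// Sketch: simulated annealing over a delivery order. Improvements are always
// accepted; worse moves are accepted with probability exp(-deltaC / T),
// which shrinks as the temperature cools.
std::vector<int> simulatedAnnealing(
        std::vector<int> S,                                               // initial solution
        const std::function<std::vector<int>(const std::vector<int>&)>& perturb,
        const std::function<double(const std::vector<int>&)>& cost,
        double T = 1000.0,            // assumed starting temperature
        double coolingRate = 0.999,   // assumed geometric reduceTemp()
        int iterations = 200000) {    // assumed iteration budget
    std::mt19937 rng(std::random_device{}());
    std::uniform_real_distribution<double> uniform(0.0, 1.0);
    double C = cost(S);
    for (int i = 0; i < iterations && T > 1e-6; ++i) {
        std::vector<int> Snew = perturb(S);
        double Cnew = cost(Snew);
        double deltaC = Cnew - C;
        if (deltaC < 0 || uniform(rng) < std::exp(-deltaC / T)) {
            S = Snew;                 // accept the move
            C = Cnew;
        }
        T *= coolingRate;             // reduceTemp(T)
    }
    return S;
}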

Time Limits & Using Multiple CPUs

Managing the Time Limit
Can get better results with more CPU time:
Multi-start: run the algorithm again with a new starting point, keep the best.
Iterative improvement: keep looking for productive changes.
But there is a 30 s limit to return a solution, for every problem, regardless of city size and number of intersections.
How many starting points? How many iterative perturbations?
Solution: check how much time has passed and end optimization just before the time limit.

Time Limit
#include <chrono>        // Time utilities

#define TIME_LIMIT 30    // m4: 30 second time limit

int main() {
    // now() is a static member function inside namespace std::chrono:
    // you can call it without an object.
    auto startTime = std::chrono::high_resolution_clock::now();
    bool timeOut = false;

    while (!timeOut) {
        myOptimizer();
        auto currentTime = std::chrono::high_resolution_clock::now();
        // The time difference gives the actual elapsed (wall-clock) time no
        // matter how many CPUs you are using; duration<double> is in seconds.
        auto wallClock = std::chrono::duration_cast<std::chrono::duration<double>>(
                             currentTime - startTime);
        // Keep optimizing until within 10% of the time limit
        if (wallClock.count() > 0.9 * TIME_LIMIT)
            timeOut = true;
    }
    ...