
1 Local Search
Goal is to find the local maximum (or minimum).
Example:
– number of seconds to spin the wheels at 1.0 in order to move 2.0 meters

2 Gradient Descent
The minimum is found by following the slope of the function downhill.
“Like climbing Everest in thick fog with amnesia”

3 starter.py

import random
import numpy as np

def genData(numPoints, bias, variance):
    x = np.zeros(shape=(numPoints, 2))
    y = np.zeros(shape=numPoints)
    # basically a straight line
    for i in range(0, numPoints):
        # bias feature
        x[i][0] = 1
        x[i][1] = i
        # our target variable
        y[i] = (i + bias) + random.uniform(0, 1) * variance
    return x, y

# gen 100 points with a bias of 25 and a variance of 10 as a bit of noise
x, y = genData(100, 25, 10)

# also works for higher dimensions; here, m = 100, n = 2
m, n = np.shape(x)
numIterations = 100000
alpha = 0.0005
theta = np.ones(n)  # answer = [offset, slope]

theta = gradientDescent(x, y, theta, alpha, m, numIterations)
print("Our solution line has a y-intercept of", theta[0], "and a slope of", theta[1])
print("The data we're trying to fit is as follows")
for i in range(0, 100):
    print(x[i][1], y[i])

4 gradientDescent.py

def gradientDescent(x, y, theta, alpha, m, numIterations):
    xTrans = x.transpose()
    replaceMe = .0001  # placeholder: each use of replaceMe below is yours to fill in
    for i in range(0, numIterations):
        hypothesis = np.dot(x, theta)
        loss = hypothesis - y
        # Want the mean squared error here. Only used
        # for debugging, to make sure we're progressing.
        cost = np.sum(replaceMe) / (m)
        if i % 9999 == 0:
            print(i, cost)
        # avg gradient per example
        gradient = np.dot(xTrans, replaceMe) / m
        # update
        theta = theta * replaceMe
    return theta
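For reference, here is one possible way to fill in the replaceMe blanks, following the batch gradient descent formulation in the Stack Overflow post cited on the next slide. The squared-error cost, the averaged gradient, and the update rule below are my assumed answers to the exercise, not part of the original slide.

import numpy as np

def gradientDescentSolved(x, y, theta, alpha, m, numIterations):
    # a sketch of the completed version (assumes the standard
    # least-squares cost and batch update rule)
    xTrans = x.transpose()
    for i in range(0, numIterations):
        hypothesis = np.dot(x, theta)      # predictions, shape (m,)
        loss = hypothesis - y              # residuals
        # mean squared error (halved, as is conventional), debugging only
        cost = np.sum(loss ** 2) / (2 * m)
        if i % 9999 == 0:
            print(i, cost)
        # average gradient over all m examples
        gradient = np.dot(xTrans, loss) / m
        # step downhill, scaled by the learning rate alpha
        theta = theta - alpha * gradient
    return theta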

5
http://stackoverflow.com/questions/17784587/gradient-descent-using-python-and-numpy
http://www.bogotobogo.com/python/python_numpy_batch_gradient_descent_algorithm.php

6 Simulated annealing search
Improves on gradient descent.
Idea: escape local maxima by allowing some “bad” moves, but gradually decrease their frequency.

7 http://katrinaeg.com/simulated-annealing.html

def anneal(solution):
    old_cost = cost(solution)
    T = 1.0
    T_min = 0.00001
    alpha = 0.9
    while T > T_min:
        i = 1
        while i <= 100:
            new_solution = neighbor(solution)
            new_cost = cost(new_solution)
            ap = acceptance_probability(old_cost, new_cost, T)
            if ap > random():
                solution = new_solution
                old_cost = new_cost
            i += 1
        T = T * alpha
    return solution, old_cost
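The anneal loop above depends on three problem-specific helpers — cost, neighbor, and acceptance_probability — that the slide leaves undefined (they come from the katrinaeg.com tutorial). A minimal sketch of what they might look like, assuming we are minimizing a one-dimensional function with the standard Metropolis acceptance rule; the quadratic objective and the ±0.1 neighborhood are made-up examples:

import math
from random import random, uniform

def cost(solution):
    # made-up objective: minimize (x - 3)^2
    return (solution - 3.0) ** 2

def neighbor(solution):
    # propose a random candidate near the current one
    return solution + uniform(-0.1, 0.1)

def acceptance_probability(old_cost, new_cost, T):
    # Metropolis rule: always accept improvements; accept worse moves
    # with probability e^((old_cost - new_cost) / T), which shrinks as T cools
    if new_cost < old_cost:
        return 1.0
    return math.exp((old_cost - new_cost) / T)

With these in scope (the from random import random also covers the bare random() call in anneal), anneal(0.0) should return a solution near x = 3.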

8 http://apmonitor.com/me575/index.php/Main/SimulatedAnnealing

9 Genetic Algorithms
Inspired by nature; based on reproduction and selection.
Start with randomly generated states (the population).
– A population member is represented by a state vector in the variable space (its DNA).
– A fitness function determines the quality of a state.
New states are generated from two parent states.
Throw some randomness into the mix as well…

10 Genetic algorithms

11 Normalized fitness function:
– 24/(24+23+20+11) = 31%
– 23/(24+23+20+11) = 29%
– … etc.

12 Genetic algorithms Probability of selection is weighted by the normalized fitness function.
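A small sketch of how the normalized fitness from slide 11 turns into selection probabilities (roulette-wheel selection); using random.choices for the weighted draw is my implementation choice, not something shown on the slides:

import random

fitnesses = [24, 23, 20, 11]
total = sum(fitnesses)                    # 78

# normalized fitness = probability of being selected as a parent
probs = [f / total for f in fitnesses]    # ~[0.31, 0.29, 0.26, 0.14]

# roulette-wheel selection: draw two parents, weighted by fitness
mom, dad = random.choices(range(len(fitnesses)), weights=fitnesses, k=2)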


14 Genetic algorithms

15 Genetic Algorithms
1. Initialize population (random states)
2. Calculate fitness function
3. Select pairs for crossover (weighted by normalized fitness)
4. Apply mutation (random)
5. Evaluate fitness of children
6. From the resulting population of individuals, probabilistically pick the best
7. Repeat
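Putting slides 9–15 together, a minimal sketch of the full loop on a toy problem — maximizing the number of 1-bits in a fixed-length bit string. The problem itself, the population size, and the mutation rate are illustrative assumptions, and step 6's probabilistic survivor selection is simplified here to keeping the fittest:

import random

GENES = 20            # length of each state vector (its "DNA")
POP_SIZE = 30
MUTATION_RATE = 0.05
GENERATIONS = 100

def fitness(state):
    # toy fitness function: count of 1-bits (higher is better)
    return sum(state)

def crossover(a, b):
    # single-point crossover of two parent states
    point = random.randint(1, GENES - 1)
    return a[:point] + b[point:]

def mutate(state):
    # flip each gene with small probability
    return [1 - g if random.random() < MUTATION_RATE else g for g in state]

# 1. initialize population with random states
pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP_SIZE)]

for gen in range(GENERATIONS):
    # 2. calculate fitness, used as selection weights
    weights = [fitness(s) for s in pop]
    children = []
    for _ in range(POP_SIZE):
        # 3. select a pair for crossover, weighted by normalized fitness
        a, b = random.choices(pop, weights=weights, k=2)
        # 4. apply random mutation to the child
        children.append(mutate(crossover(a, b)))
    # 5./6. evaluate the children and keep the fittest individuals
    pop = sorted(pop + children, key=fitness, reverse=True)[:POP_SIZE]

print(max(fitness(s) for s in pop))   # should approach GENES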

16 Example: N-Queens
Why does crossover make sense here? When wouldn’t it make sense?
What would mutation be?
What would a good fitness function be?
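For the fitness question, the textbook choice (e.g., for the 8-queens example in AIMA — the slide itself doesn't state an answer) is the number of non-attacking pairs of queens, with a board encoded as one gene per column:

def nqueens_fitness(state):
    # state[i] = row of the queen in column i
    # fitness = number of non-attacking pairs; a solved n-queens
    # board scores n*(n-1)/2 (28 for n = 8)
    n = len(state)
    ok = 0
    for i in range(n):
        for j in range(i + 1, n):
            same_row = state[i] == state[j]
            same_diag = abs(state[i] - state[j]) == j - i
            if not same_row and not same_diag:
                ok += 1
    return ok

Under this encoding, crossover splices two boards' column lists (sensible when good partial arrangements are roughly independent, less so when queen positions are tightly coupled), and mutation moves one queen to a random row in its column.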

17 Genetic Algorithms
Compared to gradient-based methods, GAs:
– can make random jumps, similar to simulated annealing
– give greater coverage: multiple state estimates at a time
– are less sensitive to local optima
– can be used for both continuous and discrete variables

18
Fitness function
Population, lots of parameters
Types of problems GAs can solve:
– http://nn.cs.utexas.edu/pages/research/rocket/
– http://nerogame.org/
Co-evolution:
– http://www.cs.utexas.edu/users/nn/pages/research/neatdemo.html
– http://www.cs.utexas.edu/users/nn/pages/research/robotmovies/clip7.gif
Where will GAs not do as well in robotics (or in general)?

