
1 Class 2, Part 1: Basic Search Concepts. The Hill Climbing Algorithm. Assignment 1 Statement. Design of Algorithms (Modern Heuristics). Gabriela Ochoa http://www.ldc.usb.ve/~gabro/

2 Basic Concepts
- Representation: encodes alternative candidate solutions for manipulation.
- Objective: describes the purpose to be fulfilled.
- Evaluation function: returns a specific value that indicates the quality of any particular solution under the given representation.

3 Search Problem
Definition: given a search space S and its feasible part F ⊆ S, find x ∈ F such that
- eval(x) ≤ eval(y) for all y ∈ F (minimization).
A point x that satisfies this condition is called a global solution. The terms "search problem" and "optimization problem" are considered synonymous: the search for the best solution is the optimization problem.

4 Boolean Satisfiability Problem (SAT)
An instance of the problem is defined by a Boolean expression written using only AND, OR, NOT, variables, and parentheses. The question is: given the expression, is there some assignment of TRUE and FALSE values to the variables that makes the entire expression true? SAT is of central importance in various areas of computer science, including theoretical computer science, algorithmics, artificial intelligence, and hardware design and verification.

5 Computational Complexity of SAT
SAT is NP-complete. In fact, it was the first known NP-complete problem, as proved by Stephen Cook in 1971. The problem remains NP-complete even if all expressions are written in conjunctive normal form with 3 variables per clause (3-CNF):
(x11 OR x12 OR x13) AND (x21 OR x22 OR x23) AND (x31 OR x32 OR x33) AND ...
where each xij is a variable, with or without a NOT in front of it, and each variable can appear multiple times in the expression.
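
To make the shape of a 3-CNF instance concrete, here is a minimal Python sketch (not from the slides): each clause is a triple of signed integers, where +i means variable xi and -i means NOT xi; the sample clauses are invented for illustration.

    # Hypothetical 3-CNF instance: (x1 OR NOT x2 OR x3) AND (NOT x1 OR x2 OR NOT x3)
    clauses = [(1, -2, 3), (-1, 2, -3)]

    def satisfies(assignment, clauses):
        # assignment[i] is True/False for variable x(i+1);
        # the expression is an AND of clauses, and each clause is an OR of literals
        def literal_true(lit):
            value = assignment[abs(lit) - 1]
            return value if lit > 0 else not value
        return all(any(literal_true(lit) for lit in clause) for clause in clauses)

    print(satisfies([True, True, False], clauses))   # True: both clauses are satisfied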

6 Problem Formulation (SAT)
Let us consider a problem of size 30 (i.e. 30 variables).
- Representation: a binary string of length 30, where 1 means TRUE and 0 means FALSE.
- Search space: 2 choices for each variable, taken over 30 variables, gives 2^30 possibilities.
- Objective: find the vector of bits such that the compound Boolean statement is satisfied (made true).
- Evaluation function? The objective alone does not provide enough information to guide the search.
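
One common remedy (an assumption here, not something the slide states) is to grade a candidate assignment by the number of clauses it satisfies, so that partial progress is visible to the search. A minimal Python sketch, reusing the signed-integer clause encoding from the previous example:

    # Hypothetical evaluation function for a CNF SAT instance:
    # count how many clauses the bit vector satisfies (higher is better).
    def eval_sat(bits, clauses):
        # bits is a list of 0/1 values; bits[i] encodes variable x(i+1)
        def literal_true(lit):
            value = bits[abs(lit) - 1] == 1
            return value if lit > 0 else not value
        return sum(1 for clause in clauses if any(literal_true(lit) for lit in clause))

    clauses = [(1, -2, 3), (-1, 2, -3)]      # toy instance for illustration
    print(eval_sat([1, 0, 0], clauses))      # 2: both clauses satisfied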

7 Travelling Salesman Problem (TSP)
Given a number of cities and the costs of travelling from one to another, what is the cheapest round trip that visits each city and then returns to the starting city? An equivalent formulation in terms of graph theory is: find the Hamiltonian cycle of least weight in a weighted graph.

8 Problem Formulation (TSP)
Let us consider a problem of size 30 (i.e. 30 cities).
- Representation: a permutation of the natural numbers 1, …, 30, where each number corresponds to a city and the order gives the sequence of visits.
- Search space: all permutations of the cities. For the symmetric TSP, where a circuit costs the same regardless of the starting city and direction, there are (n-1)!/2 distinct tours.
- Objective: minimize the total distance traversed, visiting each city once and returning to the starting city: min Σ dist(x, y) over consecutive cities in the tour.
- Evaluation function: maps each tour to its corresponding total distance.
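
A small Python sketch of this evaluation function; the distance matrix and city count below are illustrative, not from the slides:

    # Evaluate a TSP tour: total length of the closed circuit it describes.
    def tour_length(tour, dist):
        # tour is a permutation of city indices; dist[a][b] is the cost from a to b
        n = len(tour)
        return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

    # Toy symmetric instance with 4 cities for illustration
    dist = [[0, 2, 9, 10],
            [2, 0, 6, 4],
            [9, 6, 0, 3],
            [10, 4, 3, 0]]
    print(tour_length([0, 1, 2, 3], dist))   # 2 + 6 + 3 + 10 = 21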

9 Neighbourhoods and Local Optima
A neighbourhood is a region of the search space that is "near" to some particular point in that space.
[Figure: a search space S, a potential solution x, and its neighbourhood N(x)]

10 Defining Neighbourhoods 1
Define a distance function dist on the search space S:
- dist: S × S → R
- N(x) = { y ∈ S : dist(x, y) ≤ ε }
Examples:
- Euclidean distance, for search spaces defined over continuous variables
- Hamming distance, for search spaces defined over binary strings (e.g. SAT)
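
A minimal Python sketch of this distance-based definition using Hamming distance on bit strings; the value of ε and the helper names are illustrative:

    # Neighbourhood of x: all bit strings within Hamming distance eps of x.
    from itertools import product

    def hamming(a, b):
        return sum(ai != bi for ai, bi in zip(a, b))

    def neighbourhood(x, space, eps=1):
        return [y for y in space if hamming(x, y) <= eps]

    space = [list(bits) for bits in product([0, 1], repeat=3)]   # all 3-bit strings
    print(neighbourhood([0, 0, 0], space, eps=1))
    # [[0, 0, 0], [0, 0, 1], [0, 1, 0], [1, 0, 0]]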

11 Defining Neighbourhoods, TSP
Use a mapping m that defines a neighbourhood for any point x ∈ S.
- 2-swap mapping: generates a new set of potential solutions from a given solution x.
- Solutions are generated by swapping two cities in a given tour.
- Every solution has n(n-1)/2 neighbours.
- Example: 2 4 5 3 1 → 2 3 5 4 1
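
A short Python sketch of the 2-swap neighbourhood (the function name is an illustrative choice):

    # Generate all 2-swap neighbours of a tour: every tour obtained by
    # exchanging the positions of two cities. There are n(n-1)/2 of them.
    def two_swap_neighbours(tour):
        n = len(tour)
        neighbours = []
        for i in range(n):
            for j in range(i + 1, n):
                neighbour = tour[:]
                neighbour[i], neighbour[j] = neighbour[j], neighbour[i]
                neighbours.append(neighbour)
        return neighbours

    print(len(two_swap_neighbours([2, 4, 5, 3, 1])))                  # 10 = 5*4/2
    print([2, 3, 5, 4, 1] in two_swap_neighbours([2, 4, 5, 3, 1]))    # True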

12 Defining Neighbourhoods, SAT
- 1-flip mapping: generates a new set of potential solutions from a given solution x.
- Solutions are generated by flipping a single bit in the given bit string.
- Every solution has n neighbours.
- Example: 1 1 0 0 1 → 0 1 0 0 1 (1st bit flipped)
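
A matching Python sketch for the 1-flip neighbourhood on bit strings (the function name is illustrative):

    # Generate all 1-flip neighbours of a bit string: flip each bit in turn,
    # giving exactly n neighbours for a string of length n.
    def one_flip_neighbours(bits):
        neighbours = []
        for i in range(len(bits)):
            neighbour = bits[:]
            neighbour[i] = 1 - neighbour[i]
            neighbours.append(neighbour)
        return neighbours

    print(one_flip_neighbours([1, 1, 0, 0, 1])[0])   # [0, 1, 0, 0, 1] (1st bit flipped)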

13 Defining Neighbourhoods, Real Numbers
A Gaussian distribution defines a neighbourhood for each variable: the mean is the current point, and the standard deviation is 1/6 of the range of the variable.
- x = (x_1, …, x_n), where l_i ≤ x_i ≤ u_i
- x'_i = x_i + N(0, σ_i), where σ_i = (u_i - l_i)/6
- N(0, σ_i) is an independent random Gaussian number with mean zero and standard deviation σ_i
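
A brief Python sketch of this Gaussian perturbation; the bounds and variable count are illustrative, and clipping the result back into [l_i, u_i] is an added assumption the slide does not spell out:

    # Sample a neighbour of x by adding Gaussian noise to each variable,
    # with standard deviation equal to 1/6 of that variable's range.
    import random

    def gaussian_neighbour(x, lower, upper):
        neighbour = []
        for xi, li, ui in zip(x, lower, upper):
            sigma = (ui - li) / 6
            xi_new = xi + random.gauss(0, sigma)
            neighbour.append(min(max(xi_new, li), ui))   # keep within bounds (assumption)
        return neighbour

    print(gaussian_neighbour([0.5, 2.0], lower=[0.0, 0.0], upper=[1.0, 10.0]))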

14 Local Optimum
A potential solution x ∈ S is a local optimum with respect to the neighbourhood N if and only if eval(x) ≤ eval(y) for all y ∈ N(x) (minimization).
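
A small Python sketch that checks this condition directly, given any neighbourhood function such as the 1-flip mapping above; the names and the toy evaluation function are illustrative:

    # x is a local optimum (for minimization) if no neighbour evaluates better.
    def is_local_optimum(x, evaluate, neighbours):
        return all(evaluate(x) <= evaluate(y) for y in neighbours(x))

    def one_flip_neighbours(bits):
        return [bits[:i] + [1 - bits[i]] + bits[i+1:] for i in range(len(bits))]

    # Toy evaluation: number of ones, to be minimized
    print(is_local_optimum([0, 0, 0], evaluate=sum, neighbours=one_flip_neighbours))  # True
    print(is_local_optimum([0, 1, 0], evaluate=sum, neighbours=one_flip_neighbours))  # False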

15 Hillclimbing Methods - 1
- They use an iterative improvement technique.
- They start from a point (the current point) in the search space.
- In each iteration, a new point is selected from the neighbourhood of the current point.
- If the new point is better, it becomes the current point; otherwise another neighbouring point is selected and evaluated.
- The method terminates when there are no improvements, or when a predefined number of iterations is reached.

16 Hillclimbing Methods - 2
- They may converge to local optima, so the search usually has to be restarted from various starting points.
- Initial starting points may be chosen:
  - randomly
  - according to some regular pattern
  - based on other information (e.g. the results of a prior search)

17 Hillclimbing Methods - 3
Variations of hillclimbing algorithms differ in the way a new string is selected for comparison with the current string. One version of the simple (iterated) hillclimbing method is steepest-ascent hillclimbing.

18 Hillclimbing Methods - 4
Example problem: the search space is the set of binary strings v of length 30. The objective function f (to be maximized) is
f(v) = |11 * one(v) - 150|
where one(v) returns the number of ones in v.
For example, v1 = (110111101111011101101111010101) has one(v1) = 22, so f(v1) = |11 * 22 - 150| = 92.

19 Hillclimbing Methods - 5
procedure iterated hillclimber
begin
  t ← 0
  repeat
    local ← FALSE
    select a current string vc at random
    evaluate vc
    repeat
      form 30 new strings in the neighbourhood of vc by flipping single bits of vc
      select vn from the set of new strings with the largest value of the objective function f
      if f(vc) < f(vn) then vc ← vn
      else local ← TRUE
    until local
    t ← t + 1
  until t = MAX
end
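
A runnable Python sketch of this steepest-ascent iterated hillclimber, applied to the slide-18 objective f(v) = |11 * one(v) - 150|; the function names and the default number of restarts (standing in for MAX) are illustrative choices, not from the slides:

    import random

    N_BITS = 30

    def one(v):
        # number of ones in the bit string v (a list of 0/1 values)
        return sum(v)

    def f(v):
        # objective function from slide 18 (to be maximized)
        return abs(11 * one(v) - 150)

    def iterated_hillclimber(max_iterations=10):
        best = None
        for t in range(max_iterations):                          # outer repeat ... until t = MAX
            vc = [random.randint(0, 1) for _ in range(N_BITS)]   # random current string
            local = False
            while not local:
                # form the 30 one-flip neighbours of vc and pick the best (steepest ascent)
                neighbours = [vc[:i] + [1 - vc[i]] + vc[i+1:] for i in range(N_BITS)]
                vn = max(neighbours, key=f)
                if f(vc) < f(vn):
                    vc = vn                                      # accept the improving neighbour
                else:
                    local = True                                 # vc is a local optimum
            if best is None or f(vc) > f(best):
                best = vc
        return best, f(best)

    best, value = iterated_hillclimber()
    # The global optimum is the all-ones string (f = 180); the all-zeros string
    # is a local optimum with f = 150, so some restarts get stuck there.
    print(one(best), value)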

20 Hillclimbing Methods - 6
- The success or failure of each iteration depends on the starting point (success being defined as returning a local or a global optimum).
- In problems with many local optima, a global optimum may not be found.

21 Hillclimbing Methods - 7
Weaknesses:
- They usually terminate at solutions that are only local optima.
- There is no information as to how much the discovered local optimum deviates from the global optimum (or even from other local optima).
- The optimum obtained depends on the starting point.
- There is usually no upper bound on computation time.

22 Hillclimbing Methods - 8
Advantages:
- They are very easy to apply: only a representation, the evaluation function, and a measure that defines the neighbourhood around a point are needed.

