Analysis of the Traveling Salesman Problem and current approaches for solving it. Rishi B. Jethwa and Mayank Agarwal. CSE Department, University of Texas at Arlington.


Table of Contents
- Introduction
- Naïve and simple approaches
- Approaches using Dynamic Programming
- Approaches using Neural Networks and Genetic Algorithms
- Approaches using the Parallel Branch and Bound Technique

Traveling Salesman Problem. Given N points, find the shortest circular path that links all the points together. The constraints on the TSP are that the first and the last point of the tour must be the same and that edges cannot be repeated; vertices can be repeated. No polynomial-time algorithm is known for this kind of problem, whose number of candidate solutions grows combinatorially, so we are mainly interested in good solutions rather than exact ones.
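For concreteness in the sketches that follow, an instance can be represented simply as a list of (x, y) points with Euclidean distances. The helper below (a hypothetical name, not from the slides) just measures the length of a closed tour:

```python
import math

def tour_length(points, tour):
    """Length of the closed tour points[tour[0]] -> points[tour[1]] -> ...
    -> back to points[tour[0]]."""
    n = len(tour)
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % n]])
               for i in range(n))

# Example: visiting the corners of the unit square in order gives length 4.
square = [(0, 0), (1, 0), (1, 1), (0, 1)]
print(tour_length(square, [0, 1, 2, 3]))   # 4.0
```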

A naïve approach. Suppose you are given n points and you start with the first point: for the second position you are left with n-1 options, then n-2, and so on. Hence there are altogether (n-1)!/2 different paths, i.e. 0.5 × !(n-1) in APL notation. For n = 36, there are more than 10^38 different paths.
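A minimal sketch of the naïve approach: fix the first point and try every ordering of the remaining n-1 points, which is O((n-1)!) work and therefore only usable for very small n. The function name is illustrative.

```python
import math
from itertools import permutations

def brute_force_tsp(points):
    """Try every ordering of the points after the first; O((n-1)!) tours."""
    n = len(points)
    best_tour, best_len = None, float("inf")
    for rest in permutations(range(1, n)):
        tour = (0,) + rest
        length = sum(math.dist(points[tour[i]], points[tour[(i + 1) % n]])
                     for i in range(n))
        if length < best_len:
            best_tour, best_len = tour, length
    return best_tour, best_len
```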

Other simple approaches. Nearest Neighbour. Start at an arbitrary point and successively visit the nearest unvisited point. After all the points have been visited, return to the start point.
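A short sketch of the nearest-neighbour heuristic as described above; `points` is assumed to be a list of (x, y) coordinates and the tour starts arbitrarily at index 0.

```python
import math

def nearest_neighbour_tour(points):
    """Start at point 0 and repeatedly visit the nearest unvisited point."""
    unvisited = set(range(1, len(points)))
    tour = [0]
    while unvisited:
        last = points[tour[-1]]
        nxt = min(unvisited, key=lambda j: math.dist(last, points[j]))
        unvisited.remove(nxt)
        tour.append(nxt)
    return tour   # the tour closes by returning from the last point to point 0
```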

Other simple approaches. Minimum Spanning Tree. Construct the minimum spanning tree of the point set and duplicate all the links of the tree. Sequence the points as they would appear in a traversal of the doubled tree. Pass through the sequence and remove every appearance of a point after its first.
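A sketch of the MST heuristic: Prim's algorithm builds the spanning tree, and a depth-first walk that records each vertex only the first time it is reached is equivalent to doubling the tree's edges and then shortcutting repeated appearances. For points in the plane this gives the classic factor-2 approximation; the function name is illustrative.

```python
import math
from collections import defaultdict

def mst_tour(points):
    """MST heuristic: minimum spanning tree (Prim), depth-first walk, shortcut."""
    n = len(points)
    # Prim's algorithm, O(N^2): grow the tree from point 0.
    in_tree = [False] * n
    dist = [math.dist(points[0], p) for p in points]
    parent = [0] * n
    in_tree[0] = True
    adj = defaultdict(list)
    for _ in range(n - 1):
        u = min((i for i in range(n) if not in_tree[i]), key=dist.__getitem__)
        in_tree[u] = True
        adj[parent[u]].append(u)
        adj[u].append(parent[u])
        for v in range(n):
            if not in_tree[v] and math.dist(points[u], points[v]) < dist[v]:
                dist[v], parent[v] = math.dist(points[u], points[v]), u
    # Depth-first walk; keeping only the first appearance of each vertex is the
    # shortcut step applied to the doubled tree.
    tour, seen, stack = [], set(), [0]
    while stack:
        u = stack.pop()
        if u not in seen:
            seen.add(u)
            tour.append(u)
            stack.extend(reversed(adj[u]))
    return tour
```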

Other simple approaches. Strip. Partition the square into sqrt(N/3) vertical strips. Sequence the points in each strip by their vertical position, alternately top-to-bottom and bottom-to-top, and visit the strips from left to right. Finally, return to the starting point.
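A sketch of the strip heuristic, assuming the points lie in an axis-aligned square; the choice of sqrt(N/3) strips follows the slide, and the function name is illustrative.

```python
import math

def strip_tour(points):
    """Strip heuristic: sqrt(N/3) vertical strips, swept alternately up and
    down, strips visited left to right."""
    n = len(points)
    k = max(1, round(math.sqrt(n / 3)))          # number of strips
    xmin = min(p[0] for p in points)
    xmax = max(p[0] for p in points)
    width = (xmax - xmin) / k or 1.0             # guard against all-equal x
    strips = [[] for _ in range(k)]
    for i, (x, y) in enumerate(points):
        s = min(int((x - xmin) / width), k - 1)  # clamp the rightmost boundary
        strips[s].append(i)
    tour = []
    for s, strip in enumerate(strips):
        strip.sort(key=lambda i: points[i][1], reverse=(s % 2 == 1))
        tour.extend(strip)
    return tour   # closing the tour returns from the last point to the first
```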

Spacefilling Approach. Take a square that contains all the points and bisect it into two triangles, named 0 and 1. Bisect each triangle again into two triangles: the children of 0 are named 00 and 01, and the children of 1 are named 10 and 11. Continue this partition, naming each piece accordingly. Finally, traverse the small triangles in label order and visit the points that lie in them.
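A rough sketch of the triangle-bisection labelling described above, assuming the points have been scaled into the unit square. The key function below is an illustrative bisection index, not necessarily the exact labelling used in the original spacefilling-curve work.

```python
def side(p, a, b):
    """Sign of the cross product: which side of the line a->b the point p is on."""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def bisection_key(p, entry, exit_, apex, depth=20):
    """Binary-fraction label of p inside the triangle (entry, exit_, apex).
    The triangle is repeatedly split by the segment from the apex to the
    midpoint of the entry-exit edge: the half containing the entry vertex
    appends bit 0, the other half appends bit 1."""
    key, scale = 0.0, 0.5
    for _ in range(depth):
        mid = ((entry[0] + exit_[0]) / 2, (entry[1] + exit_[1]) / 2)
        if side(p, apex, mid) * side(entry, apex, mid) >= 0:
            entry, exit_, apex = entry, apex, mid        # bit 0: entry-side half
        else:
            key += scale                                 # bit 1: exit-side half
            entry, exit_, apex = apex, exit_, mid
        scale /= 2
    return key

def spacefilling_tour(points):
    """Split the unit square into triangles 0 and 1 along its diagonal, then
    visit each triangle's points in order of their bisection key."""
    a, b, c, d = (0, 0), (1, 0), (1, 1), (0, 1)
    lower = [i for i, p in enumerate(points) if side(p, a, c) <= 0]   # triangle 0
    upper = [i for i, p in enumerate(points) if side(p, a, c) > 0]    # triangle 1
    lower.sort(key=lambda i: bisection_key(points[i], a, c, b))
    upper.sort(key=lambda i: bisection_key(points[i], c, a, d))
    return lower + upper   # triangle 0 is walked a->c, triangle 1 walks back c->a
```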

Comparisons                          NN         MST            Strip         Spacefilling
Ease of coding                       Good       Good           Good          Poor
Memory                               O(N)       O(N)           O(N)          O(N)
Parallelizability                    Poor       Poor           Good          Good
Time to solve                        O(N^2)     O(N^2 log N)   O(N log N)    O(N log N)
Time to modify                       Resolve    Resolve        O(log N)      O(log N)
Performance on non-uniform data      Good       Good           Poor          Good

Diamond Method to solve TSP problems. Consider a big diamond (the APL diamond symbol) and divide it into four parts. Approximately one quarter of the points are located in each of the four quadrants. Apply the TSP heuristic to each of the four quadrants; a catenation of these four sub-paths produces a path with no loop, which is what we wanted.
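A hedged sketch of the diamond idea: split the points into four diamond-shaped quadrants around the centroid (using the two diagonal directions as boundaries), solve each quadrant with any sub-heuristic (nearest neighbour is used here purely as a stand-in), and catenate the four sub-paths in circular order. The function names and the choice of sub-solver are illustrative, not the slide's.

```python
import math

def nearest_neighbour_path(pts, indices):
    """Greedy open path over the given point indices (stand-in sub-solver)."""
    if not indices:
        return []
    path, remaining = [indices[0]], set(indices[1:])
    while remaining:
        last = pts[path[-1]]
        nxt = min(remaining, key=lambda j: math.dist(last, pts[j]))
        remaining.remove(nxt)
        path.append(nxt)
    return path

def diamond_tour(pts):
    """Partition the points into four quadrants of a diamond centred on the
    centroid, solve each quadrant, and join the sub-paths in circular order."""
    cx = sum(p[0] for p in pts) / len(pts)
    cy = sum(p[1] for p in pts) / len(pts)
    quadrants = [[], [], [], []]
    for i, (x, y) in enumerate(pts):
        u, v = (x - cx) + (y - cy), (y - cy) - (x - cx)   # 45-degree rotation
        quadrants[(u >= 0) * 1 + (v >= 0) * 2].append(i)
    tour = []
    for q in [0, 1, 3, 2]:      # circular, not raster, order keeps the joins short
        tour.extend(nearest_neighbour_path(pts, quadrants[q]))
    return tour
```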

Dynamic Programming treatment. In addition to the vertices, define d(i, j) as the distance between the ith and jth vertices. The final answer is then given by the recurrence

f(i; j1, j2, ..., jk) = min over 1 <= m <= k of { d(i, jm) + f(jm; j1, ..., jm-1, jm+1, ..., jk) }

First we compute f(i; j1, j2), then f(i; j1, j2, j3), and so on until we get f(i; j1, j2, ..., jn). The running time is reduced to n^2 * 2^(n-1), and the space requirement is about 6 times the value given by Stirling's formula, i.e. 2^(2m) / sqrt(pi * m).
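A compact sketch of this dynamic program (the Held-Karp recurrence) in Python, assuming Euclidean points. It is exact but stores one value per (subset, end-vertex) pair, so it is only practical for small n; the function name is illustrative.

```python
import math
from itertools import combinations

def held_karp(points):
    """Exact TSP by dynamic programming, O(n^2 * 2^n) time."""
    n = len(points)
    d = [[math.dist(p, q) for q in points] for p in points]

    # best[(S, j)] = length of the shortest path that starts at vertex 0,
    # visits every vertex of the frozenset S exactly once, and ends at j in S.
    best = {(frozenset([j]), j): d[0][j] for j in range(1, n)}

    for size in range(2, n):
        for S in combinations(range(1, n), size):
            S = frozenset(S)
            for j in S:
                rest = S - {j}
                best[(S, j)] = min(best[(rest, k)] + d[k][j] for k in rest)

    full = frozenset(range(1, n))
    return min(best[(full, j)] + d[j][0] for j in range(1, n))
```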

The Generalized TSP (GTSP). The difficulty of the GTSP is linked to the fact that it is a combination of two problems. Once a vertex is chosen, we must decide which cluster to visit first and then which vertex within it to visit first. Once the cluster path is determined, we are confronted with a minimum cycle path problem. This paper proposes a genetic algorithm for choosing a path, where a sequence of chromosomes represents a path.
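To make the genetic-algorithm idea concrete, here is a minimal sketch for the plain (non-clustered) TSP, with a permutation of city indices as the chromosome, order crossover, and swap mutation. The clustered GTSP version described in the paper would additionally encode which vertex of each cluster is used, which is omitted here; the population size, mutation rate, and selection scheme are illustrative choices, not the paper's.

```python
import math
import random

def tour_length(points, tour):
    n = len(tour)
    return sum(math.dist(points[tour[i]], points[tour[(i + 1) % n]])
               for i in range(n))

def order_crossover(p1, p2):
    """OX crossover: copy a slice from p1, fill the rest in p2's order."""
    n = len(p1)
    a, b = sorted(random.sample(range(n), 2))
    child = [None] * n
    child[a:b] = p1[a:b]
    fill = [c for c in p2 if c not in child[a:b]]
    for i, c in zip([i for i in range(n) if child[i] is None], fill):
        child[i] = c
    return child

def mutate(tour, rate=0.1):
    """Swap two cities with a small probability."""
    if random.random() < rate:
        i, j = random.sample(range(len(tour)), 2)
        tour[i], tour[j] = tour[j], tour[i]

def genetic_tsp(points, pop_size=100, generations=500):
    # pop_size and generations are illustrative defaults, not tuned values.
    n = len(points)
    pop = [random.sample(range(n), n) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=lambda t: tour_length(points, t))
        survivors = pop[:pop_size // 2]              # truncation selection
        children = []
        while len(survivors) + len(children) < pop_size:
            p1, p2 = random.sample(survivors, 2)
            child = order_crossover(p1, p2)
            mutate(child)
            children.append(child)
        pop = survivors + children
    return min(pop, key=lambda t: tour_length(points, t))
```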

The Self-Organizing Neural Network (SONN). The SONN is arranged with M neurons, each holding an (N-1)-dimensional weight vector. The original SONN can fail to converge when M = N; this paper proposes a new density function that guarantees convergence even when M = N. Also, for a problem with N cities the original SONN requires 2N neurons, whereas this approach requires only N.

Parallel Branch and Bound Technique. Uses parallel computers to solve large, randomly generated ATSP instances. The principal components of the algorithm are as follows:
i) Lower bounding technique: find lower bounds for the parallel ATSP algorithm by solving the assignment problem.
ii) Upper bounding heuristic: use the solution to the assignment problem to construct a solution to the ATSP.
iii) Branching rules: create two or more new sub-problems based on an assignment problem solution.
The algorithm is capable of using tens to hundreds of processors, depending on the problem size and difficulty. A sketch of the lower-bounding component appears below.
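A minimal sketch of component (i), assuming SciPy is available: the assignment-problem relaxation of the ATSP gives a lower bound because every tour is, in particular, an assignment. The helper names are hypothetical, and the parallel distribution, upper-bounding patching, and branching rules of the paper are not shown.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

def assignment_lower_bound(cost):
    """Lower bound for the ATSP: solve the assignment problem (AP) on the cost
    matrix with the diagonal forbidden. The optimal AP value can never exceed
    the optimal tour length (costs assumed nonnegative)."""
    c = np.array(cost, dtype=float)
    penalty = c.max() * len(c) + 1.0      # big-M so no city is assigned to itself
    np.fill_diagonal(c, penalty)
    rows, cols = linear_sum_assignment(c)
    return c[rows, cols].sum(), dict(zip(rows, cols))

def subtours(successor):
    """Split an AP solution (a successor map) into its cycles. A single cycle
    through all cities is already an optimal tour for that sub-problem."""
    unvisited, cycles = set(successor), []
    while unvisited:
        start = city = unvisited.pop()
        cycle = [city]
        while successor[city] != start:
            city = successor[city]
            unvisited.discard(city)
            cycle.append(city)
        cycles.append(cycle)
    return cycles
```

In the full branch and bound, an AP solution that forms a single cycle already solves that sub-problem; otherwise the algorithm typically branches by forbidding, one at a time, the arcs of a shortest subtour and re-solving the AP in each child.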

Parallel Branch and Bound Technique. Edward discusses the issues surrounding the implementation of a particular branch and bound algorithm for the TSP on a hypercube multicomputer. The paper uses the best-first search technique for the branching implementation. The TSP is one of the very few fully asynchronous applications that have been written for the hypercube so far.

THANKS.