1 CSE 417: Algorithms and Computational Complexity Winter 2001 Lecture 25 Instructor: Paul Beame.


2 What to do if the problem you want to solve is NP-hard
- You might have phrased your problem too generally
  - e.g., in practice, the graphs that actually arise are far from arbitrary
  - maybe they have some special characteristic that allows you to solve the problem in your special case; for example, the Clique problem is easy on "interval graphs"
- Search the literature to see if your special case has already been solved

3 What to do if the problem you want to solve is NP-hard
- Try to find an approximation algorithm
  - Maybe you can't get the size of the best Vertex Cover, but you can find one within a factor of 2 of the best:
    - Given graph G=(V,E), start with an empty cover
    - While there are still edges left in E:
      - Choose an edge e={u,v} in E and add both u and v to the cover
      - Remove all edges from E that touch either u or v
  - The edges chosen don't share any vertices, so the optimal cover size must be at least the # of edges chosen
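The loop above can be written out directly. A minimal Python sketch (illustrative, not from the slides; representing the graph as a list of vertex-pair tuples is an assumption):

```python
def vertex_cover_2approx(edges):
    """Greedy 2-approximation for Vertex Cover, as on the slide:
    repeatedly pick any remaining edge {u, v}, put both endpoints in
    the cover, and delete every edge touching u or v."""
    cover = set()
    remaining = list(edges)
    while remaining:
        u, v = remaining[0]          # choose an arbitrary remaining edge
        cover |= {u, v}
        # the chosen edges form a matching, so the optimum needs at least
        # one endpoint per chosen edge: |cover| <= 2 * OPT
        remaining = [e for e in remaining if u not in e and v not in e]
    return cover
```

For example, on the path 1-2-3-4 the algorithm returns a cover of size 4, while the optimum ({2, 3}) has size 2, matching the factor-2 bound.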

4 What to do if the problem you want to solve is NP-hard
- Try to find an approximation algorithm
  - Recent research has classified problems based on what kinds of approximations are possible if P ≠ NP
  - Best: (1+ε) factor for any ε > 0
    - packing and some scheduling problems, TSP in the plane
  - Some fixed constant factor > 1, e.g. 2, 3/2, 100
    - Vertex Cover, TSP in space, other scheduling problems
  - Θ(log n) factor
    - Set Cover, Graph Partitioning problems
  - Worst: Ω(n^(1-ε)) factor for any ε > 0
    - Clique, Independent-Set, Coloring

5 What to do if the problem you want to solve is NP-hard
- Try an algorithm that is provably fast "on average"
  - To even try this, one needs a model of what a typical instance is
  - Typically, people consider "random graphs"
    - e.g. all graphs with a given # of edges are equally likely
  - Problems:
    - real data doesn't look like random graphs
    - distributions of real data aren't analyzable

6 What to do if the problem you want to solve is NP-hard
- Try to search the space of possible hints in a more efficient way and hope it is quick enough
  - e.g. back-tracking search
  - For Satisfiability there are 2^n possible truth assignments
  - If we set the truth values one-by-one we might be able to figure out whole parts of the space to avoid, e.g. after setting x1 ← 1, x2 ← 0 we don't even need to set x3 or x4 to know that the assignment won't satisfy (¬x1 ∨ x2) ∧ (¬x2 ∨ x3) ∧ (x4 ∨ ¬x3) ∧ (x1 ∨ ¬x4)
  - For Satisfiability this seems to run in times like 2^(n/20) on typical hard instances
  - Related technique: branch-and-bound
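The pruning idea on this slide can be sketched as follows (an illustrative sketch; the integer-literal clause encoding, where k means x_k and -k means ¬x_k, is an assumption, not from the slides):

```python
def backtrack_sat(clauses, num_vars, assignment=None):
    """Back-tracking search for Satisfiability.  The saving over trying
    all 2^n assignments: as soon as some clause has every literal set
    to false by the partial assignment, the whole subtree is skipped."""
    if assignment is None:
        assignment = {}
    for clause in clauses:
        # prune: every literal in this clause is already false, so no
        # extension of the partial assignment can satisfy it
        if all(abs(l) in assignment and assignment[abs(l)] != (l > 0)
               for l in clause):
            return None
    if len(assignment) == num_vars:
        return assignment               # complete and nothing falsified
    var = len(assignment) + 1           # set x1, x2, ... in order
    for value in (True, False):
        assignment[var] = value
        result = backtrack_sat(clauses, num_vars, assignment)
        if result is not None:
            return result
    del assignment[var]
    return None
```

On the slide's formula, the partial assignment x1 = True, x2 = False falsifies the clause (¬x1 ∨ x2), so all four subtrees over x3 and x4 are pruned in one step.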

7 What to do if the problem you want to solve is NP-hard
- Use heuristic algorithms and hope they give good answers
  - No guarantees of quality
  - Many different types of heuristic algorithms
  - Many different options, especially for optimization problems, such as TSP, where we want the best solution
  - We'll mention several on the following slides

8 Heuristic algorithms for NP-hard problems
- Local search, for optimization problems
  - Needs a notion of two solutions being neighbors
    - Start at an arbitrary solution S
    - While there is a neighbor T of S that is better than S: S ← T
  - Usually fast, but often gets stuck in a local optimum and misses the global optimum
  - With some notions of neighbor it can take a long time in the worst case
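The two-line loop above, as a minimal Python sketch (illustrative; the neighbors and cost functions used below are assumed example inputs, not from the slides):

```python
def local_search(start, neighbors, cost):
    """Plain local search: from S, move to any neighbor T that is
    better than S, and stop when no neighbor improves.  The result is
    a local optimum, which need not be the global optimum."""
    current = start
    while True:
        better = [t for t in neighbors(current) if cost(t) < cost(current)]
        if not better:
            return current               # no improving neighbor: done
        current = better[0]              # S <- T
```

For example, minimizing (x-5)^2 over the integers with neighbors x-1 and x+1 walks straight downhill to x = 5.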

9 e.g., Neighboring solutions for TSP
[Figure: two tours, Solution S and Solution T, differing in one pair of edges]
Two solutions are neighbors iff there is a pair of edges you can swap to transform one into the other
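This edge-swap neighborhood is commonly called the "2-opt" move: deleting two edges and reconnecting the tour is the same as reversing a contiguous segment. A local-search sketch over it (illustrative; the distance-matrix input format is an assumption):

```python
import math

def tour_length(tour, dist):
    """Total length of a closed tour, given a distance matrix."""
    n = len(tour)
    return sum(dist[tour[i]][tour[(i + 1) % n]] for i in range(n))

def two_opt(tour, dist):
    """Local search for TSP with the 2-opt neighborhood: repeatedly
    reverse a segment whenever that shortens the tour, and stop at a
    local optimum."""
    n = len(tour)
    improved = True
    while improved:
        improved = False
        for i in range(n - 1):
            for j in range(i + 2, n):
                # neighbor of `tour`: reverse positions i+1 .. j
                cand = tour[:i + 1] + tour[i + 1:j + 1][::-1] + tour[j + 1:]
                if tour_length(cand, dist) < tour_length(tour, dist):
                    tour, improved = cand, True
    return tour
```

On four corner points of a unit square, the crossing tour of length 2 + 2√2 is repaired to the perimeter tour of length 4 by a single swap.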

10 Heuristic algorithms for NP-hard problems
- Randomized local search
  - Start local search several times from random starting points and take the best answer found from each point
  - More expensive than plain local search, but usually much better answers
- Simulated annealing
  - Like local search, but at each step sometimes move to a worse neighbor with some probability
  - The probability of going to a worse neighbor is set to decrease with time as, presumably, the solution is closer to optimal
  - Helps avoid getting stuck in a local optimum, but often slow to converge (much more expensive than randomized local search)
  - Analogy with slow cooling to reach the lowest energy state in a crystal (or in forging a metal)
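Simulated annealing can be sketched in a few lines (illustrative; the standard exp(-Δ/T) acceptance rule and the geometric cooling schedule are assumptions about the details, which the slide leaves open):

```python
import math
import random

def simulated_annealing(start, neighbor, cost,
                        temp0=1.0, cooling=0.995, steps=5000):
    """Like local search, but a worse neighbor is accepted with
    probability exp(-increase / T), where the temperature T decreases
    over time, so uphill moves become rarer as the run proceeds."""
    random.seed(0)                   # fixed seed so the example is repeatable
    current = best = start
    t = temp0
    for _ in range(steps):
        cand = neighbor(current)
        delta = cost(cand) - cost(current)
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = cand           # downhill always; uphill sometimes
            if cost(current) < cost(best):
                best = current
        t *= cooling                 # lower the "temperature"
    return best
```

Early on (large T) the walk explores freely and can escape local optima; late in the run (small T) it behaves like plain local search.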

11 Heuristic algorithms for NP-hard problems
- Genetic algorithms
  - View each solution as a string (analogy with DNA)
  - Maintain a population of good solutions
  - Allow random mutations of single characters of individual solutions
  - Combine two solutions by taking part of one and part of another (analogy with crossover in sexual reproduction)
  - Get rid of solutions that have the worst values and make multiple copies of solutions that have the best values (analogy with natural selection: survival of the fittest)
  - Little evidence that they work well, and they are usually very slow
    - as much religion as science
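The ingredients listed above (mutation, crossover, selection) can be seen together in a toy sketch. Everything here is an illustrative assumption: the "OneMax" fitness function (count the 1s in a bit string) and all parameter values are chosen only to make the mechanics visible, not taken from the slides.

```python
import random

def genetic_onemax(n_bits=20, pop_size=30, generations=60):
    """Toy genetic algorithm for OneMax (maximize the number of 1s):
    truncation selection, one-point crossover, point mutation."""
    random.seed(1)                                 # repeatable example
    fitness = sum                                  # count the 1s
    pop = [[random.randint(0, 1) for _ in range(n_bits)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:pop_size // 2]            # survival of the fittest
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = random.sample(survivors, 2)     # two parent solutions
            cut = random.randrange(1, n_bits)      # crossover point
            child = a[:cut] + b[cut:]
            if random.random() < 0.3:              # occasional random mutation
                i = random.randrange(n_bits)
                child[i] ^= 1
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)
```

Because the fittest half always survives, the best fitness never decreases; whether this beats simpler heuristics on a real problem is, as the slide says, far from guaranteed.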

12 Heuristic algorithms
- Artificial neural networks
  - Based on a very elementary model of human neurons
  - Set up a circuit of artificial neurons
    - each artificial neuron is an analog circuit gate whose computation depends on a set of connection strengths
  - Train the circuit
    - adjust the connection strengths of the neurons by giving many positive & negative training examples and seeing if it behaves correctly
  - The network is then ready to use
  - Useful for ill-defined classification problems such as optical character recognition, but not for typical cut & dried problems

13 Other fun directions
- DNA computing
  - Each possible hint for an NP problem is represented as a string of DNA
    - fill a test tube with all possible hints
  - View the verification algorithm as a series of tests
    - e.g. checking that each clause is satisfied, in the case of Satisfiability
  - For each test in turn:
    - use lab operations to filter out all DNA strings that fail the test (works in parallel on all strings; uses PCR)
  - If any string remains, the answer is YES
  - Relies on the fact that Avogadro's number, 6 × 10^23, is large enough to get enough strings into a test tube
  - Error-prone, and so far only problem sizes less than 15!

14 Other fun directions
- Quantum computing
  - Use physical processes at the quantum level to implement weird kinds of circuit gates
    - unitary transformations
  - Quantum objects can be in a superposition of many pure states at once
    - can have n objects together in a superposition of 2^n states
  - Each quantum circuit gate operates on the whole superposition of states at once
    - inherent parallelism
  - Need totally new kinds of algorithms to work well
  - Theoretically able to factor efficiently, but huge practical problems: errors, decoherence