EE 5301 – VLSI Design Automation I (Fall 2003)
Kia Bazargan, University of Minnesota
Part II: Algorithms


References and Copyright

Textbooks referred (none required):
- [Mic94] G. De Micheli, "Synthesis and Optimization of Digital Circuits", McGraw-Hill.
- [CLR90] T. H. Cormen, C. E. Leiserson, R. L. Rivest, "Introduction to Algorithms", MIT Press.
- [Sar96] M. Sarrafzadeh, C. K. Wong, "An Introduction to VLSI Physical Design", McGraw-Hill.
- [She99] N. Sherwani, "Algorithms For VLSI Physical Design Automation", Kluwer Academic Publishers, 3rd edition, 1999.

References and Copyright (cont.)

Slides used (modified by Kia when necessary):
- [©Sarrafzadeh] © Majid Sarrafzadeh, 2001; Department of Computer Science, UCLA
- [©Sherwani] © Naveed A. Sherwani, 1992 (companion slides to [She99])
- [©Keutzer] © Kurt Keutzer, Dept. of EECS, UC-Berkeley
- [©Gupta] © Rajesh Gupta, UC-Irvine

Combinatorial Optimization

Problems with discrete variables. Examples:
- SORTING: given N integers, write them in increasing order.
- KNAPSACK: given a bag of size S and items {(s1, w1), (s2, w2), ..., (sn, wn)}, where si and wi are the size and value of item i respectively, find a subset of items with maximum overall value that fits in the bag.
- A problem vs. a problem instance.

Problem complexity:
- Measures of complexity
- Complexity cases: average case, worst case
- Tractability: solvable in polynomial time [©Gupta]

Algorithm

An algorithm defines a procedure for solving a computational problem. Examples:
- Quick sort, bubble sort, insertion sort, heap sort
- Dynamic programming method for the knapsack problem

Definition of complexity:
- Run time on deterministic, sequential machines
- Based on resources needed to implement the algorithm
  o Needs a cost model: memory, hardware/gates, communication bandwidth, etc.
  o Example: RAM model with a single processor: running time ∝ # operations [©Gupta]
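The dynamic-programming method for the knapsack problem mentioned above can be sketched as follows. This is not from the slides, just a minimal Python illustration; the item sizes and values in the example call are made up.

```python
def knapsack(capacity, items):
    """0/1 knapsack via dynamic programming.
    items: list of (size, value) pairs; returns the best total value
    achievable within the given capacity."""
    best = [0] * (capacity + 1)   # best[c] = max value using capacity c
    for size, value in items:
        # Walk capacities downward so each item is used at most once.
        for c in range(capacity, size - 1, -1):
            best[c] = max(best[c], best[c - size] + value)
    return best[capacity]

print(knapsack(10, [(5, 10), (4, 40), (6, 30), (3, 50)]))  # → 90
```

The table has O(S·n) entries, which is polynomial in the numeric value of S but exponential in its bit length, which is why KNAPSACK is still a hard problem in general.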

Algorithm (cont.)

Definition of complexity (cont.). Example: bubble sort:

  for (j = 0; j < N - 1; j++) {
    for (i = 0; i < N - 1 - j; i++) {
      if (a[i] > a[i+1]) {      /* swap adjacent out-of-order elements */
        hold = a[i];
        a[i] = a[i+1];
        a[i+1] = hold;
      }
    }
  }

Scalability with respect to input size is important:
- How does the running time of an algorithm change when the input size doubles?
- Running time is a function of the input size (n). Examples: n^2 + 3n, 2^n, n log n, ...
- Generally, large input sizes are of interest (n > 1,000 or even n > 1,000,000).
- What if I use a better compiler? What if I run the algorithm on a machine that is 10x faster? [©Gupta]

Function Growth Examples

(The slide's chart comparing the growth of common complexity functions is not included in this transcript.)
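Since the transcript drops the slide's growth chart, a small script (not part of the slides) can reproduce the idea by tabulating a few common complexity functions for increasing n:

```python
import math

# Common complexity functions, evaluated at a few input sizes to show
# how differently they scale.
funcs = {
    "n": lambda n: n,
    "n log n": lambda n: n * math.log2(n),
    "n^2": lambda n: n ** 2,
    "2^n": lambda n: 2 ** n,
}

for n in (10, 20, 40):
    row = ", ".join(f"{name} = {f(n):,.0f}" for name, f in funcs.items())
    print(f"n = {n}: {row}")
```

Doubling n roughly doubles n, quadruples n^2, and squares 2^n, which is the point of the slide: for large inputs only the growth rate matters, not constant factors.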

Asymptotic Notions

Idea: a notation that ignores the "constants" and describes the "trend" of a function for large values of the input.

Definition:
- Big-Oh notation: f(n) = O(g(n)) if constants K and n0 can be found such that: ∀ n ≥ n0, f(n) ≤ K·g(n).
  g is called an "upper bound" for f (f is "of order" g: f will not grow larger than g by more than a constant factor).

Examples: (1/3)·n^2 = O(n^2) (also O(n^3)); 0.02·n^2 + n = O(n^2).

Asymptotic Notions (cont.)

Definition (cont.):
- Big-Omega notation: f(n) = Ω(g(n)) if constants K and n0 can be found such that: ∀ n ≥ n0, f(n) ≥ K·g(n).
  g is called a "lower bound" for f.
- Big-Theta notation: f(n) = Θ(g(n)) if g is both an upper and a lower bound for f.
  Describes the growth of a function more accurately than O or Ω.

Examples: n^2 + n = Θ(n^2); 4^n = Ω(n^2).

Asymptotic Notions (cont.)

How to find the order of a function?
- Not always easy, especially if you start from an algorithm.
- Focus on the "dominant" term:
  o 4·n^3 + n^2 + log n → O(n^3)
  o n + n·log(n) → n·log(n)
- Ordering of growth rates: n! > K^n > n^K > log n > log log n > K (for a constant K).
  Examples: n > log n, n log n > n, n! > n^10.

What do asymptotic notations mean in practice?
- If algorithm A has time complexity O(n^2) and algorithm B has time complexity O(n log n), then algorithm B is better.
- If problem P has a lower bound of Ω(n log n), then there is NO WAY you can find an algorithm that solves the problem in O(n) time.

Problem Tractability

Problems are classified into "easier" and "harder" categories:
- Class P: a polynomial-time algorithm is known for the problem (hence, it is a tractable problem).
- Class NP (non-deterministic polynomial time): a polynomial solution has not been found yet (and probably does not exist); an exact (optimal) solution can be found using an algorithm with exponential time complexity.

Unfortunately, most CAD problems are NP. Be happy with a "reasonably good" solution. [©Gupta]

Algorithm Types

Based on quality of solution and computational effort:
- Deterministic
- Probabilistic or randomized
- Approximation
- Heuristics: local search

Problem vs. algorithm complexity [©Gupta]

Deterministic Algorithm Types

Algorithms usually used for P problems:
- Exhaustive search! (aka exponential)
- Dynamic programming
- Divide & conquer (aka hierarchical)
- Greedy
- Mathematical programming
- Branch and bound

Algorithms usually used for NP problems (not seeking the "optimal" solution, but a "good" one):
- Greedy (aka heuristic)
- Genetic algorithms
- Simulated annealing
- Restrict the problem to a special case that is in P

Graph Definition

Graph: a set of "objects" and their "connections". Formal definition:
- G = (V, E), V = {v1, v2, ..., vn}, E = {e1, e2, ..., em}
- V: set of vertices (nodes); E: set of edges (links, arcs)
- Directed graph: ek = (vi, vj)
- Undirected graph: ek = {vi, vj}
- Weighted graph: w: E → R, w(ek) is the "weight" of ek.

(Figure: the same five-vertex example graph on a, b, c, d, e, drawn once directed and once undirected.)

Graph Representation: Adjacency List

(Figure: the example graph stored as adjacency lists; each vertex a through e heads a linked list of its neighbors, shown once for the directed and once for the undirected version.)

Graph Representation: Adjacency Matrix

(Figure: the example graph stored as a |V| × |V| matrix of 0/1 entries, rows and columns indexed by a through e, shown once for the directed and once for the undirected version.)
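The two representations above can be sketched in Python. The edge set below is illustrative only, since the slide's figure is not recoverable from the transcript:

```python
# A small undirected graph on vertices a–e (edge set chosen for
# illustration; the slide's exact figure is not in the transcript).
vertices = ["a", "b", "c", "d", "e"]
edges = [("a", "b"), ("a", "c"), ("b", "d"), ("c", "d"), ("d", "e")]

# Adjacency list: one neighbor list per vertex.
adj_list = {v: [] for v in vertices}
for u, v in edges:
    adj_list[u].append(v)
    adj_list[v].append(u)

# Adjacency matrix: |V| x |V| 0/1 entries.
idx = {v: i for i, v in enumerate(vertices)}
n = len(vertices)
adj_matrix = [[0] * n for _ in range(n)]
for u, v in edges:
    adj_matrix[idx[u]][idx[v]] = 1
    adj_matrix[idx[v]][idx[u]] = 1

print(adj_list["d"])         # → ['b', 'c', 'e']
print(adj_matrix[idx["a"]])  # → [0, 1, 1, 0, 0]
```

The list uses O(|V| + |E|) space and is best for sparse graphs; the matrix uses O(|V|^2) space but answers "is (u, v) an edge?" in O(1).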

Edge / Vertex Weights in Graphs

Edge weights:
- Usually represent the "cost" of an edge. Examples:
  o Distance between two cities
  o Width of a data bus
- Representation:
  o Adjacency matrix: instead of 0/1, keep the weight
  o Adjacency list: keep the weight in the linked-list item

Node weights:
- Usually used to enforce some "capacity" constraint. Examples:
  o The size of gates in a circuit
  o The delay of operations in a "data dependency graph"

Graph Search Algorithms

Purpose: to visit all the nodes. Algorithms:
- Depth-first search
- Breadth-first search
- Topological sort

(Figures [©Sherwani]: a seven-vertex example graph on A through G with its visit orders under each algorithm.)

Depth-First Search Algorithm

struct vertex {
  ...
  int marked;
};

dfs(v)
  v.marked ← 1
  print v
  for each (v, u) ∈ E
    if (u.marked != 1)   // not visited yet?
      dfs(u)

Algorithm DEPTH_FIRST_SEARCH(V, E)
  for each v ∈ V
    v.marked ← 0         // not visited yet
  for each v ∈ V
    if (v.marked == 0)
      dfs(v)
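A runnable counterpart to the pseudocode, sketched in Python over an adjacency-list dict (the example graph is made up):

```python
def dfs(adj, start, visited=None, order=None):
    """Recursive depth-first search over an adjacency-list dict.
    Returns vertices in the order they are first visited."""
    if visited is None:
        visited, order = set(), []
    visited.add(start)           # mark, then explore successors
    order.append(start)
    for u in adj[start]:
        if u not in visited:     # not visited yet?
            dfs(adj, u, visited, order)
    return order

adj = {"A": ["B", "C"], "B": ["D"], "C": [], "D": []}
print(dfs(adj, "A"))  # → ['A', 'B', 'D', 'C']
```

As in the slide's outer DEPTH_FIRST_SEARCH loop, a disconnected graph would need dfs restarted from every still-unvisited vertex.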

Breadth-First Search Algorithm

bfs(v, Q)
  v.marked ← 1
  for each (v, u) ∈ E
    if (u.marked != 1)    // not visited yet?
      Q ← Q + u

Algorithm BREADTH_FIRST_SEARCH(V, E)
  Q ← {v0}                // an arbitrary node
  while Q != {}
    v ← Q.pop()
    if (v.marked != 1)
      print v
      bfs(v, Q)           // explore successors

There is something wrong with this code. What is it?
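For comparison, a conventional iterative BFS in Python that marks each vertex at the moment it is enqueued, so no vertex can enter the queue more than once (a sketch with a made-up graph, not from the slides):

```python
from collections import deque

def bfs(adj, start):
    """Iterative breadth-first search over an adjacency-list dict.
    Vertices are marked on enqueue, so each enters the queue at most once."""
    visited = {start}
    order = []
    q = deque([start])
    while q:
        v = q.popleft()
        order.append(v)
        for u in adj[v]:
            if u not in visited:   # not visited yet?
                visited.add(u)     # mark before enqueueing
                q.append(u)
    return order

adj = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(bfs(adj, "A"))  # → ['A', 'B', 'C', 'D']
```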

Distance in (Non-Weighted) Graphs

Distance dG(u,v): length of a shortest u--v path in G.

(Figure: one graph where d(u,v) = 2, and a disconnected pair x, y with d(x,y) = ∞.)

Moore's Breadth-First Search Algorithm

Objective: find d(u,v) for a given pair (u,v) and a shortest path u--v.

How does it work? Do BFS, and assign λ(w) the first time you visit a node; λ(w) = depth in the BFS.

Data structure:
- Q: a queue containing vertices to be visited
- λ(w): length of the shortest path u--w (initially ∞)
- π(w): parent node of w on u--w

Moore's Breadth-First Search Algorithm

Algorithm:
1. Initialize:
   o λ(w) ← ∞ for w ≠ u
   o λ(u) ← 0
   o Q ← Q + u
2. If Q ≠ ∅, x ← pop(Q); else stop: "no path u--v".
3. For all (x,y) ∈ E:
   o if λ(y) = ∞, then π(y) ← x, λ(y) ← λ(x) + 1, Q ← Q + y
4. If λ(v) = ∞, return to step 2.
5. Follow the π(w) pointers from v to u.
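The five steps above can be sketched as runnable Python (the example graph is made up; the slides' figure is not in the transcript):

```python
from collections import deque

INF = float("inf")

def moore_shortest_path(adj, u, v):
    """Moore's BFS: returns (distance, path) from u to v in an
    unweighted graph, or (inf, None) if no path exists."""
    dist = {w: INF for w in adj}     # λ(w), initially ∞
    parent = {w: None for w in adj}  # π(w)
    dist[u] = 0
    q = deque([u])
    while q:
        x = q.popleft()
        if x == v:                   # λ(v) is now final
            break
        for y in adj[x]:
            if dist[y] == INF:       # first visit sets λ and π
                dist[y] = dist[x] + 1
                parent[y] = x
                q.append(y)
    if dist[v] == INF:
        return INF, None             # "no path u--v"
    path = [v]                       # follow π pointers from v to u
    while path[-1] != u:
        path.append(parent[path[-1]])
    return dist[v], path[::-1]

adj = {"u": ["a", "b"], "a": ["c"], "b": ["c"], "c": ["v"], "v": []}
print(moore_shortest_path(adj, "u", "v"))  # → (3, ['u', 'a', 'c', 'v'])
```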

Moore's Breadth-First Search Algorithm: Example

Trace on the slides' ten-vertex example graph (the figure is not recoverable from this transcript):

  Step 0: Q = u.                λ(u) = 0; λ = ∞ elsewhere.
  Step 1: Q = v1, v2, v5.       λ = 1 for v1, v2, v5; π(v1) = π(v2) = π(v5) = u.
  Step 2: Q = v4, v3, v6, v10.  λ = 2 for v3, v4, v6, v10; π(v3) = v2, π(v4) = v1, π(v6) = v5, π(v10) = v5.
  Step 3: Q = v7.               λ(v7) = 3; π(v7) = v4.

Notes on Moore's Algorithm

- Why does the problem of the BREADTH_FIRST_SEARCH algorithm not exist here?
- Time complexity?
- Space complexity?

Distance in Weighted Graphs

Why doesn't a simple BFS work any more?
- Your locally best solution is not your globally best one.
- First, v1 and v2 will be visited: v1 with λ = 3, π = u, and v2 with λ = 11, π = u.
- But v2 should be revisited: the path through v1 is shorter, giving λ = 5, π = v1.

Dijkstra's Algorithm

Objective: find d(u,v) for all pairs (u,v) (fixed u) and the corresponding shortest paths u--v.

How does it work?
- Start from the source; augment the set of nodes whose shortest path is found.
- Decrease λ(w) from ∞ to d(u,w) as you find shorter distances to w; π(w) is changed accordingly.

Data structure:
- S: the set of nodes whose d(u,v) is found
- λ(w): current length of the shortest path u--w
- π(w): current parent node of w on u--w

Dijkstra's Algorithm

Algorithm:
1. Initialize:
   o λ(v) ← ∞ for v ≠ u
   o λ(u) ← 0
   o S ← {u}
2. For each v ∈ S' s.t. (ui, v) ∈ E, where ui is the node most recently added to S:   // O(e) over all iterations
   o if λ(v) > λ(ui) + w(ui, v), then λ(v) ← λ(ui) + w(ui, v), π(v) ← ui
3. Find m = min{λ(v) | v ∈ S'} and the vj with λ(vj) = m:                             // O(v^2) over all iterations
   o S ← S ∪ {vj}
4. If |S| < |V|, go to step 2.
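A common heap-based variant of Dijkstra's algorithm, sketched in Python. The slides describe the O(v^2) array version; this sketch and its tiny test graph (which mirrors the earlier weighted example, with edges u→v1 = 3, u→v2 = 11, v1→v2 = 2) are illustrative:

```python
import heapq

def dijkstra(adj, source):
    """Single-source shortest paths with a binary heap.
    adj: dict vertex -> list of (neighbor, weight); weights nonnegative.
    Returns (dist, parent), i.e., the λ and π maps of the slides."""
    dist = {v: float("inf") for v in adj}
    parent = {v: None for v in adj}
    dist[source] = 0
    heap = [(0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist[u]:
            continue                 # stale heap entry; skip
        for v, w in adj[u]:
            if dist[u] + w < dist[v]:   # relax edge (u, v)
                dist[v] = dist[u] + w
                parent[v] = u
                heapq.heappush(heap, (dist[v], v))
    return dist, parent

adj = {
    "u": [("v1", 3), ("v2", 11)],
    "v1": [("v2", 2)],
    "v2": [],
}
dist, parent = dijkstra(adj, "u")
print(dist["v2"], parent["v2"])  # → 5 v1
```

With a heap this runs in O(e log v) instead of the O(v^2) of the array version; which is faster depends on how dense the graph is.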

Dijkstra's Algorithm: Why Does It Work?

Proof by contradiction:
- Suppose v is the first node added to S such that λ(v) > d(u0, v).
- Let x be the first node of S' on a shortest u0--v path. Then λ(x) = d(u0, x) ≤ d(u0, v) < λ(v), so the algorithm would have chosen x before v: contradiction.

Minimum Spanning Tree (MST)

Tree (usually undirected):
- Connected graph with no cycles
- |E| = |V| - 1

Spanning tree:
- Connected subgraph that covers all vertices
- If the original graph is not a tree, it has several spanning trees

Minimum spanning tree:
- Spanning tree with minimum sum of edge weights (among all spanning trees)
- Example: build a railway system to connect N cities, with the smallest total length of railroad

Minimum Spanning Tree Algorithms

Basic idea:
- Start from a vertex (or edge), and expand the tree, avoiding loops (i.e., add a "safe" edge)
- Pick the minimum-weight edge at each step

Known algorithms:
- Prim: start from a vertex, expand the connected tree
- Kruskal: start with the minimum-weight edge, add minimum-weight edges while avoiding cycles (build a forest of small trees, merge them)
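Kruskal's algorithm, which the slides describe but do not give pseudocode for, can be sketched in Python with a simple union-find to detect cycles (the example edges are made up):

```python
def kruskal(vertices, edges):
    """Kruskal's MST: scan edges in increasing weight order, adding each
    edge that does not close a cycle (tracked with union-find).
    edges: list of (weight, u, v). Returns (total_weight, tree_edges)."""
    parent = {v: v for v in vertices}

    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]   # path halving
            v = parent[v]
        return v

    total, tree = 0, []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                 # "safe" edge: merges two different trees
            parent[ru] = rv
            tree.append((u, v, w))
            total += w
    return total, tree

edges = [(1, "a", "b"), (4, "a", "c"), (3, "b", "c"), (2, "c", "d"), (5, "b", "d")]
print(kruskal("abcd", edges))  # → (6, [('a', 'b', 1), ('c', 'd', 2), ('b', 'c', 3)])
```

The "forest of small trees" in the slide's description is exactly the union-find structure: each set is one partial tree, and adding a safe edge merges two sets.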

Prim's Algorithm for MST

Data structure:
- S: set of nodes added to the tree so far
- S': set of nodes not added to the tree yet
- T: the edges of the MST built so far
- λ(w): current length of the shortest edge (v, w) that connects w to the current tree
- π(w): potential parent node of w in the final MST (current parent that connects w to the current tree)

Prim's Algorithm

- Initialize S, S' and T:
  o S ← {u0}, S' ← V \ {u0}   // u0 is any vertex
  o T ← { }
  o ∀ v ∈ S', λ(v) ← ∞
- Initialize λ and π for the vertices adjacent to u0:
  o for each v ∈ S' s.t. (u0, v) ∈ E: λ(v) ← ω((u0, v)), π(v) ← u0
- While (S' ≠ ∅):
  o find u ∈ S' s.t. ∀ v ∈ S', λ(u) ≤ λ(v)
  o S ← S ∪ {u}, S' ← S' \ {u}, T ← T ∪ {(π(u), u)}
  o for each v s.t. (u, v) ∈ E: if λ(v) > ω((u, v)) then λ(v) ← ω((u, v)), π(v) ← u
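A runnable sketch of Prim's algorithm in Python. Note it uses a lazy-deletion heap rather than the slides' explicit λ/π arrays, and the example graph is made up:

```python
import heapq

def prim(adj, root):
    """Prim's MST grown from root.
    adj: dict vertex -> list of (neighbor, weight), undirected.
    Returns (total_weight, tree_edges)."""
    in_tree = {root}                 # the set S of the slides
    tree, total = [], 0
    # Heap of candidate edges (weight, tree_vertex, outside_vertex).
    heap = [(w, root, v) for v, w in adj[root]]
    heapq.heapify(heap)
    while heap and len(in_tree) < len(adj):
        w, u, v = heapq.heappop(heap)
        if v in in_tree:
            continue                 # stale entry: v was added earlier
        in_tree.add(v)               # minimum-weight safe edge (u, v)
        tree.append((u, v, w))
        total += w
        for x, wx in adj[v]:
            if x not in in_tree:
                heapq.heappush(heap, (wx, v, x))
    return total, tree

adj = {
    "a": [("b", 1), ("c", 4)],
    "b": [("a", 1), ("c", 3), ("d", 5)],
    "c": [("a", 4), ("b", 3), ("d", 2)],
    "d": [("b", 5), ("c", 2)],
}
total, tree = prim(adj, "a")
print(total)  # → 6
```

Popping the minimum candidate edge corresponds to the slides' "find u ∈ S' with minimum λ(u)", and pushing v's edges corresponds to the λ/π update loop.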

Prim's Algorithm Example

(The slides step through Prim's algorithm on a five-vertex example graph, one λ/π update per slide; the figure and the table values are not recoverable from this transcript. S grows one vertex per iteration:

  S = {v1} → {v1, v2} → {v1, v2, v3} → {v1, v2, v3, v4} → {v1, v2, v3, v4, v5}

with the recoverable parent pointers settling to π(v2) = v1, π(v3) = v2, π(v4) = v3.)

Other Graph Algorithms of Interest...

- Min-cut partitioning
- Graph coloring
- Maximum clique, independent set
- Min-cut algorithms
- Steiner tree
- Matching
- ...