ICS 252 Introduction to Computer Design
Fall 2008
Eli Bozorgzadeh, Computer Science Department, UCI
References and Copyright

Textbooks referred to:
[Mic94] G. De Micheli, "Synthesis and Optimization of Digital Circuits," McGraw-Hill, 1994.
[CLR90] T. H. Cormen, C. E. Leiserson, R. L. Rivest, "Introduction to Algorithms," MIT Press, 1990.
[Sar96] M. Sarrafzadeh, C. K. Wong, "An Introduction to VLSI Physical Design," McGraw-Hill, 1996.
[She99] N. Sherwani, "Algorithms for VLSI Physical Design Automation," Kluwer Academic Publishers, 3rd edition, 1999.

fall 2006 ICS 252 Introduction to Computer Design
References and Copyright (cont.)

Slide references:
[©Sherwani] © Naveed A. Sherwani (companion slides to [She99])
[©Keutzer] © Kurt Keutzer, Dept. of EECS, UC Berkeley
[©Gupta] © Rajesh Gupta, UC San Diego
Reading Assignment

Textbook: Chapter 2, Sections 2.1 to 2.5
Lecture notes: Lecture note 2
Combinatorial Optimization

Problems with discrete variables. Examples:
SORTING: given N integers, write them in increasing order.
KNAPSACK: given a bag of size S and items {(s1, w1), (s2, w2), ..., (sn, wn)}, where si and wi are the size and value of item i respectively, find a subset of items with maximum overall value that fits in the bag.
A problem vs. a problem instance
Problem complexity: measures of complexity; complexity cases (average case, worst case)
Tractability: solvable in polynomial time [©Gupta]
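To make the KNAPSACK definition concrete, here is a minimal brute-force sketch over a hypothetical instance (the item values are illustrative, not from the slides). Trying every subset is exponential in n, which is exactly why the tractability question on this slide matters:

```python
from itertools import combinations

def knapsack_brute_force(capacity, items):
    """Try every subset of (size, value) items; keep the best that fits."""
    best_value, best_subset = 0, ()
    for r in range(len(items) + 1):
        for subset in combinations(items, r):
            size = sum(s for s, _ in subset)
            value = sum(w for _, w in subset)
            if size <= capacity and value > best_value:
                best_value, best_subset = value, subset
    return best_value, best_subset

# Hypothetical instance: bag of size 10, items as (size, value) pairs.
items = [(5, 10), (4, 40), (6, 30), (3, 50)]
print(knapsack_brute_force(10, items))  # → (90, ((4, 40), (3, 50)))
```

With n items this examines 2^n subsets, so it is only usable for tiny instances.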
Algorithm

An algorithm defines a procedure for solving a computational problem.
Examples: quick sort, bubble sort, insertion sort, heap sort; the dynamic programming method for the knapsack problem.
Definition of complexity:
Run time on deterministic, sequential machines
Based on the resources needed to implement the algorithm; needs a cost model: memory, hardware/gates, communication bandwidth, etc.
Example: RAM model with a single processor; running time = number of operations [©Gupta]
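The dynamic programming method mentioned above can be sketched as follows: a minimal 0/1 knapsack table that runs in O(n·S) time rather than the exponential time of exhaustive search (variable names are illustrative, not from the slides):

```python
def knapsack_dp(capacity, items):
    """0/1 knapsack by dynamic programming.

    best[c] = max value achievable with bag size c using the items seen so far.
    """
    best = [0] * (capacity + 1)
    for size, value in items:
        # Scan capacities downward so each item is used at most once.
        for c in range(capacity, size - 1, -1):
            best[c] = max(best[c], best[c - size] + value)
    return best[capacity]

print(knapsack_dp(10, [(5, 10), (4, 40), (6, 30), (3, 50)]))  # → 90
```

Note that O(n·S) is polynomial in the numeric value S, not in the bit length of the input, so this does not contradict knapsack being a hard problem.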
Algorithm (cont.)

Definition of complexity (cont.)
Example: bubble sort. Scalability with respect to input size is important: how does the running time of an algorithm change when the input size doubles?
Running time is a function of input size n. Examples: n^2 + 3n, 2^n, n log n, ...
Generally, large input sizes are of interest (n > 1,000 or even n > 1,000,000).
What if I use a better compiler? What if I run the algorithm on a machine that is 10x faster?

for (j = 1; j < N; j++) {
    for (i = 0; i < N - j; i++) {
        if (a[i] > a[i+1]) {    /* swap out-of-order neighbors */
            hold = a[i];
            a[i] = a[i+1];
            a[i+1] = hold;
        }
    }
}

How many steps does a single-CPU computer take to execute the bubble sort shown above? How do we know whether the if condition is true? How many times is the if body executed? How long does checking the condition take?
A more "advanced" version of bubble sort uses a flag to check whether any swaps were done in an iteration. Would that change the average running time? The worst case? [©Gupta]
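The "advanced" variant with a swap flag can be sketched like this (a Python rendering for brevity; the early exit makes the best case O(n) on already-sorted input, while the worst case stays O(n^2)):

```python
def bubble_sort(a):
    """Bubble sort with an early-exit flag: stop when a pass makes no swaps."""
    n = len(a)
    for j in range(1, n):
        swapped = False
        for i in range(n - j):
            if a[i] > a[i + 1]:
                a[i], a[i + 1] = a[i + 1], a[i]
                swapped = True
        if not swapped:   # a full pass with no swaps: the list is sorted
            break
    return a

print(bubble_sort([4, 2, 7, 1, 3]))  # → [1, 2, 3, 4, 7]
```

The flag does not change the worst case (a reverse-sorted input still needs all passes), but it lowers the best case and can improve the average on nearly-sorted data.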
Function Growth Examples

The same functions plotted over different input ranges. It takes some time, but eventually the exponential function dwarfs all the other functions, even if it has a very small coefficient compared to them.
Asymptotic Notions

Idea: a notion that ignores the "constants" and describes the "trend" of a function for large values of the input.
Definition (big-Oh notation): f(n) = O(g(n)) if constants K and n0 can be found such that for all n ≥ n0, f(n) ≤ K·g(n).
g is called an "upper bound" for f (f is "of order" g: f will not grow larger than g by more than a constant factor).
Examples: 1/3 n^2 = O(n^2) (also O(n^3)); n√n = O(n^2)
The big-Oh notation is the mathematical formulation of "throwing away constants; g bounds f for values of n larger than n0". Basically, it captures the eventual trend of functions (e.g., those shown on the previous slide).
Exercises: prove the examples; prove n^2 = O(1/3 n^2); prove that any polynomial function is O(2^n).
Asymptotic Notions (cont.)

Definition (cont.)
Big-Omega notation: f(n) = Ω(g(n)) if constants K and n0 can be found such that for all n ≥ n0, f(n) ≥ K·g(n). g is called a "lower bound" for f.
Big-Theta notation: f(n) = Θ(g(n)) if g is both an upper and a lower bound for f. It describes the growth of a function more accurately than O or Ω.
Examples: n^3 + 4n ≠ Θ(n^2); 4n^2 + n = Θ(n^2)
O and Ω are not necessarily tight bounds, but Θ is: it "squeezes" a function both from the top and from the bottom.
Exercise: if f(n) = O(g(n)), is g(n) = Ω(f(n))? Prove or disprove.
Asymptotic Notions (cont.)

How to find the order of a function? Not always easy, especially if you start from an algorithm. Focus on the "dominant" term:
4n^3 + n^2 + log n = O(n^3)
n + n log(n) = O(n log(n))
A useful ordering (for a constant K > 1): n! > K^n > n^K > log n > log log n; e.g., K^n > log n, n log n > n, n! > n^10.
What do asymptotic notations mean in practice?
If algorithm A has "time complexity" O(n^2) and algorithm B has time complexity O(n log n), then algorithm B is better.
If problem P has a lower bound of Ω(n log n), then there is NO WAY you can find an algorithm that solves the problem in O(n) time.
In the asymptotic ordering of functions above, where does n! go? It grows faster than K^n for any constant K; one can show n! = O(n^n).
Problem Tractability

Problems are classified into "easier" and "harder" categories:
Class P: a polynomial-time algorithm is known for the problem (hence, it is a tractable problem).
Class NP (non-deterministic polynomial time): a polynomial-time solution has not been found yet (and probably does not exist); an exact (optimal) solution can be found using an algorithm with exponential time complexity.
Unfortunately, most CAD problems are NP-hard: be happy with a "reasonably good" solution. [©Gupta]
Algorithm Types

Classified based on quality of solution and computational effort:
Deterministic
Probabilistic or randomized
Approximation
Heuristic: local search
Problem complexity vs. algorithm complexity [©Gupta]
Deterministic Algorithm Types

Algorithms usually used for problems in P:
Exhaustive search (exponential)
Dynamic programming
Divide & conquer (hierarchical)
Greedy
Mathematical programming
Branch and bound
Algorithms usually used for NP problems (not seeking an "optimal" solution, but a "good" one):
Greedy (heuristic)
Genetic algorithms
Simulated annealing
Restricting the problem to a special case that is in P
Graph Definition

Graph: a set of "objects" and their "connections".
Formal definition: G = (V, E), V = {v1, v2, ..., vn}, E = {e1, e2, ..., em}; V is the set of vertices (nodes), E the set of edges (links, arcs).
Directed graph: ek = (vi, vj) is an ordered pair.
Undirected graph: ek = {vi, vj} is an unordered pair.
Weighted graph: w: E → R, where w(ek) is the "weight" of ek.
(Figure: an example graph on vertices a through e, drawn both undirected and directed; note the colored edges in the directed version.)
Graph Representation: Adjacency List

(Figure: for each vertex a through e, a linked list of its neighbors; one set of lists for the undirected example graph, and one for the directed version, where each list stores only outgoing edges.)
Graph Representation: Adjacency Matrix

(Figure: a 5x5 0/1 matrix indexed by vertices a through e; entry (i, j) is 1 iff there is an edge from i to j. The matrix is symmetric for the undirected example graph, and generally asymmetric for the directed one.)
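The two representations can be sketched in code. Since the slides' example figures are not reproduced here, the edge set below is a hypothetical 5-vertex undirected graph:

```python
# Hypothetical undirected graph on vertices a..e (illustrative edge set).
vertices = ["a", "b", "c", "d", "e"]
edges = [("a", "b"), ("b", "d"), ("c", "d"), ("d", "e"), ("a", "e")]

# Adjacency list: vertex -> list of neighbors.
adj_list = {v: [] for v in vertices}
for u, v in edges:
    adj_list[u].append(v)
    adj_list[v].append(u)   # omit this line for a directed graph

# Adjacency matrix: adj_matrix[i][j] == 1 iff {vertices[i], vertices[j]} is an edge.
index = {v: i for i, v in enumerate(vertices)}
n = len(vertices)
adj_matrix = [[0] * n for _ in range(n)]
for u, v in edges:
    adj_matrix[index[u]][index[v]] = 1
    adj_matrix[index[v]][index[u]] = 1   # symmetric for undirected graphs

print(adj_list["d"])  # → ['b', 'c', 'e']
```

The list uses O(V + E) space and is better for sparse graphs; the matrix uses O(V^2) space but answers "is (u, v) an edge?" in O(1).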
Edge / Vertex Weights in Graphs

Edge weights usually represent the "cost" of an edge. Examples: the distance between two cities; the width of a data bus.
Representation: in an adjacency matrix, store the weight instead of 0/1; in an adjacency list, keep the weight in the linked-list item.
Node weights are usually used to enforce some "capacity" constraint. Examples: the size of gates in a circuit; the delay of operations in a "data dependency graph".
(Figure: an example edge-weighted graph with weights such as -2, 1, 1.5, 2, 3, and a small data dependency graph of +, -, * operations with node weights.)
Hypergraphs

Hypergraph definition: similar to graphs, but an edge connects not a pair of vertices but a set of vertices.
Directed and undirected versions are possible.
Just like in graphs, a node can be the source of (or be connected to) multiple hyperedges.
Data structure?
(Figure: an example hypergraph on vertices 1 through 5, and its representation with hyperedges drawn as connections among vertex sets.)
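One possible answer to the "Data structure?" question is an incidence structure: store each hyperedge as a set of vertices, plus the reverse map from vertices to hyperedges. This is a sketch of one common choice, not necessarily the slide's intended answer:

```python
# Hypothetical hypergraph on vertices 1..5: each hyperedge is a vertex set.
hyperedges = {
    "e1": {1, 2, 3},
    "e2": {2, 4},
    "e3": {3, 4, 5},
}

# Reverse map: vertex -> the hyperedges it belongs to
# (a vertex may be connected to multiple hyperedges, as the slide notes).
incident = {}
for name, members in hyperedges.items():
    for v in members:
        incident.setdefault(v, set()).add(name)

print(incident[3])  # hyperedges containing vertex 3
```

Keeping both maps makes "which vertices does this hyperedge span?" and "which hyperedges touch this vertex?" each an O(1) lookup, at the cost of storing every incidence twice.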
Graph Search Algorithms

Purpose: to visit all the nodes.
Algorithms: depth-first search, breadth-first search, topological search.
(Figure: three traversal examples on a 7-node graph with nodes labeled A through G in visit order. In the middle figure, in what order should the edges be visited for the traversal to be breadth-first?) [©Sherwani]
Depth-First Search Algorithm

struct vertex {
    ...
    int mark;
};

dfs ( v )
    v.mark ← 1            // visited
    print v
    for each (v, u) ∈ E
        if (u.mark != 1)  // not visited yet?
            dfs (u)

Algorithm DEPTH_FIRST_SEARCH ( V, E )
    for each v ∈ V
        v.mark ← 0        // not visited yet
    for each v ∈ V
        if (v.mark == 0)
            dfs (v)
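The pseudocode above translates almost line for line into runnable Python (recursive, with a visited set standing in for the mark field; the adjacency-list dict is a hypothetical example):

```python
def dfs(adj, v, visited, order):
    """Visit v, then recursively explore each unvisited successor."""
    visited.add(v)            # v.mark <- 1
    order.append(v)           # "print v"
    for u in adj[v]:
        if u not in visited:  # not visited yet?
            dfs(adj, u, visited, order)

def depth_first_search(adj):
    """Run dfs from every still-unvisited vertex, as in DEPTH_FIRST_SEARCH."""
    visited, order = set(), []
    for v in adj:
        if v not in visited:
            dfs(adj, v, visited, order)
    return order

adj = {"A": ["B", "C"], "B": ["D"], "C": [], "D": [], "E": ["A"]}
print(depth_first_search(adj))  # → ['A', 'B', 'D', 'C', 'E']
```

The outer loop matters: it guarantees every node is visited even when the graph is disconnected (here E is unreachable from A).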
Breadth-First Search Algorithm

bfs ( v, Q )
    v.mark ← 1
    for each (v, u) ∈ E
        if (u.mark != 1)   // not visited yet?
            Q ← Q + u

Algorithm BREADTH_FIRST_SEARCH ( V, E )
    for each v ∈ V
        v.mark ← 0
    Q ← {v0}               // an arbitrary start node
    while Q != {}
        v ← Q.pop()
        if (v.mark != 1)
            print v
            bfs (v, Q)     // mark v and enqueue its successors
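As runnable Python, iteratively and with collections.deque as the FIFO queue. Note one deliberate change from the pseudocode: nodes are marked at enqueue time rather than checked at pop time, which keeps any node from entering the queue more than once:

```python
from collections import deque

def breadth_first_search(adj, start):
    """Visit nodes level by level from start; returns the visit order."""
    marked = {start}
    order = []
    q = deque([start])
    while q:
        v = q.popleft()
        order.append(v)        # "print v"
        for u in adj[v]:
            if u not in marked:
                marked.add(u)  # mark at enqueue time: no duplicate entries
                q.append(u)
    return order

adj = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
print(breadth_first_search(adj, "A"))  # → ['A', 'B', 'C', 'D']
```

In the pseudocode's pop-time-check version, D would be enqueued twice (once via B and once via C) and skipped on the second pop; both variants visit the same nodes.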
Distance in (Non-Weighted) Graphs

Distance dG(u,v): the length of a shortest u--v path in G.
(Figure: an example graph with two marked vertex pairs, showing d(x,y) = d(u,v) = 2.)
Moore's Breadth-First Search Algorithm

Objective: find d(u,v) for a given pair (u,v), and a shortest u--v path.
How does it work? Do a BFS, and assign λ(w) the first time you visit a node; λ(w) is the depth of w in the BFS.
Data structure:
Q: a queue containing the vertices to be visited
λ(w): length of the shortest u--w path found so far (initially ∞)
π(w): parent node of w on the u--w path
Moore's Breadth-First Search Algorithm

1. Initialize: λ(w) ← ∞ for w ≠ u; λ(u) ← 0; Q ← Q + u
2. If Q ≠ ∅, x ← pop(Q); else stop: "no path u--v"
3. For each (x,y) ∈ E with λ(y) = ∞: π(y) ← x; λ(y) ← λ(x) + 1; Q ← Q + y
4. If λ(v) = ∞, return to step 2
5. Follow the π(w) pointers from v to u.
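The steps above can be sketched as runnable Python, with λ kept as a dict of distances (a missing key plays the role of ∞) and π as a parent dict; the example graph is hypothetical:

```python
from collections import deque

def moore_bfs(adj, u, v):
    """BFS from u; returns (d(u, v), a shortest u--v path), or (None, None)."""
    dist = {u: 0}            # lambda(w): depth at first visit
    parent = {u: None}       # pi(w): parent on a shortest u--w path
    q = deque([u])
    while q:
        x = q.popleft()
        if x == v:
            break
        for y in adj[x]:
            if y not in dist:        # lambda(y) == infinity
                dist[y] = dist[x] + 1
                parent[y] = x
                q.append(y)
    if v not in dist:
        return None, None            # "no path u--v"
    path = []
    w = v
    while w is not None:             # follow pi pointers from v back to u
        path.append(w)
        w = parent[w]
    return dist[v], path[::-1]

adj = {"u": ["a", "b"], "a": ["c"], "b": ["c"], "c": ["v"], "v": []}
print(moore_bfs(adj, "u", "v"))  # → (3, ['u', 'a', 'c', 'v'])
```

Because λ(y) is set exactly once, at the node's first (shallowest) visit, the recorded depth is the true distance and the π chain is a shortest path.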
Moore's Breadth-First Search Algorithm

(Example trace on a graph with nodes u, v1, ..., v10. Initially Q = {u}, λ(u) = 0, and all other λ values are ∞.)
Moore's Breadth-First Search Algorithm

(After processing u: Q = {v1, v2, v5}; λ(v1) = λ(v2) = λ(v5) = 1, each with π = u.)
Moore's Breadth-First Search Algorithm

(After processing the depth-1 nodes: Q = {v4, v3, v6, v10}, each with λ = 2; each π entry records the depth-1 node through which it was first reached.)
In the transition from the previous slide to this one, which of v1 and v5 was visited first? Prove your answer (there are two methods for proving this). Can we say anything about v2?
Moore's Breadth-First Search Algorithm

(Next step: Q = {v7}; the remaining λ and π entries are filled in as the depth-2 nodes are processed.)
Notes on Moore's Algorithm

Why does the problem of the BREADTH_FIRST_SEARCH algorithm (a node can be enqueued more than once) not exist here?
Time complexity? Space complexity?
Distance in Weighted Graphs

Why doesn't a simple BFS work any more? Your locally best solution is not your globally best one.
Example: edges u--v1 with weight 3, u--v2 with weight 11, and v1--v2 with weight 2. First, v1 and v2 are visited: λ(v1) = 3, π(v1) = u and λ(v2) = 11, π(v2) = u. But v2 should be revisited: the path through v1 gives λ(v2) = 5, π(v2) = v1.
Dijkstra's Algorithm

Objective: find d(u,v) for all v (fixed source u) and the corresponding shortest paths u--v.
How does it work? Start from the source and grow the set of nodes whose shortest path has been found; decrease λ(w) from ∞ down to d(u,w) as shorter distances to w are found, updating π(w) accordingly.
Data structure:
S: the set of nodes whose d(u,v) has been found
λ(w): current length of the shortest u--w path
π(w): current parent node of w on the u--w path
Dijkstra's Algorithm (cont.)

Algorithm:
1. Initialize: λ(v) ← ∞ for v ≠ u; λ(u) ← 0; S ← {u}
2. For each v ∈ S' such that (ui, v) ∈ E, where ui is the node most recently added to S: if λ(v) > λ(ui) + w(ui, v), then λ(v) ← λ(ui) + w(ui, v) and π(v) ← ui
3. Find m = min{λ(v) | v ∈ S'} and the vj with λ(vj) = m; S ← S ∪ {vj}
4. If |S| < |V|, go to step 2.
With a simple data structure, the time complexity is O(V^2). How did we get this time complexity? With a binary heap (priority queue), it is O((V+E) log V). How is that calculated?
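A compact runnable version of the algorithm above, using Python's heapq module as the binary heap; this is the O((V+E) log V) variant, with stale heap entries skipped at pop time instead of being deleted:

```python
import heapq

def dijkstra(adj, u):
    """Single-source shortest paths from u.

    adj: vertex -> list of (neighbor, edge_weight); returns (dist, parent).
    """
    dist = {u: 0}          # lambda(w)
    parent = {u: None}     # pi(w)
    done = set()           # S: nodes whose distance is final
    heap = [(0, u)]
    while heap:
        d, x = heapq.heappop(heap)
        if x in done:      # stale entry: a shorter path was already found
            continue
        done.add(x)
        for y, w in adj[x]:
            if y not in dist or dist[y] > d + w:
                dist[y] = d + w        # relax the edge (x, y)
                parent[y] = x
                heapq.heappush(heap, (dist[y], y))
    return dist, parent

# The weighted example from the previous slide: the weight-11 edge loses
# to the 3 + 2 path through v1.
adj = {"u": [("v1", 3), ("v2", 11)], "v1": [("v2", 2)], "v2": []}
dist, parent = dijkstra(adj, "u")
print(dist["v2"], parent["v2"])  # → 5 v1
```

Each edge is relaxed once and each push/pop costs O(log V), which is where the O((V+E) log V) bound comes from.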
Dijkstra's Algorithm - why does it work?

Proof by contradiction. Suppose v is the first node added to S such that λ(v) > d(u0,v), where d(u0,v) is the "real" shortest u0--v distance. Since λ(v) and d(u0,v) differ, there are different paths with lengths λ(v) and d(u0,v).
Consider the path of length d(u0,v), and let x be the first node on it that is in S'. Then d(u0,v) < λ(v) and d(u0,v) ≥ λ(x) + a (where a ≥ 0 is the length of the rest of the path), so λ(x) < λ(v): a contradiction, since x would have been chosen before v.
If we assume λ(v) > d(u0,v) for a node that is just being added to S, then there MUST be a node x on its shortest path that is in S'. Why?
This proof implicitly assumes that the λ values of nodes added to S increase monotonically, which is easy to prove; that assumption is used in d(u0,v) ≥ λ(x).
Minimum Spanning Tree (MST)

Tree (usually undirected): a connected graph with no cycles; |E| = |V| - 1.
Spanning tree: a connected subgraph that covers all vertices. If the original graph is not a tree, it has several spanning trees.
Minimum spanning tree: the spanning tree with the minimum sum of edge weights (among all spanning trees).
Example: build a railway system to connect N cities with the smallest total length of railroad.
How do you prove |E| = |V| - 1 in a tree? In a tree, which node is the root?
Difference Between MST and Shortest Path

Why can't Dijkstra solve the MST problem?
Shortest path: minimize the sum of edge weights to each individual node.
MST: minimize the TOTAL edge weight that connects all vertices.
Proposal: pick any vertex, run Dijkstra, and keep the union of the paths to all nodes (prove that no cycles are created).
Debunk: show a counterexample, e.g., a triangle on a, b, c with edge weights 5, 4, 2. Write it on your slides!
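The counterexample can be checked by brute force. The slide does not say which edge gets which weight, so the labeling below (ab = 5, ac = 4, bc = 2) is an assumption:

```python
from itertools import combinations

# Hypothetical labeling of the slide's triangle: the weights 5, 4, 2 are
# from the slide, but their assignment to edges is an assumption.
edges = {("a", "b"): 5, ("a", "c"): 4, ("b", "c"): 2}

# Shortest-path tree from a (easy by hand on a triangle):
# d(b) = min(5, 4 + 2) = 5 via edge ab; d(c) = 4 via edge ac.
spt = {("a", "b"), ("a", "c")}
spt_weight = sum(edges[e] for e in spt)   # 5 + 4 = 9

# Every spanning tree of a triangle is just a pair of its edges.
mst = min(combinations(edges, 2), key=lambda t: sum(edges[e] for e in t))
mst_weight = sum(edges[e] for e in mst)   # 4 + 2 = 6

print(spt_weight, mst_weight)  # → 9 6
```

The shortest-path tree weighs 9 while the MST weighs 6, so Dijkstra's tree is not a minimum spanning tree.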
Minimum Spanning Tree Algorithms

Basic idea: start from a vertex (or an edge) and expand the tree, avoiding loops (i.e., add only "safe" edges); pick the minimum-weight edge at each step.
Known algorithms:
Prim: start from a vertex and expand the connected tree.
Kruskal: start with the minimum-weight edge, and keep adding minimum-weight edges while avoiding cycles (build a forest of small trees and merge them).
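Kruskal's variant can be sketched with a union-find structure to detect cycles; Prim is spelled out on the next slides, so this sketch is supplementary (the example graph is hypothetical):

```python
def kruskal(vertices, edges):
    """edges: list of (weight, u, v). Returns (mst_edges, total_weight)."""
    parent = {v: v for v in vertices}

    def find(v):                      # union-find with path compression
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    mst, total = [], 0
    for w, u, v in sorted(edges):     # consider edges by increasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                  # endpoints in different trees: safe edge
            parent[ru] = rv           # merge the two trees
            mst.append((u, v))
            total += w
    return mst, total

verts = ["a", "b", "c", "d"]
es = [(5, "a", "b"), (4, "a", "c"), (2, "b", "c"), (3, "c", "d")]
print(kruskal(verts, es))  # → ([('b', 'c'), ('c', 'd'), ('a', 'c')], 9)
```

Sorting dominates, so the running time is O(E log E) = O(E log V).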
Prim's Algorithm for MST

Data structure:
S: set of nodes added to the tree so far
S': set of nodes not added to the tree yet
T: the edges of the MST built so far
λ(w): current length of the shortest edge (v, w) that connects w to the current tree
π(w): potential parent node of w in the final MST (the current parent that connects w to the current tree)
Prim's Algorithm

1. Initialize S, S', and T: S ← {u0} (u0 is any vertex), S' ← V \ {u0}, T ← { }; for all v ∈ S', λ(v) ← ∞
2. Initialize λ and π for the vertices adjacent to u0: for each v ∈ S' with (u0, v) ∈ E, λ(v) ← w((u0, v)) and π(v) ← u0
3. While S' ≠ ∅:
   Find u ∈ S' such that λ(u) ≤ λ(v) for all v ∈ S'
   S ← S ∪ {u}, S' ← S' \ {u}, T ← T ∪ { (π(u), u) }
   For each v with (u, v) ∈ E: if λ(v) > w((u,v)), then λ(v) ← w((u,v)) and π(v) ← u
With a binary heap this runs in O(E log V + V log V) = O(E log V).
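A runnable sketch of Prim's algorithm, with heapq standing in for the λ priority queue (lazy deletion of stale entries, as in the Dijkstra sketch; the example graph is hypothetical):

```python
import heapq

def prim(adj, u0):
    """adj: vertex -> list of (neighbor, weight). Returns (tree_edges, total)."""
    in_tree = {u0}                    # S
    tree, total = [], 0               # T and its total weight
    heap = [(w, u0, v) for v, w in adj[u0]]   # candidate edges leaving S
    heapq.heapify(heap)
    while heap and len(in_tree) < len(adj):
        w, p, v = heapq.heappop(heap) # lightest edge leaving the tree
        if v in in_tree:
            continue                  # stale entry: v was added meanwhile
        in_tree.add(v)
        tree.append((p, v))           # the edge (pi(v), v)
        total += w
        for x, wx in adj[v]:
            if x not in in_tree:
                heapq.heappush(heap, (wx, v, x))
    return tree, total

adj = {
    "a": [("b", 5), ("c", 4)],
    "b": [("a", 5), ("c", 2)],
    "c": [("a", 4), ("b", 2), ("d", 3)],
    "d": [("c", 3)],
}
print(prim(adj, "a"))  # → ([('a', 'c'), ('c', 'b'), ('c', 'd')], 9)
```

Each edge enters the heap at most twice (once per endpoint), giving the O(E log V) bound stated above.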
Other Graph Algorithms of Interest...

Min-cut partitioning
Graph coloring
Maximum clique, maximum independent set
Steiner tree
Matching