
1 UMass Lowell Computer Science 91.404 Analysis of Algorithms Prof. Karen Daniels Fall, 2009 Final Review

2 Review of Key Course Material

3 What's It All About?
- Algorithm: steps for the computer to follow to solve a problem
- Problem Solving Goals:
  - recognize structure of some common problems
  - understand important characteristics of algorithms to solve common problems
  - select appropriate algorithm & data structures to solve a problem
  - tailor existing algorithms
  - create new algorithms

4 Some Algorithm Application Areas
- Computer Graphics
- Geographic Information Systems
- Robotics
- Bioinformatics
- Astrophysics
- Medical Imaging
- Telecommunications
(Design, Apply, Analyze)

5 Tools of the Trade
- Algorithm Design Patterns such as: binary search, divide-and-conquer, randomized
- Data Structures such as: trees, linked lists, stacks, queues, hash tables, graphs, heaps, arrays
- Math: growth of functions, summations, recurrences, sets, probability, proofs

6 Discrete Math Review Growth of Functions, Summations, Recurrences, Sets, Counting, Probability

7 Topics
- Discrete Math Review:
  - Sets, basic tree & graph concepts
  - Counting: permutations/combinations
  - Probability: basics, including expectation of a random variable
  - Proof techniques: induction
- Basic Algorithm Analysis Techniques:
  - Asymptotic growth of functions
  - Types of input: best/average/worst
  - Bounds on algorithm vs. bounds on problem
- Algorithmic Paradigms/Design Patterns: divide-and-conquer, randomized
- Analyze pseudocode running time to form summations and/or recurrences

8 What are we measuring?
- Some Analysis Criteria:
  - Scope: the problem itself? a particular algorithm that solves the problem?
  - "Dimension": time complexity? space complexity?
  - Type of bound: upper? lower? both?
  - Type of input: best-case? average-case? worst-case?
  - Type of implementation: choice of data structure

9 Function Order of Growth
- O( ) upper bound, Ω( ) lower bound, Θ( ) upper & lower bound (shorthand for inequalities)
- Know how to use asymptotic complexity notation to describe time or space complexity
- Know how to order functions asymptotically (behavior as n becomes large), e.g.:
  1 < lg lg(n) < lg(n) < n < n lg(n) < n lg²(n) < n² < n⁵ < 2ⁿ

10 Types of Algorithmic Input
- Best-Case Input: of all possible algorithm inputs of size n, it generates the "best" result
  - For time complexity: "best" is smallest running time
  - For space complexity: "best" is smallest storage
  - Best-case input produces best-case running time, which provides a lower bound on the algorithm's asymptotic running time (subject to any implementation assumptions)
- Average-Case Input and Worst-Case Input are defined similarly
- Best-Case Time <= Average-Case Time <= Worst-Case Time

11 Master Theorem
Master Theorem: Let T(n) = a T(n/b) + f(n) with a >= 1 and b > 1. Then:
- Case 1: If f(n) = O(n^(log_b a - ε)) for some ε > 0, then T(n) = Θ(n^(log_b a)).
- Case 2: If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · lg n).
- Case 3: If f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and if the regularity condition on f(n/b) holds (slide 12), then T(n) = Θ(f(n)).
Use the ratio f(n) / n^(log_b a) to distinguish between cases: look for "polynomially larger" dominance.

12 Master Theorem Regularity Condition (for Case 3): a·f(n/b) <= c·f(n) for some constant c < 1 and all sufficiently large n.
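
A minimal Python sketch (not from the slides) that applies the Master Theorem to the common special case f(n) = Θ(n^d); the function name and interface are illustrative only, and the Case 3 regularity condition holds automatically for polynomial f:

import math

def master_theorem_case(a, b, d):
    """Classify T(n) = a*T(n/b) + Theta(n^d), assuming a >= 1, b > 1, d >= 0."""
    crit = math.log(a, b)                     # critical exponent log_b(a)
    if math.isclose(d, crit):
        return f"Case 2: T(n) = Theta(n^{crit:g} * lg n)"
    if d < crit:
        return f"Case 1: T(n) = Theta(n^{crit:g})"
    return f"Case 3: T(n) = Theta(n^{d:g})"

print(master_theorem_case(2, 2, 1))   # MergeSort recurrence: Case 2, Theta(n lg n)
print(master_theorem_case(7, 2, 2))   # Strassen recurrence:  Case 1, Theta(n^2.807)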

13 CS Theory Math Review Sheet (The Most Relevant Parts)
- p. 1: O, Ω, Θ definitions; series; combinations
- p. 2: recurrences & Master Method
- p. 3: probability, factorial, logs, Stirling's approximation
- p. 4: matrices
- p. 5: graph theory
- p. 6: calculus (product, quotient rules; integration, differentiation; logs)
- p. 8: finite calculus
- p. 9: series
Math fact sheet (courtesy of Prof. Costello) is on our web site.

14 Sorting Chapters 6-9 Heapsort, Quicksort, LinearTime-Sorting

15 Topics
- Sorting: Chapters 6-8
- Sorting Algorithms: [InsertionSort & MergeSort], HeapSort, QuickSort, linear-time sorting
- Comparison-based sorting and its lower bound
- Breaking the lower bound using special assumptions
- Tradeoffs: selecting an appropriate sort for a given situation
  - Time vs. space requirements
  - Comparison-based vs. non-comparison-based

16 Heaps & HeapSort
- Structure: nearly complete binary tree with a convenient array representation
- HEAP property (for a MAX-HEAP): a parent's key is not less than the key of either child
- Operations (strategy / worst-case running time, h = height):
  - HEAPIFY: swap down — O(h)
  - INSERT: swap up — O(h)
  - EXTRACT-MAX: swap root with last element, then HEAPIFY — O(h)
  - MAX: view root — O(1)
  - BUILD-HEAP: HEAPIFY bottom-up — O(n)
  - HEAPSORT: BUILD-HEAP, then repeated HEAPIFY — Θ(n lg n)
(Slide example: the max-heap 16, 14, 10, 8, 7, 9, 3, 2, 4, 1 stored in array positions 1-10.)
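
A minimal Python sketch (not from the slides) of the swap-down HEAPIFY, bottom-up BUILD-HEAP, and HEAPSORT operations, using 0-based array indices rather than the 1-based positions in the slide's example:

def max_heapify(a, i, heap_size):
    """Swap down: restore the max-heap property at index i."""
    left, right = 2 * i + 1, 2 * i + 2
    largest = i
    if left < heap_size and a[left] > a[largest]:
        largest = left
    if right < heap_size and a[right] > a[largest]:
        largest = right
    if largest != i:
        a[i], a[largest] = a[largest], a[i]
        max_heapify(a, largest, heap_size)      # moves down one level per call: O(h)

def build_max_heap(a):
    """Bottom-up HEAPIFY of the internal nodes; O(n) overall."""
    for i in range(len(a) // 2 - 1, -1, -1):
        max_heapify(a, i, len(a))

def heapsort(a):
    """BUILD-MAX-HEAP, then repeatedly move the max to the end; O(n lg n)."""
    build_max_heap(a)
    for end in range(len(a) - 1, 0, -1):
        a[0], a[end] = a[end], a[0]             # EXTRACT-MAX into its final slot
        max_heapify(a, 0, end)

Running heapsort([16, 14, 10, 8, 7, 9, 3, 2, 4, 1]) sorts the slide's example array in place.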

17 QuickSort
- Divide-and-conquer strategy:
  - Divide: PARTITION the array around a pivot
  - Conquer: sort each partition recursively
  - Combine: no work needed
- Asymptotic running time:
  - Worst case: Θ(n²) (partitions of size 1 and n-1)
  - Best case: Θ(n lg n) (balanced partitions of size n/2)
  - Average case: Θ(n lg n) (balanced partitions of size n/2)
- Randomized PARTITION selects the partition element randomly, imposing a uniform distribution
- QuickSort does most of its work on the way down (unlike MergeSort, which does most of its work on the way back up, in Merge)
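
A minimal Python sketch (not from the slides) of QuickSort with a randomized, Lomuto-style PARTITION; the function names are illustrative:

import random

def randomized_quicksort(a, lo=0, hi=None):
    """In-place quicksort with a randomly chosen pivot; expected O(n lg n)."""
    if hi is None:
        hi = len(a) - 1
    if lo < hi:
        p = randomized_partition(a, lo, hi)   # Divide
        randomized_quicksort(a, lo, p - 1)    # Conquer left partition
        randomized_quicksort(a, p + 1, hi)    # Conquer right partition; no Combine step

def randomized_partition(a, lo, hi):
    """Pick a random pivot, then partition a[lo..hi] around it."""
    r = random.randint(lo, hi)
    a[r], a[hi] = a[hi], a[r]                 # move the pivot to the end
    pivot = a[hi]
    i = lo - 1
    for j in range(lo, hi):
        if a[j] <= pivot:
            i += 1
            a[i], a[j] = a[j], a[i]
    a[i + 1], a[hi] = a[hi], a[i + 1]         # place the pivot in its final position
    return i + 1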

18 Comparison-Based Sorting
In the algebraic decision-tree model, comparison-based sorting of n items requires Ω(n lg n) worst-case time. To break the lower bound and obtain linear time, forego direct value comparisons and/or make stronger assumptions about the input.

Algorithm      Best Case     Average Case   Worst Case
InsertionSort  Θ(n)          Θ(n²)          Θ(n²)
MergeSort      Θ(n lg n)     Θ(n lg n)      Θ(n lg n)
HeapSort       Θ(n lg n)*    Θ(n lg n)      Θ(n lg n)
QuickSort      Θ(n lg n)     Θ(n lg n)      Θ(n²)
(*when all elements are distinct)

19 Non-Comparison-Based Sorting and Hybrid Sorting
- Ω(n lg n) comparison-based sorting: Insertion-Sort, Merge-Sort, Heap-Sort, Quick-Sort
- Counting-Sort: stable sort; worst-case time in O(n+k), where k = largest input value; if k is in O(n), then time is in O(n); extra storage in O(n+k)
- Radix-Sort: hybrid; uses a stable sort (e.g. Counting-Sort) on each digit; worst-case time in O(d(n+k)), where k = largest digit value and d = number of digits; if k is in O(n) and d is in O(1), then time is in O(n)
- Bucket-Sort: hybrid; uses a sort (e.g. Insertion-Sort) within each bucket; average-case time in O(n), assuming the numbers are uniform in [0,1) and n buckets are used
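
A minimal Python sketch (not from the slides) of the stable Counting-Sort described above, for nonnegative integers in the range 0..k:

def counting_sort(a, k):
    """Stable sort of nonnegative integers in 0..k; O(n + k) time and extra space."""
    count = [0] * (k + 1)
    for x in a:                      # count occurrences of each key
        count[x] += 1
    for i in range(1, k + 1):        # prefix sums: count[i] = # of elements <= i
        count[i] += count[i - 1]
    out = [0] * len(a)
    for x in reversed(a):            # walk backwards so equal keys keep their order (stability)
        count[x] -= 1
        out[count[x]] = x
    return out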

20 Data Structures Chapters 10-13 Stacks, Queues, LinkedLists, Trees, HashTables, Binary Search Trees, Balanced Trees

21 Topics
- Data Structures: Chapters 10-13
- Abstract Data Types: their properties/invariants
- Stacks, Queues, LinkedLists, (Heaps from Chapter 6), Trees, HashTables, Binary Search Trees, Balanced (Red/Black) Trees
- Implementation/representation choices -> data structure
- Dynamic Set Operations:
  - Query [does not change the data structure]: Search, Minimum, Maximum, Predecessor, Successor
  - Manipulate [can change the data structure]: Insert, Delete
- Running time & space requirements of the dynamic set operations for each data structure
- Tradeoffs: selecting an appropriate data structure for a situation
  - Time vs. space requirements
  - Representation choices
  - Which operations are crucial?

22 Hash Table
- Structure:
  - n << N (number of keys in the table is much smaller than the size of the key universe)
  - Table with m slots, m typically prime
- Hash function:
  - Not necessarily a 1-1 mapping
  - Uses mod m to keep the index within the table
- Collision resolution:
  - Chaining: a linked list for each table entry
  - Open addressing: all elements stored in the table itself (linear probing, quadratic probing)
- Load factor: α = n/m
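
A minimal Python sketch (not from the slides) of a hash table that resolves collisions by chaining; the class name and interface are illustrative, and the division-method hash uses mod m as described above:

class ChainedHashTable:
    """Hash table with chaining; m is typically chosen to be prime."""

    def __init__(self, m=13):
        self.m = m
        self.slots = [[] for _ in range(m)]    # one chain (list of key/value pairs) per slot

    def _hash(self, key):
        return hash(key) % self.m              # division method: h(k) = k mod m

    def insert(self, key, value):
        chain = self.slots[self._hash(key)]
        for i, (k, _) in enumerate(chain):
            if k == key:                        # overwrite an existing key
                chain[i] = (key, value)
                return
        chain.append((key, value))

    def search(self, key):
        for k, v in self.slots[self._hash(key)]:
            if k == key:
                return v
        return None                             # expected O(1 + α) with α = n/m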

23 Linked Lists
- Types:
  - Singly vs. doubly linked
  - Pointer to head and/or tail
  - Non-circular vs. circular
- The type influences the running time of the operations
(Slide figures: the list 9 -> 4 -> 3 drawn with a head pointer only, with head and tail pointers, and circularly.)
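
A minimal Python sketch (not from the slides) of a non-circular singly linked list with only a head pointer, the simplest of the variants above:

class Node:
    def __init__(self, key, nxt=None):
        self.key = key
        self.next = nxt

class SinglyLinkedList:
    """Non-circular singly linked list with a head pointer only."""

    def __init__(self):
        self.head = None

    def insert_front(self, key):          # O(1): no tail pointer needed
        self.head = Node(key, self.head)

    def search(self, key):                # O(n): must walk from the head
        node = self.head
        while node is not None and node.key != key:
            node = node.next
        return node

Inserting 3, then 4, then 9 at the front produces the list 9 -> 4 -> 3 from the slide figure.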

24 Binary Tree Traversal
- "Visit" each node once; running time is in Θ(n) for an n-node binary tree
- Preorder (ABDCEF): visit node, visit left subtree, visit right subtree
- Inorder (DBAEFC): visit left subtree, visit node, visit right subtree
- Postorder (DBFECA): visit left subtree, visit right subtree, visit node
(The letter sequences refer to the example tree on the slide, rooted at A.)
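
A minimal Python sketch (not from the slides) of the three traversals, assuming a node object with key, left, and right attributes (illustrative names):

def preorder(node, visit):
    if node is not None:
        visit(node.key)                 # node, then left subtree, then right subtree
        preorder(node.left, visit)
        preorder(node.right, visit)

def inorder(node, visit):
    if node is not None:
        inorder(node.left, visit)       # left subtree, then node, then right subtree
        visit(node.key)
        inorder(node.right, visit)

def postorder(node, visit):
    if node is not None:
        postorder(node.left, visit)     # left subtree, then right subtree, then node
        postorder(node.right, visit)
        visit(node.key)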

25 Binary Search Tree
- Structure: binary tree
- BINARY SEARCH TREE property, for each pair of nodes u, v:
  - If u is in the left subtree of v, then key[u] <= key[v]
  - If u is in the right subtree of v, then key[u] >= key[v]
- Operations (strategy / worst-case running time, h = height):
  - TRAVERSAL (INORDER, PREORDER, POSTORDER): visit all n nodes — Θ(n)
  - SEARCH: traverse one branch using the BST property — O(h)
  - INSERT: search, then attach — O(h)
  - DELETE: splice out (cases depend on number of children) — O(h)
  - MIN: go left — O(h)
  - MAX: go right — O(h)
  - SUCCESSOR: MIN of right subtree if it exists; else go up — O(h)
  - PREDECESSOR: analogous to SUCCESSOR — O(h)
- Navigation rules; left/right rotations that preserve the BST property
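
A minimal Python sketch (not from the slides) of SEARCH, INSERT, and MIN on an unbalanced binary search tree; all three follow a single branch, hence the O(h) bounds above:

class BSTNode:
    def __init__(self, key):
        self.key = key
        self.left = None
        self.right = None

def bst_search(node, key):
    """Follow one branch using the BST property; O(h)."""
    while node is not None and node.key != key:
        node = node.left if key < node.key else node.right
    return node

def bst_insert(root, key):
    """Search for the insertion point, then attach a new leaf; O(h)."""
    new = BSTNode(key)
    parent, node = None, root
    while node is not None:
        parent = node
        node = node.left if key < node.key else node.right
    if parent is None:
        return new                      # tree was empty: the new node is the root
    if key < parent.key:
        parent.left = new
    else:
        parent.right = new
    return root

def bst_min(node):
    """Go left; O(h)."""
    while node.left is not None:
        node = node.left
    return node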

27 Red-Black Tree Properties
- Every node in a red-black tree is either black or red
- Every null leaf is black
- No path from a leaf to the root can have two consecutive red nodes -- i.e., the children of a red node must be black
- Every path from a node x to a descendant leaf contains the same number of black nodes -- the "black height" of node x
(Slide figure: a newly inserted node.)
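
Not from the slides: a small Python checker that verifies the properties above and returns the black-height of a subtree, assuming nodes with color, left, and right attributes (illustrative names):

def check_red_black(node):
    """Return the black-height of the subtree rooted at node, or raise on a violation."""
    if node is None:
        return 0                                   # every null leaf counts as black
    if node.color not in ("red", "black"):
        raise ValueError("every node must be red or black")
    if node.color == "red":
        for child in (node.left, node.right):
            if child is not None and child.color == "red":
                raise ValueError("a red node must have black children")
    left_bh = check_red_black(node.left)
    right_bh = check_red_black(node.right)
    if left_bh != right_bh:
        raise ValueError("black-heights of the two subtrees differ")
    return left_bh + (1 if node.color == "black" else 0)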

28 Graph Algorithms Chapter 22 DFS/BFS Traversals, Topological Sort

29 Topics
- Graph Algorithms: Chapter 22
- Undirected, directed graphs
- Connected components of an undirected graph
- Representations: adjacency matrix, adjacency list
- Traversals: DFS and BFS
  - Differences in approach: DFS: LIFO/stack vs. BFS: FIFO/queue
  - Forest of spanning trees
  - Vertex coloring; edge classification: tree, back, forward, cross
  - Shortest paths (BFS)
- Topological Sort
- Tradeoffs:
  - Representation choice: adjacency matrix vs. adjacency list
  - Traversal choice: DFS or BFS

30 Introductory Graph Concepts: Representations
- Undirected graph vs. directed graph (digraph)
- Each can be represented by an adjacency matrix or an adjacency list
(Slide figures: an example graph on vertices A-F shown both ways, as an adjacency matrix and as an adjacency list, for the undirected and the directed case.)

31 Elementary Graph Algorithms: SEARCHING: DFS, BFS
For an unweighted directed or undirected graph G=(V,E). See the DFS/BFS handout for pseudocode.
- Breadth-First Search (BFS):
  - Vertices close to v are visited before those further away
  - FIFO structure: queue data structure
  - Shortest-path distance from the source to each reachable vertex can be recorded during the traversal; foundation of many "shortest path" algorithms
- Depth-First Search (DFS):
  - Backtracks: visits the most recently discovered vertex
  - LIFO structure: stack data structure
  - Encountering and finishing times form a "well-formed" nested (( )( )) structure
  - DFS of an undirected graph produces only back edges or tree edges
  - A directed graph is acyclic if and only if DFS yields no back edges
- Time: O(|V| + |E|) with an adjacency list; O(|V|²) with an adjacency matrix
- The predecessor subgraph forms a forest of spanning trees
- Vertex color shows status: not yet encountered; encountered, but not yet finished; finished
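
A minimal Python sketch (not from the slides) of BFS on an adjacency-list graph, recording shortest-path distances (in edges) during the traversal as described above:

from collections import deque

def bfs(adj, s):
    """BFS from source s; O(|V| + |E|). adj maps each vertex to its list of neighbors."""
    dist = {s: 0}
    queue = deque([s])                      # FIFO structure
    while queue:
        u = queue.popleft()
        for v in adj[u]:
            if v not in dist:               # v not yet encountered
                dist[v] = dist[u] + 1       # record shortest-path distance
                queue.append(v)
    return dist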

32 Elementary Graph Algorithms: DFS, BFS
- Review problem: TRUE or FALSE? The tree shown on the right of the slide can be a DFS tree for some adjacency-list representation of the graph shown on the left.
(Slide figures: a six-vertex graph on A-F and a candidate DFS tree, with tree, cross, and back edges marked.)

33 Elementary Graph Algorithms: Topological Sort
For a directed acyclic graph (DAG) G=(V,E). Produces a linear ordering of the vertices: for each edge (u,v), u is ordered before v. (source: 91.503 textbook, Cormen et al.; see also the 91.404 DFS/BFS slide show)
TOPOLOGICAL-SORT(G)
1  DFS(G) computes "finishing times" for each vertex
2  as each vertex is finished, insert it onto the front of a list
3  return the list
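
A minimal Python sketch (not from the slides) of the DFS-based TOPOLOGICAL-SORT above, prepending each vertex to the output list as it finishes:

def topological_sort(adj):
    """Topological sort of a DAG given as an adjacency-list dict."""
    order = []                      # built in reverse finishing order
    visited = set()

    def dfs_visit(u):
        visited.add(u)
        for v in adj.get(u, []):
            if v not in visited:
                dfs_visit(v)
        order.insert(0, u)          # insert onto the front of the list when u finishes

    for u in adj:
        if u not in visited:
            dfs_visit(u)
    return order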

34 Minimum Spanning Tree: Greedy Algorithms
For an undirected, connected, weighted graph G=(V,E). Produces a minimum-weight tree of edges that includes every vertex. (source: 91.503 textbook, Cormen et al.)
- Kruskal's algorithm invariant: a minimum-weight spanning forest, which becomes a single tree at the end. Time: O(|E| lg |E|), given fast FIND-SET and UNION.
- Prim's algorithm invariant: a minimum-weight tree, which spans all vertices at the end. Time: O(|E| lg |V|) = O(|E| lg |E|); slightly faster with a fast priority queue.
(Slide figure: an example weighted graph on vertices A-G.)
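
A minimal Python sketch (not from the slides) of Kruskal's algorithm with a simple FIND-SET/UNION structure; the function names are illustrative:

def kruskal_mst(vertices, edges):
    """Kruskal's MST: sort edges, grow a forest with FIND-SET/UNION; O(|E| lg |E|).

    edges is a list of (weight, u, v) tuples; returns the chosen tree edges.
    """
    parent = {v: v for v in vertices}

    def find(x):                           # FIND-SET with path compression
        while parent[x] != x:
            parent[x] = parent[parent[x]]
            x = parent[x]
        return x

    mst = []
    for w, u, v in sorted(edges):          # consider edges in nondecreasing weight
        ru, rv = find(u), find(v)
        if ru != rv:                       # u and v lie in different trees of the forest
            parent[ru] = rv                # UNION the two trees
            mst.append((u, v, w))
    return mst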

35 Graph Algorithms: Shortest Path
Dijkstra's algorithm maintains a set S of vertices whose final shortest-path weights have already been determined. It also maintains, for each vertex v not in S, an upper bound d[v] on the weight of a shortest path from the source s to v. Dijkstra's algorithm solves this problem efficiently for the case in which all weights are nonnegative (as in the example graph on the slide). The algorithm repeatedly selects the vertex u in V - S with minimum bound d[u], inserts u into S, and relaxes all edges leaving u (determining whether passing through u makes it "faster" to get to a vertex adjacent to u).
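
A minimal Python sketch (not from the slides) of Dijkstra's algorithm using a binary heap as the priority queue; adjacency lists map each vertex to (neighbor, weight) pairs:

import heapq

def dijkstra(adj, s):
    """Single-source shortest paths for nonnegative edge weights.

    d[v] is an upper bound that becomes exact once v is taken from the queue.
    """
    d = {s: 0}
    done = set()                               # the set S of finished vertices
    pq = [(0, s)]
    while pq:
        du, u = heapq.heappop(pq)              # vertex in V - S with minimum bound d[u]
        if u in done:
            continue
        done.add(u)                            # insert u into S
        for v, w in adj.get(u, []):
            if v not in done and du + w < d.get(v, float("inf")):
                d[v] = du + w                  # relax edge (u, v)
                heapq.heappush(pq, (d[v], v))
    return d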

