UMass Lowell Computer Science 91.404: Analysis of Algorithms
Prof. Karen Daniels, Spring 2001
Final Review, Mon. 5/14 - Wed. 5/16
Overview of Next 2 Lectures
- Review of some key course material
- Final Exam:
  - Course grade
  - Logistics, coverage, format
  - Handout as the basis for 40% of the test
- Course evaluations
Review of Key Course Material
What's It All About?
- Algorithm: steps for the computer to follow to solve a problem
- Problem-solving goals:
  - recognize the structure of some common problems
  - understand important characteristics of algorithms that solve common problems
  - select appropriate algorithms & data structures to solve a problem
  - tailor existing algorithms
  - create new algorithms
Some Algorithm Application Areas
- Computer Graphics, Geographic Information Systems, Robotics, Bioinformatics, Astrophysics, Medical Imaging, Telecommunications
- Course themes: Design, Apply, Analyze
Tools of the Trade
- Algorithm design patterns such as:
  - binary search
  - divide-and-conquer
- Data structures such as:
  - trees, linked lists, hash tables, graphs
- Theoretical computer science principles such as:
  - NP-completeness, hardness
- Math: growth of functions, summations, recurrences, sets, probability, proofs
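As a refresher on the first two design patterns listed above, here is a minimal binary search. This sketch is not from the course materials; Python is used purely for illustration of the divide-and-conquer idea.

```python
def binary_search(a, target):
    """Return an index of target in sorted list a, or -1 if absent.

    Divide-and-conquer: each comparison halves the remaining search
    range, so the running time is O(lg n).
    """
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if a[mid] == target:
            return mid
        if a[mid] < target:
            lo = mid + 1   # discard the left half
        else:
            hi = mid - 1   # discard the right half
    return -1

print(binary_search([1, 3, 5, 7, 9], 7))  # 3
print(binary_search([1, 3, 5, 7, 9], 4))  # -1
```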
Discrete Math Review
Chapters 1-6: Growth of Functions, Summations, Recurrences, Sets, Counting, Probability
Topics
- Discrete Math Review: Chapters 1-6
  - Solving summations & recurrences
  - Sets; basic tree & graph concepts
  - Counting: permutations/combinations
  - Probability: basics, including expectation of a random variable
  - Proof techniques: induction
- Basic Algorithm Analysis Techniques: Chapters 1-6
  - Asymptotic growth of functions
  - Types of input: best/average/worst
  - Bounds on an algorithm vs. bounds on a problem
  - Algorithmic paradigms/design patterns: divide-and-conquer
  - Analyze pseudocode running time to form summations &/or recurrences
What are we measuring?
Some analysis criteria:
- Scope: the problem itself? a particular algorithm that solves the problem?
- "Dimension": time complexity? space complexity?
- Type of bound: upper? lower? both?
- Type of input: best-case? average-case? worst-case?
- Type of implementation: choice of data structure
Function Order of Growth
- O( ): upper bound
- Ω( ): lower bound
- Θ( ): upper & lower (tight) bound
- Know how to use asymptotic complexity notation to describe time or space complexity.
- Know how to order functions asymptotically (behavior as n becomes large), e.g.:
  1 < lg lg(n) < lg(n) < n < n lg(n) < n lg^2(n) < n^2 < n^5 < 2^n
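The asymptotic ordering above can be sanity-checked numerically. This small sketch (not part of the original slides) evaluates each function at one large n and sorts by value; note that agreement at a single n is only a heuristic check, not a proof of asymptotic order.

```python
import math

n = 2 ** 16  # one "large" n for illustration
growth = {
    "1": 1,
    "lg lg n": math.log2(math.log2(n)),
    "lg n": math.log2(n),
    "n": n,
    "n lg n": n * math.log2(n),
    "n lg^2 n": n * math.log2(n) ** 2,
    "n^2": n ** 2,
    "n^5": n ** 5,
    "2^n": 2 ** n,   # exact big integer; Python handles the comparison
}
order = sorted(growth, key=growth.get)  # ascending by value at this n
print(" < ".join(order))
```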
Types of Algorithmic Input
- Best-Case Input: of all possible algorithm inputs of size n, it generates the "best" result.
  - For time complexity, "best" is the smallest running time; for space complexity, "best" is the smallest storage.
  - A best-case input produces the best-case running time, which provides a lower bound on the algorithm's asymptotic running time (subject to any implementation assumptions).
- Average-Case Input and Worst-Case Input are defined similarly.
- Best-Case Time <= Average-Case Time <= Worst-Case Time
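Insertion sort (from Chapters 1-6) is the standard illustration of how input type changes running time. This sketch, added for illustration and not taken from the slides, counts key comparisons on sorted (best-case) versus reverse-sorted (worst-case) input.

```python
def insertion_sort_comparisons(a):
    """Insertion-sort a copy of a; return the number of key comparisons.

    Sorted input: one comparison per element, Theta(n).
    Reverse-sorted input: i comparisons at step i, Theta(n^2).
    """
    a = list(a)
    comparisons = 0
    for i in range(1, len(a)):
        key, j = a[i], i - 1
        while j >= 0:
            comparisons += 1          # compare key with a[j]
            if a[j] <= key:
                break                 # found insertion point
            a[j + 1] = a[j]           # shift larger element right
            j -= 1
        a[j + 1] = key
    return comparisons

n = 8
print(insertion_sort_comparisons(range(n)))         # 7: best case
print(insertion_sort_comparisons(range(n, 0, -1)))  # 28 = 1+2+...+7: worst case
```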
Bounding Algorithmic Time (using cases)
- T(n) = Ω(1) and T(n) = O(2^n) are very loose bounds, and very loose bounds are not very useful!
- A worst-case time of T(n) = O(2^n) tells us that worst-case inputs cause the algorithm to take at most exponential time (i.e., exponential time is sufficient). But can the algorithm ever really take exponential time (i.e., is exponential time necessary)?
- If, for arbitrary n, we find a worst-case input that forces the algorithm to use exponential time, then this tightens the lower bound on the worst-case running time.
- If we can force the lower and upper bounds on the worst-case time to match, then we can say that, for the worst-case running time, T(n) = Θ(2^n) (i.e., we've found the minimum upper bound, so the bound is tight).
- Using "cases" we can discuss lower and/or upper bounds on the best-case, average-case, or worst-case running time.
Bounding Algorithmic Time (tightening bounds)
For example, denoting best-case time by T_B(n) and worst-case time by T_W(n):
- 1st attempt: T_B(n) = Ω(1), T_W(n) = O(2^n)
- 2nd attempt (tighter algorithm bounds): T_B(n) = Ω(n), T_W(n) = O(n^2)
Approach
- Explore the problem to gain intuition:
  - Describe it: what are the assumptions? (model of computation, etc.)
  - Has it already been solved? Have similar problems been solved? (more on this later)
  - What does best-case input look like? What does worst-case input look like?
- Establish a worst-case upper bound on the problem using an algorithm:
  - Design a (simple) algorithm and find an upper bound on its worst-case asymptotic running time; this tells us the problem can be solved in a certain amount of time. Algorithms taking more than this amount of time may exist, but won't help us.
- Establish a worst-case lower bound on the problem.
- Tighten each bound to form a worst-case "sandwich".
Know the Difference! (worst-case bounds on a problem)
- Weak bound: a worst-case upper bound on the problem comes from considering just one algorithm. Other, less efficient algorithms that solve this problem might exist, but we don't care about them! An inefficient algorithm might exist that takes this much time, but it would not help us.
- Strong bound: a worst-case lower bound on the problem holds for every algorithm that solves the problem and abides by our problem's assumptions. For example: no algorithm for the problem exists that can solve it for worst-case inputs in less than linear time.
- Both the upper and lower bounds are probably loose (i.e., they can probably be tightened later on).
Master Theorem
Let T(n) = a T(n/b) + f(n) with a >= 1 and b > 1. Then:
- Case 1: If f(n) = O(n^(log_b a - ε)) for some ε > 0, then T(n) = Θ(n^(log_b a)).
- Case 2: If f(n) = Θ(n^(log_b a)), then T(n) = Θ(n^(log_b a) · lg n).
- Case 3: If f(n) = Ω(n^(log_b a + ε)) for some ε > 0, and if a·f(n/b) <= c·f(n) for some constant c < 1 and all sufficiently large n, then T(n) = Θ(f(n)).
Use the ratio f(n) / n^(log_b a) to distinguish between cases: look for "polynomially larger" dominance.
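For recurrences whose driving function is a simple polynomial, f(n) = Θ(n^k), the case test above is mechanical. This illustrative helper (not from the slides; the Case 3 regularity condition holds automatically for f(n) = n^k) classifies such recurrences:

```python
import math

def master_case(a, b, k):
    """Classify T(n) = a*T(n/b) + Theta(n^k) by the Master Theorem.

    Returns (case number, asymptotic bound as a string). Only handles
    polynomial driving functions f(n) = Theta(n^k), for which the
    Case 3 regularity condition a*f(n/b) <= c*f(n) always holds.
    """
    crit = math.log(a, b)             # critical exponent log_b(a)
    if math.isclose(k, crit):
        return 2, f"Theta(n^{crit:g} lg n)"
    if k < crit:
        return 1, f"Theta(n^{crit:g})"
    return 3, f"Theta(n^{k:g})"

# MergeSort: T(n) = 2T(n/2) + Theta(n)
print(master_case(2, 2, 1))   # (2, 'Theta(n^1 lg n)')
# T(n) = 9T(n/3) + Theta(n)
print(master_case(9, 3, 1))   # (1, 'Theta(n^2)')
# T(n) = 3T(n/4) + Theta(n^2)
print(master_case(3, 4, 2))   # (3, 'Theta(n^2)')
```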
Sorting
Chapters 7-10: HeapSort, QuickSort, Linear-Time Sorting, Medians
Topics
- Sorting: Chapters 7-10
- Sorting algorithms:
  - [InsertionSort & MergeSort from Chapters 1-6], HeapSort, QuickSort, Linear-Time Sorting, Medians
- Comparison-based sorting and its lower bound
- Breaking the lower bound using special assumptions
- Tradeoffs: selecting an appropriate sort for a given situation
  - Time vs. space requirements
  - Comparison-based vs. non-comparison-based
Comparison-Based Sorting
- In the algebraic decision-tree model, comparison-based sorting of n items requires Ω(n lg n) time.
- To break the lower bound and obtain linear time, forego direct value comparisons and/or make stronger assumptions about the input.
Time bounds (Best / Average / Worst):
- InsertionSort: Θ(n) / Θ(n^2) / Θ(n^2)
- MergeSort: Θ(n lg n) / Θ(n lg n) / Θ(n lg n)
- QuickSort: Θ(n lg n) / Θ(n lg n) / Θ(n^2)
- HeapSort: Θ(n lg n) / Θ(n lg n) / Θ(n lg n)
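Counting sort is the classic example of breaking the Ω(n lg n) bound by making a stronger assumption about the input. This sketch is illustrative, not from the course materials:

```python
def counting_sort(a, k):
    """Sort a list of integers drawn from range [0, k) in Theta(n + k) time.

    Beats the comparison-sort lower bound by assuming keys are small
    non-negative integers; it never compares two elements directly.
    """
    count = [0] * k
    for x in a:                 # tally occurrences of each key
        count[x] += 1
    out = []
    for key in range(k):        # emit each key in increasing order
        out.extend([key] * count[key])
    return out

print(counting_sort([4, 1, 3, 4, 3], 5))  # [1, 3, 3, 4, 4]
```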
Data Structures
Chapters 11-14: Stacks, Queues, LinkedLists, Trees, HashTables, Binary Search Trees, Balanced Trees
Topics
- Data Structures: Chapters 11-14
- Abstract data types and their properties/invariants:
  - Stacks, Queues, LinkedLists, (Heaps from Chapter 7), Trees, HashTables, Binary Search Trees, Balanced (Red/Black) Trees
- Implementation/representation choices determine the data structure
- Dynamic-set operations:
  - Query (does not change the data structure): Search, Minimum, Maximum, Predecessor, Successor
  - Manipulate (can change the data structure): Insert, Delete
- Running time & space requirements of the dynamic-set operations for each data structure
- Tradeoffs: selecting an appropriate data structure for a situation
  - Time vs. space requirements
  - Representation choices
  - Which operations are crucial?
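As one concrete instance of the dynamic-set operations above, here is a minimal (unbalanced) binary search tree supporting Insert and Search. This is an illustrative sketch, not course code; both operations run in O(h) time for tree height h, which is why the balanced (Red/Black) variants matter.

```python
class Node:
    def __init__(self, key):
        self.key, self.left, self.right = key, None, None

def insert(root, key):
    """Insert key into a BST; return the (possibly new) root."""
    if root is None:
        return Node(key)
    if key < root.key:
        root.left = insert(root.left, key)
    else:
        root.right = insert(root.right, key)
    return root

def search(root, key):
    """Return True iff key is present; walks one root-to-leaf path."""
    while root is not None:
        if key == root.key:
            return True
        root = root.left if key < root.key else root.right
    return False

root = None
for k in [8, 3, 10, 1, 6]:
    root = insert(root, k)
print(search(root, 6), search(root, 7))  # True False
```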
Advanced Techniques
Chapters 16-17: Dynamic Programming, Greedy Algorithms
Topics
- Advanced Techniques: Chapters 16-17
- Algorithmic paradigms/design patterns:
  - Divide-and-Conquer
  - Dynamic Programming
  - Greedy Algorithms
  - Brute-Force/Naive
- Using dynamic programming &/or greedy algorithms to solve optimization problems:
  - Optimal substructure
  - Greedy-choice property: locally optimal choices lead to a globally optimal solution
- Tradeoffs:
  - Selecting an appropriate paradigm to solve a problem
  - Tackling a problem using a sequence of paradigms: brute-force (high running time) first, then improve...
Problem Characteristics (by paradigm)
- Divide-and-Conquer: modular; independent pieces (non-overlapping subproblems).
- Dynamic Programming: modular optimization; optimal substructure (an optimal solution contains optimal solutions to subproblems); overlapping subproblems.
- Greedy Algorithms: modular optimization; optimal substructure; greedy-choice property (locally optimal choices lead to a global optimum).
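The greedy-choice property is exactly what separates the last two rows. A standard illustration (not from the slides) is coin change with denominations {1, 3, 4}: the greedy choice fails, while dynamic programming over the overlapping subproblems finds the optimum.

```python
def greedy_coins(amount, coins):
    """Repeatedly take the largest coin that fits (a greedy choice)."""
    count = 0
    for c in sorted(coins, reverse=True):
        count += amount // c
        amount %= c
    return count if amount == 0 else None

def dp_coins(amount, coins):
    """Minimum number of coins, via DP over subproblems 0..amount."""
    best = [0] + [None] * amount          # best[a] = min coins summing to a
    for a in range(1, amount + 1):
        options = [best[a - c] for c in coins
                   if c <= a and best[a - c] is not None]
        best[a] = min(options) + 1 if options else None
    return best[amount]

coins = [1, 3, 4]
print(greedy_coins(6, coins))  # 3: takes 4+1+1; greedy-choice property fails
print(dp_coins(6, coins))      # 2: finds 3+3
```

Both share optimal substructure; only the DP correctly combines subproblem optima here.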
Graph Algorithms
Chapters 23-25: DFS/BFS Traversals, Topological Sort, MinimumSpanningTrees, Shortest Paths
Topics
- Graph Algorithms: Chapters 23-25
- Undirected and directed graphs
- Connected components of an undirected graph
- Representations: adjacency matrix, adjacency list
- Traversals: DFS and BFS
  - Differences in approach: DFS uses LIFO/stack vs. BFS uses FIFO/queue
  - Forest of spanning trees
  - Vertex coloring; edge classification: tree, back, forward, cross
  - Shortest paths (BFS)
- Topological sort
- Weighted graphs
  - MinimumSpanningTrees: 2 different approaches
  - Shortest paths, single source: Dijkstra's algorithm
- Tradeoffs:
  - Representation choice: adjacency matrix vs. adjacency list
  - Traversal choice: DFS or BFS
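As a reminder of the single-source shortest-paths entry above, here is a compact Dijkstra sketch on an adjacency-list representation. It is illustrative only (the graph, names, and lazy-deletion style are assumptions, not course code), and it requires non-negative edge weights.

```python
import heapq

def dijkstra(adj, source):
    """Single-source shortest-path distances for non-negative weights.

    adj: dict mapping vertex -> list of (neighbor, weight) pairs.
    Uses a binary heap with lazy deletion of stale queue entries.
    """
    dist = {source: 0}
    pq = [(0, source)]
    while pq:
        d, u = heapq.heappop(pq)
        if d > dist.get(u, float("inf")):
            continue                      # stale entry; a shorter path won
        for v, w in adj.get(u, []):
            if d + w < dist.get(v, float("inf")):
                dist[v] = d + w           # relax edge (u, v)
                heapq.heappush(pq, (dist[v], v))
    return dist

adj = {"a": [("b", 4), ("c", 1)], "c": [("b", 2)], "b": []}
print(dijkstra(adj, "a"))  # a->c->b costs 3, beating the direct edge of 4
```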
Traversals: DFS, BFS
- DFS: backtracks; visits the most recently discovered vertex first; LIFO structure (stack).
- BFS: vertices close to v are visited before those further away; FIFO structure (queue).
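The stack-vs-queue contrast above can be made concrete by implementing both traversals identically except for the container. An illustrative sketch (not from the slides):

```python
from collections import deque

def bfs(adj, start):
    """Breadth-first order: a FIFO queue visits close vertices first."""
    order, seen, q = [], {start}, deque([start])
    while q:
        u = q.popleft()                   # FIFO: oldest discovery next
        order.append(u)
        for v in adj.get(u, []):
            if v not in seen:
                seen.add(v)
                q.append(v)
    return order

def dfs(adj, start):
    """Depth-first order: a LIFO stack follows one path, then backtracks."""
    order, seen, stack = [], set(), [start]
    while stack:
        u = stack.pop()                   # LIFO: newest discovery next
        if u in seen:
            continue
        seen.add(u)
        order.append(u)
        for v in reversed(adj.get(u, [])):  # so the first neighbor pops first
            stack.append(v)
    return order

adj = {1: [2, 3], 2: [4], 3: [4], 4: []}
print(bfs(adj, 1))  # [1, 2, 3, 4]
print(dfs(adj, 1))  # [1, 2, 4, 3]
```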
FINAL EXAM
Logistics, Coverage, Format
Handout as the basis for 40% of the test
Course Grading
- Homework: 40%
- Exam 1: 15% (closed book)
- Midterm: 20% (open book)
- Final Exam: 25% (open book)
Results are scaled if necessary. Check your grade status with us before the final!
Final Exam: Logistics
- Friday, 12/18, Olsen 311, 8:00-11:00 a.m. (note the change from the registrar's room number)
- Open book, open notes; closed computers, closed neighbors
- Cumulative
- Worth 25% of the grade
Text/Chapter/Topic Coverage
- Discrete Math Review & Basic Algorithm Analysis Techniques: Chapters 1-6
  - Summations, Recurrences, Sets, Trees, Graphs, Counting, Probability, Growth of Functions, Divide-and-Conquer
- Sorting: Chapters 7-10
  - HeapSort, QuickSort, Linear-Time Sorting, Medians
- Data Structures: Chapters 11-14
  - Stacks, Queues, LinkedLists, Trees, HashTables, Binary Search Trees, Balanced (Red/Black) Trees
- Advanced Techniques: Chapters 16-17
  - Dynamic Programming, Greedy Algorithms
- Graph Algorithms: Chapters 23-25
  - Traversals, MinimumSpanningTrees, Shortest Paths
Format
Mixture of questions of the following types (the two groups are weighted 60% and 40% of the exam):
1) Multiple Choice
2) True/False
3) Short Answer
4) Analyze Pseudo-Code and/or Data Structure
5) Solve a Problem by Designing an Algorithm
  - Select an appropriate paradigm/design pattern
  - Select appropriate data structures
  - Write pseudo-code
  - Justify correctness
  - Analyze asymptotic complexity