1
COMP550: Algorithms & Analysis
UNC Chapel Hill
Mon/Wed/Fri 9:05am - 9:55am (FB 009)
http://www.cs.unc.edu/~zsguo/comp550.html
Zhishan Guo, SN 126, zsguo@cs.unc.edu
Office Hours: after class on Mon. & Wed., or by appointment
2
Solving a Computational Problem
Problem definition & specification – specify input, output, and constraints
Algorithm design & analysis – devise a correct & efficient algorithm (our focus)
Implementation planning
Coding, testing, and verification
3
Primary Focus
Develop thinking ability
– formal thinking (proof techniques & analysis)
– problem-solving skills (algorithm design and application)
About coding/programming…
4
Goals
Be very familiar with a collection of core algorithms.
Be fluent in algorithm design paradigms: divide & conquer, greedy algorithms, randomization, dynamic programming, approximation methods.
Be able to analyze the correctness and runtime performance of a given algorithm.
Be familiar with the inherent complexity (lower bounds & intractability) of some problems.
Be intimately familiar with basic data structures.
Be able to apply these techniques to practical problems.
5
What Will We Be Doing
Examine interesting problems
Devise algorithms for solving them
Prove their correctness
Analyze their runtime performance
Study data structures & core algorithms
Learn problem-solving techniques
Applications to real-world problems
The inverted-classroom style will be used only sparingly.
6
Congressional Apportionment
Article I, Section 2 of the United States Constitution requires that Representatives and direct Taxes shall be apportioned among the several States which may be included within this Union, according to their respective Numbers... The Number of Representatives shall not exceed one for every 30,000, but each State shall have at Least one Representative...
7
The Huntington-Hill Method
Currently, n = 50 states and R = 435 representatives.
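The method's formula did not survive the export, so here is a hedged sketch of my own (not the slide's code): Huntington-Hill gives every state one seat and then awards each remaining seat to the state with the highest priority value pop / sqrt(k(k+1)), where k is the number of seats that state already holds. The populations below are hypothetical, purely for illustration.

    import heapq
    import math

    def huntington_hill(populations, R):
        """Apportion R seats among states using Huntington-Hill priority values."""
        # Every state starts with one seat, as the Constitution requires.
        seats = {state: 1 for state in populations}
        # Max-heap (negated priorities) of pop / sqrt(k*(k+1)) for each state's next seat.
        heap = [(-pop / math.sqrt(1 * 2), state) for state, pop in populations.items()]
        heapq.heapify(heap)
        for _ in range(R - len(populations)):
            _, state = heapq.heappop(heap)   # state with the highest current priority
            seats[state] += 1
            k = seats[state]
            heapq.heappush(heap, (-populations[state] / math.sqrt(k * (k + 1)), state))
        return seats

    # Hypothetical populations, just to exercise the function: prints {'A': 6, 'B': 3, 'C': 1}.
    print(huntington_hill({"A": 6_000_000, "B": 3_000_000, "C": 1_000_000}, R=10))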
8
Pseudo-code
Well-written pseudocode reveals the internal structure of the algorithm but hides irrelevant implementation details, making the algorithm much easier to understand, analyze, debug, and implement. The precise syntax of pseudocode is a personal choice, but the overriding goal should be clarity and precision. Ideally, pseudocode should allow any competent programmer to implement the underlying algorithm, quickly and correctly, in their favorite programming language, without understanding why the algorithm works.
9
Algorithm Analysis
Asymptotic Notation
Recurrence Relations
Proof Techniques
Inherent Complexity
10
Algorithmic Design Paradigms
Divide and Conquer
Greedy Algorithms
Graph Algorithms
Dynamic Programming
Approximation Methods (if time allows)
11
Textbook & References
Introduction to Algorithms, 3rd ed., by Cormen, Leiserson, Rivest, & Stein (CLRS), McGraw-Hill, 2009.
Lecture slides will be put online
– Thanks to Profs. Mark Foskey, Ming Lin, Dinesh Manocha, Ketan Mayer-Patel, David Plaisted, Jack Snoeyink, Dr. Umamaheswari Devi, and Prof. Jeff Erickson (UIUC).
Other references:
Algorithmics: The Spirit of Computing, Harel.
How to Solve It, Polya.
The Design and Analysis of Computer Algorithms, Aho, Hopcroft, and Ullman.
Algorithms, Sedgewick.
Algorithmics: Theory & Practice, Brassard & Bratley.
Writing Efficient Programs & Programming Pearls, Bentley.
The Science of Programming, Gries.
The Craft of Programming, Reynolds.
12
Prerequisites
Data Structures + Discrete Mathematics
Assume that you know, or can recall with a quick review, the material in the following chapters.
– Chapters 0, 1, and 2
– Section 3.2: growth of functions
– Chapter 10: elementary data structures
13
Course Roadmap
Algorithmic Basics (4)
Divide and Conquer (5-6)
Randomized Algorithms (9-11)
Search Trees (5)
Graph Algorithms (5)
Dynamic Programming (3)
Greedy Algorithms (6)
Special Topics* (1-2)
M13 + W15 + F14 = 42 lectures
14
Algorithmic Basics (4)
Introduction to algorithms, complexity, and proof of correctness. (Chapters 1 & 2)
Asymptotic Notation. (Chapter 3.1)
Goals
– Know how to write formal problem specifications.
– Know about computational models.
– Know how to measure the efficiency of an algorithm.
– Know the difference between upper and lower bounds and what they convey.
– Be able to prove algorithms correct and establish computational complexity.
15
Divide-and-Conquer (5-6)
Designing Algorithms. (Chapter 2.3)
Recurrences. (Chapter 4)
Quicksort. (Chapter 7)
Goals
– Know when the divide-and-conquer paradigm is an appropriate one.
– Know the general structure of such algorithms.
– Express their complexity using recurrence relations.
– Determine the complexity using techniques for solving recurrences.
– Memorize the common-case solutions for recurrence relations.
16
Randomized Algorithms (9-11)
Probability & Combinatorics. (Chapter 5)
Randomized Quicksort. (Chapter 7)
Linear Time Sorting*. (Chapter 8)
Selection. (Chapter 9)
Hash Tables. (Chapter 11)
Goals
– Be thorough with basic probability theory and counting theory.
– Be able to apply the theory of probability to the following:
  design and analysis of randomized algorithms and data structures;
  average-case analysis of deterministic algorithms.
– Understand the difference between average-case and worst-case runtime, esp. in sorting and hashing.
17
Search Trees (5)
Binary Search Trees – not balanced (Chapter 12)
Red-Black Trees – balanced (Chapter 13)
Goals
– Know the characteristics of the trees.
– Know the capabilities and limitations of simple binary search trees.
– Know why balancing heights is important.
– Know the fundamental ideas behind maintaining balance during insertions and deletions.
– Be able to apply these ideas to other balanced tree data structures.
18
Graph Algorithms (5)
Basic Graph Algorithms (Chapter 22)
Breadth-First Search
Depth-First Search & Applications
Goals
– Know how to represent graphs (adjacency matrix and edge-list representations).
– Know the basic techniques for graph searching.
– Be able to devise other algorithms based on graph-searching algorithms.
– Be able to “cut-and-paste” proof techniques as seen in the basic algorithms.
19
Dynamic Programming (3)
Dynamic Programming (Chapter 15)
Goals
– Know when to apply dynamic programming and how it differs from divide and conquer.
– Be able to systematically move from one to the other.
20
Greedy Algorithms (6)
Greedy Algorithms (Chapter 16)
Minimum Spanning Trees (Chapter 23)
Shortest Paths (Chapter 24)
Goals
– Know when to apply greedy algorithms and their characteristics.
– Be able to prove the correctness of a greedy algorithm in solving an optimization problem.
– Understand where minimum spanning trees and shortest path computations arise in practice.
21
Linear Programming & NP-Completeness* (1-2)
Simplex Algorithm
Duality
NP-Hardness
Approximation
Goal: gain some knowledge of the basic concepts and algorithms.
22
Course Work & Grades
Homework: 25% (total of 6-9, mostly design & analysis)
Quizzes: 5% (very basic material)
Mid-Term Exams: 40% (total of 3, 50 min each, in class)
Final Exam: 30%
Active Class Participation: up to half a grade bonus
Programming Projects: up to 10% bonus
23
Examinations
Quizzes: throughout the semester
Midterms: 3, in class
Final: 8am - 11am on May 4th, 2015
“Half” closed book; NO collaboration
24
Homework Assignments
Due at the beginning of class on the due date given
No late homework will be accepted
Lowest score will be dropped
You may discuss problems in groups, but you must write up and formulate solutions alone (failure to explain your solution orally to the instructor is treated as cheating)
Be neat, clear, precise, and formal
– You will be graded on correctness, simplicity, elegance, & clarity
25
Course Project
Due at 23:59 on Apr 27 (last day of class)
No late project report or code will be accepted
You are responsible for defining/proposing the course project. You are encouraged to discuss it with me and/or submit a proposal earlier than Mar 30.
It can be either an implementation of core algorithms we cover (you are responsible for showing the correctness of your code) or an algorithmic study of an open problem.
Group work is allowed, though the contribution of each member must be clarified in the final report.
26
Communication
Visit the instructor during office hours, by appointment, or via email
All lecture notes and most handouts are posted on the course website: http://www.cs.unc.edu/~zsguo/comp550.html
Major announcements are sent via the email alias and posted on the course website
Discussions – face-to-face in groups, or on Piazza
Student grades can be checked with the instructor
27
Basic Courtesy
Write/print assignments neatly & formally
Please do not read newspapers or other materials, or browse the web, in class
When coming to class late or leaving early, please take an aisle seat quietly
Remain quiet, except when asking questions or answering questions posed by the instructor – no whispers or private conversations
THANK YOU!!!
28
How to Succeed in this Course
Start early on all assignments. DON'T procrastinate.
Complete all reading before class.
Participate in class. Think in class.
Review after each class.
Be formal and precise on all problem sets and exams.
29
Weekly Reading Assignment
Chapters 0, 1, 2, 3 and Appendix A (Textbook: CLRS)
30
Algorithms
A tool for solving a well-specified computational problem
Example: sorting
– input: a sequence of numbers
– output: an ordered permutation of the input
– issues: correctness, efficiency, storage, etc.
Input → Algorithm → Output
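As a concrete illustration of such a problem specification (my sketch, not part of the slide), the sorting problem can be stated as a checkable contract: the output must be ordered and must be a permutation of the input.

    from collections import Counter

    def is_valid_sort(input_seq, output_seq):
        """Check the sorting specification: output is an ordered permutation of input."""
        ordered = all(output_seq[i] <= output_seq[i + 1] for i in range(len(output_seq) - 1))
        permutation = Counter(input_seq) == Counter(output_seq)
        return ordered and permutation

    print(is_valid_sort([5, 2, 4, 6, 1, 3], [1, 2, 3, 4, 5, 6]))  # True
    print(is_valid_sort([5, 2, 4], [2, 4]))                        # False: not a permutation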
31
Example: Insertion Sort

    for j = 2 to length(A) do
        key = A[j]
        i = j - 1
        while i > 0 and A[i] > key do
            A[i+1] = A[i]
            i = i - 1
        A[i+1] = key
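A direct Python transcription of the pseudocode above (0-indexed rather than 1-indexed; a sketch for experimentation, not part of the slides):

    def insertion_sort(A):
        """In-place insertion sort, following the pseudocode above (0-indexed)."""
        for j in range(1, len(A)):          # pseudocode: j = 2 .. length(A)
            key = A[j]
            i = j - 1
            # Shift elements of the sorted prefix A[0..j-1] that are larger than key.
            while i >= 0 and A[i] > key:
                A[i + 1] = A[i]
                i -= 1
            A[i + 1] = key
        return A

    print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]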
32
Correctness Proofs
Proving (beyond “any” doubt) that an algorithm is correct.
– Prove that the algorithm produces correct output when it terminates: Partial Correctness.
– Prove that the algorithm will necessarily terminate: Total Correctness.
Techniques
– Proof by Construction.
– Proof by Induction.
– Proof by Contradiction.
33
Loop Invariant
A logical expression with the following properties.
– It holds true before the first iteration of the loop: Initialization.
– If it is true before an iteration of the loop, it remains true before the next iteration: Maintenance.
– When the loop terminates, the invariant, along with the fact that the loop terminated, gives a useful property that helps show that the loop is correct: Termination.
Similar to mathematical induction.
34
Example: Insertion Sort

Invariant: at the start of each iteration of the for loop, A[1…j-1] consists of the elements originally in A[1…j-1], but in sorted order; all other elements are unchanged.

    for j = 2 to length(A) do
        key = A[j]
        i = j - 1
        while i > 0 and A[i] > key do
            A[i+1] = A[i]
            i = i - 1
        A[i+1] = key
35
Example: Insertion Sort

(invariant and pseudocode as above)

Initialization: when j = 2, the invariant trivially holds because A[1] is a one-element, and hence sorted, array. √
36
Example: Insertion Sort

(invariant and pseudocode as above)

Maintenance: the inner while loop finds the position i with A[i] <= key and shifts A[j-1], A[j-2], …, A[i+1] right by one position. Then key, formerly A[j], is placed in position i+1, so that A[i] ≤ A[i+1] ≤ A[i+2].
A[1…j-1] sorted + A[j] ⇒ A[1…j] sorted
37
Example: Insertion Sort

(invariant and pseudocode as above)

Termination: the for loop terminates when j = n+1. The invariant then states: “A[1…n] consists of the elements originally in A[1…n], but in sorted order.” √
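The three properties can also be spot-checked mechanically on concrete inputs. The sketch below is my addition (not from the slides): it asserts the invariant at the start of every iteration of the outer loop and once more after termination.

    def insertion_sort_checked(A):
        original = list(A)
        for j in range(1, len(A)):
            # Initialization/Maintenance: at the start of each iteration, A[0..j-1] is the
            # sorted version of the original first j elements, and A[j..] is untouched.
            assert A[:j] == sorted(original[:j]) and A[j:] == original[j:]
            key = A[j]
            i = j - 1
            while i >= 0 and A[i] > key:
                A[i + 1] = A[i]
                i -= 1
            A[i + 1] = key
        # Termination: the invariant with j = n says the whole array is sorted.
        assert A == sorted(original)
        return A

    print(insertion_sort_checked([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]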
38
Running Time
Depends on the input (e.g., already sorted vs. reverse sorted)
Depends on the input size (5 elements vs. 500,000)
– Parameterize the running time by the input size n
We generally want upper bounds
– A guarantee to the user
(insertion sort pseudocode as above)
39
Analysis
Worst-Case (usually)
– T(n) = max time on any input of size n
Average-Case (sometimes)
– T(n) = expected time over all inputs of size n
– (Need assumption of…)
Best-Case
40
Analyzing Algorithms
Assumptions
– a generic one-processor, random-access machine
– measure running time (other measures: memory, communication, etc.)
Worst-Case Running Time: the longest time on any input of size n
– an upper bound on the running time for any input
– in some cases, such as searching, the worst case occurs fairly often
Average-Case Behavior: the expected performance averaged over all possible inputs
– it is generally better than worst-case behavior
– sometimes it is roughly as bad as the worst case
41
A Simple Example – Linear Search

INPUT: a sequence of n numbers, and a key to search for.
OUTPUT: true if key occurs in the sequence, false otherwise.

    LinearSearch(A, key)
    1   i = 1
    2   while i ≤ n and A[i] != key
    3       do i++
    4   if i ≤ n
    5       then return true
    6       else return false
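A Python version of the pseudocode (0-indexed; my sketch, not part of the slides):

    def linear_search(A, key):
        """Return True if key occurs in A, scanning left to right (0-indexed)."""
        i = 0
        while i < len(A) and A[i] != key:
            i += 1
        return i < len(A)

    print(linear_search([3, 1, 4, 1, 5], 4))  # True
    print(linear_search([3, 1, 4, 1, 5], 9))  # False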
42
A Simple Example – Linear Search

INPUT: a sequence of n numbers, and a key to search for.
OUTPUT: true if key occurs in the sequence, false otherwise.

    LinearSearch(A, key)                     cost    times
    1   i = 1                                c1      1
    2   while i ≤ n and A[i] != key          c2      x
    3       do i++                           c3      x-1
    4   if i ≤ n                             c4      1
    5       then return true                 c5      1
    6       else return false                c6      1

x ranges between 1 and n+1. So the running time ranges between
c1 + c2 + c4 + c5 (best case) and
c1 + c2(n+1) + c3·n + c4 + c6 (worst case).
43
A Simple Example – Linear Search

Now assign a cost of 1 to every statement execution (same pseudocode and counts as above). The running time then ranges between
1 + 1 + 1 + 1 = 4 (best case) and
1 + (n+1) + n + 1 + 1 = 2n + 4 (worst case).
44
A Simple Example – Linear Search

If we assume that we search for a random item in the list, then on average statements 2 and 3 will each be executed about n/2 times. The running times of the other statements are independent of the input. Hence, the average-case complexity is
1 + n/2 + n/2 + 1 + 1 = n + 3.
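These counts are easy to check empirically. The sketch below (my addition, not from the slides) counts executions of the line-2 loop test; under the stated assumption that the key is equally likely to be any of the n items, the average comes out near (n + 1) / 2, matching the n/2 estimate above.

    import random

    def loop_test_count(A, key):
        """Count how many times the while-loop test (line 2 of LinearSearch) executes."""
        tests = 0
        i = 0
        while True:
            tests += 1                       # line 2's test runs once more
            if i < len(A) and A[i] != key:
                i += 1                       # line 3
            else:
                break
        return tests

    n = 1000
    A = list(range(1, n + 1))
    print(loop_test_count(A, 1))             # best case: 1 test
    print(loop_test_count(A, 0))             # worst case (key absent): n + 1 tests
    # Key chosen uniformly from the list: average is about (n + 1) / 2.
    samples = [loop_test_count(A, random.choice(A)) for _ in range(10_000)]
    print(sum(samples) / len(samples))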
45
Correctness Proof of Linear Search

Use a loop invariant for the while loop (pseudocode as above).
If the algorithm terminates, then it produces the correct result:
Initialization. Maintenance. Termination.
Then argue that it terminates.
46
Correctness Proof of Linear Search

Loop invariant for the while loop:
– At the start of each iteration of the while loop, the search key is not in the subarray A[1…i-1].
(pseudocode as above)
If the algorithm terminates, then it produces the correct result:
Initialization. Maintenance. Termination.
Then argue that it terminates.
47
Worst-Case Time & Order of Growth
The exact time depends on the computer (software and hardware).
BIG IDEA: asymptotic analysis
– Ignore machine-dependent constants
– Look at the growth of T(n) as n → +∞
– Θ-notation: we can ignore the lower-order terms, since they are relatively insignificant for very large n. We can also ignore the leading term’s constant coefficient, since it is not as important for the rate of growth in computational efficiency for very large n.
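A quick numeric check of this idea (my example, not from the slide): for T(n) = 3n² + 10n + 5, the ratio T(n)/n² approaches the leading constant 3, so for large n the growth is governed by the n² term alone.

    def T(n):
        return 3 * n**2 + 10 * n + 5

    for n in (10, 1_000, 1_000_000):
        print(n, T(n) / n**2)   # 4.05, then 3.010005, then ~3.00001: the ratio tends to 3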
48
Comparisons of Algorithms
Sorting
– insertion sort: Θ(n²)
– merge sort: Θ(n log n)
For 10^6 numbers, insertion sort takes 5.56 hours on a supercomputer programmed in machine language, while merge sort takes 16.67 minutes on a PC programmed in C/C++.
Why does order of growth matter, when computer speeds double every two years?
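The quoted times are consistent with the classic CLRS-style assumptions (my assumption here, not stated on the slide): insertion sort costing about 2n² instructions on a supercomputer executing 10^8 instructions per second, versus merge sort costing about 50·n·lg n instructions on a PC executing 10^6 instructions per second. A sketch of the arithmetic:

    import math

    n = 10**6
    insertion_time = 2 * n**2 / 1e8           # 2n^2 instructions at 10^8 instructions/sec
    merge_time = 50 * n * math.log2(n) / 1e6  # 50 n lg n instructions at 10^6 instructions/sec
    print(insertion_time / 3600, "hours")     # about 5.56 hours
    print(merge_time / 60, "minutes")         # about 16.6 minutes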
49
Effect of Faster Machines
The number of items that can be sorted in one second using an algorithm taking exactly n² time, compared with one taking n lg n time, assuming 1 million and 2 million operations per second. Notice that for the n lg n algorithm, doubling the speed almost doubles the number of items that can be sorted. (Order of growth matters!)

    Ops/sec          1M        2M         Gain
    n² algorithm     1,000     1,414      1.4
    n lg n algorithm 62,700    118,600    1.9
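The table entries can be reproduced by solving n² ≤ budget and n·lg n ≤ budget for the largest n (a sketch under the slide's assumption of exactly n² and n lg n operations; the n lg n figures come out near 62,700 and 118,600 up to rounding):

    import math

    def max_items(cost, budget):
        """Largest n with cost(n) <= budget, found by doubling then binary search."""
        lo, hi = 1, 2
        while cost(hi) <= budget:
            hi *= 2
        while lo < hi - 1:
            mid = (lo + hi) // 2
            if cost(mid) <= budget:
                lo = mid
            else:
                hi = mid
        return lo

    for budget in (1_000_000, 2_000_000):      # 1M and 2M operations per second
        n_sq = max_items(lambda n: n * n, budget)
        n_lg = max_items(lambda n: n * math.log2(n), budget)
        print(budget, n_sq, n_lg)
    # n² algorithm: 1000 -> 1414 (gain ~1.4); n lg n algorithm: ~62,700 -> ~118,600 (gain ~1.9)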