COMP550: Algorithms & Analysis
UNC Chapel Hill
Mon/Wed/Fri 9:05am - 9:55am (FB 009)
Zhishan Guo, SN 126
Office Hours: after class on Mon. & Wed., or by appointment

Solving a Computational Problem
- Problem definition & specification – specify input, output, and constraints
- Algorithm design & analysis – devise a correct & efficient algorithm (our focus)
- Implementation planning
- Coding, testing, and verification

Primary Focus
Develop thinking ability:
- formal thinking (proof techniques & analysis)
- problem-solving skills (algorithm design and application)
About coding/programming…

Goals
- Be very familiar with a collection of core algorithms.
- Be fluent in algorithm design paradigms: divide & conquer, greedy algorithms, randomization, dynamic programming, approximation methods.
- Be able to analyze the correctness and runtime performance of a given algorithm.
- Be familiar with the inherent complexity (lower bounds & intractability) of some problems.
- Be intimately familiar with basic data structures.
- Be able to apply these techniques to practical problems.

What Will We Be Doing?
- Examine interesting problems
- Devise algorithms for solving them
- Prove their correctness
- Analyze their runtime performance
- Study data structures & core algorithms
- Learn problem-solving techniques
- Apply them to real-world problems
Use of the inverted-classroom style will be very limited.

Congressional Apportionment
Article I, Section 2 of the United States Constitution requires that "Representatives and direct Taxes shall be apportioned among the several States which may be included within this Union, according to their respective Numbers… The Number of Representatives shall not exceed one for every 30,000, but each State shall have at Least one Representative…"

The Huntington-Hill Method
Currently, n = 50 states and R = 435 representatives.
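For reference (the formula itself is not reproduced in the text above): the equal-proportions (Huntington-Hill) rule gives every state one seat and then awards each of the remaining R - n seats, one at a time, to the state with the largest priority value P / sqrt(s(s+1)), where P is the state's population and s the number of seats it currently holds. Below is a minimal Python sketch of that rule; the function name and the toy populations are made up for illustration.

import heapq
from math import sqrt

def apportion(populations, total_seats):
    # Huntington-Hill (equal proportions): every state starts with one
    # seat; each remaining seat goes to the state with the largest
    # priority value population / sqrt(s * (s + 1)), where s is the
    # number of seats that state currently holds.
    seats = {state: 1 for state in populations}
    heap = [(-pop / sqrt(2), state) for state, pop in populations.items()]
    heapq.heapify(heap)
    for _ in range(total_seats - len(populations)):
        _, state = heapq.heappop(heap)
        seats[state] += 1
        s = seats[state]
        heapq.heappush(heap, (-populations[state] / sqrt(s * (s + 1)), state))
    return seats

# Toy example with made-up populations (3 states, 10 seats):
print(apportion({"A": 5_000_000, "B": 3_000_000, "C": 1_000_000}, 10))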

Pseudo-code
Well-written pseudocode reveals the internal structure of the algorithm but hides irrelevant implementation details, making the algorithm much easier to understand, analyze, debug, and implement. The precise syntax of pseudocode is a personal choice, but the overriding goal should be clarity and precision. Ideally, pseudocode should allow any competent programmer to implement the underlying algorithm, quickly and correctly, in their favorite programming language, without understanding why the algorithm works.

Algorithm Analysis
- Asymptotic Notation
- Recurrence Relations
- Proof Techniques
- Inherent Complexity

Algorithmic Design Paradigms
- Divide and Conquer
- Greedy Algorithms
- Graph Algorithms
- Dynamic Programming
- Approximation Methods (if time allows)

Textbook & References
Introduction to Algorithms, 3rd Ed., by Cormen, Leiserson, Rivest & Stein (CLRS), McGraw Hill.
Lecture slides will be put online. Thanks to Profs. Mark Foskey, Ming Lin, Dinesh Manocha, Ketan Mayer-Patel, David Plaisted, Jack Snoeyink, Dr. Umamaheswari Devi, and Prof. Jeff Erickson (UIUC).
Other references:
- Algorithmics: The Spirit of Computing, Harel.
- How to Solve It, Polya.
- The Design and Analysis of Computer Algorithms, Aho, Hopcroft & Ullman.
- Algorithms, Sedgewick.
- Algorithmics: Theory & Practice, Brassard & Bratley.
- Writing Efficient Programs & Programming Pearls, Bentley.
- The Science of Programming, Gries.
- The Craft of Programming, Reynolds.

Prerequisites
Data Structures + Discrete Mathematics
Assume that you know, or can recall with a quick review, the material in the following chapters:
- Chapters 0, 1, and 2
- Section 3.2: growth of functions
- Chapter 10: elementary data structures

Course Roadmap (number of lectures)
- Algorithmic Basics (4)
- Divide and Conquer (5-6)
- Randomized Algorithms (9-11)
- Search Trees (5)
- Graph Algorithms (5)
- Dynamic Programming (3)
- Greedy Algorithms (6)
- Special Topics* (1-2)
M13 + W15 + F14 = 42 class meetings

Algorithmic Basics (4)
- Introduction to algorithms, complexity, and proof of correctness (Chapters 1 & 2)
- Asymptotic notation (Chapter 3.1)
Goals:
- Know how to write formal problem specifications.
- Know about computational models.
- Know how to measure the efficiency of an algorithm.
- Know the difference between upper and lower bounds and what they convey.
- Be able to prove algorithms correct and establish computational complexity.

Divide-and-Conquer (5-6)
- Designing algorithms (Chapter 2.3)
- Recurrences (Chapter 4)
- Quicksort (Chapter 7)
Goals:
- Know when the divide-and-conquer paradigm is an appropriate one.
- Know the general structure of such algorithms.
- Express their complexity using recurrence relations.
- Determine the complexity using techniques for solving recurrences.
- Memorize the common-case solutions for recurrence relations (see the example below).
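A standard example, not spelled out on this slide: merge sort splits the input in half, sorts each half recursively, and merges the results in linear time, giving the recurrence

    T(n) = 2·T(n/2) + Θ(n),    T(1) = Θ(1),

whose solution (by the master theorem or a recursion-tree argument) is T(n) = Θ(n lg n), one of the common-case solutions worth memorizing.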

Randomized Algorithms (9-11)
- Probability & combinatorics (Chapter 5)
- Randomized quicksort (Chapter 7)
- Linear-time sorting* (Chapter 8)
- Selection (Chapter 9)
- Hash tables (Chapter 11)
Goals:
- Be thorough with basic probability theory and counting theory.
- Be able to apply probability theory to the design and analysis of randomized algorithms and data structures, and to the average-case analysis of deterministic algorithms.
- Understand the difference between average-case and worst-case runtime, especially in sorting and hashing.

Search Trees (5)
- Binary search trees – not balanced (Chapter 12)
- Red-black trees – balanced (Chapter 13)
Goals:
- Know the characteristics of the trees.
- Know the capabilities and limitations of simple binary search trees.
- Know why balancing heights is important.
- Know the fundamental ideas behind maintaining balance during insertions and deletions.
- Be able to apply these ideas to other balanced tree data structures.

Graph Algorithms (5)
- Basic graph algorithms (Chapter 22)
- Breadth-first search
- Depth-first search & applications
Goals:
- Know how to represent graphs (adjacency matrix and edge-list representations).
- Know the basic techniques for graph searching.
- Be able to devise other algorithms based on graph-searching algorithms.
- Be able to "cut-and-paste" proof techniques as seen in the basic algorithms.

Dynamic Programming (3)
- Dynamic programming (Chapter 15)
Goals:
- Know when to apply dynamic programming and how it differs from divide and conquer.
- Be able to systematically move from one to the other.

Greedy Algorithms (6)
- Greedy algorithms (Chapter 16)
- Minimum spanning trees (Chapter 23)
- Shortest paths (Chapter 24)
Goals:
- Know when to apply greedy algorithms and their characteristics.
- Be able to prove the correctness of a greedy algorithm in solving an optimization problem.
- Understand where minimum spanning trees and shortest-path computations arise in practice.

Linear Programming & NP-Completeness* (1-2)
- Simplex algorithm
- Duality
- NP-hardness
- Approximation
Goal: gain some knowledge of the basic concepts and algorithms.

Course Work & Grades
- Homework: 25% (total of 6-9 assignments, mostly design & analysis)
- Quizzes: 5% (very basic material)
- Mid-term exams: 40% (total of 3, 50 min each, in class)
- Final exam: 30%
- Active class participation: up to half a grade bonus
- Programming projects: up to 10% bonus

Examinations
- Quizzes: throughout the semester
- Midterms: 3, in class
- Final: 8am - 11am on May 4th, 2015
"Half" closed book; NO collaboration.

Homework Assignments
- Due at the beginning of class on the due date given.
- No late homework will be accepted.
- The lowest score will be dropped.
- You may discuss problems in groups, but you must write and formulate solutions alone (failure to explain your solution orally to the instructor is treated as cheating).
- Be neat, clear, precise, and formal: you will be graded on correctness, simplicity, elegance, and clarity.

Course Project
- Due at 23:59 on Apr 27 (the last day of class). No late project report or code will be accepted.
- You are responsible for defining/proposing the course project. You are encouraged to discuss it with me and/or submit a proposal before Mar 30.
- The project can be either an implementation of core algorithms we cover (you are responsible for showing the correctness of your code) or an algorithmic study of an open problem.
- Group work is allowed, though the contribution of each member must be clarified in the final report.

Communication
- Visit the instructor during office hours, by appointment, or by correspondence.
- All lecture notes and most handouts are posted on the course website.
- Major announcements are sent to the class email alias and posted on the course website.
- Discussions: face-to-face in groups, or on Piazza.
- Student grades can be checked with the instructor.

Basic Courtesy
- Write/print assignments neatly and formally.
- Please do not read newspapers or other materials, or browse the web, in class.
- When coming to class late or leaving early, please take an aisle seat quietly.
- Remain quiet except when asking questions or answering questions posed by the instructor – no whispering or private conversations.
THANK YOU!!!

How to Succeed in this Course
- Start early on all assignments. DON'T procrastinate.
- Complete all reading before class.
- Participate in class. Think in class.
- Review after each class.
- Be formal and precise on all problem sets and exams.

Weekly Reading Assignment
Chapters 0, 1, 2, 3 and Appendix A (textbook: CLRS)

Algorithms
A tool for solving a well-specified computational problem.
Example: sorting
- input: a sequence of numbers
- output: an ordered permutation of the input
- issues: correctness, efficiency, storage, etc.
[Diagram: Input → Algorithm → Output]

Example: Insertion Sort

for j = 2 to length(A)
    do key = A[j]
       i = j - 1
       while i > 0 and A[i] > key
           do A[i+1] = A[i]
              i = i - 1
       A[i+1] = key
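Not part of the original slide: the same algorithm as a runnable Python sketch, using 0-indexed lists in place of the 1-indexed pseudocode above.

def insertion_sort(A):
    # Mirrors the pseudocode above, but with Python's 0-indexed lists.
    for j in range(1, len(A)):
        key = A[j]
        i = j - 1
        # Shift elements of the sorted prefix A[0..j-1] that are
        # greater than key one position to the right.
        while i >= 0 and A[i] > key:
            A[i + 1] = A[i]
            i -= 1
        A[i + 1] = key
    return A

print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]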

Correctness Proofs
Proving (beyond "any" doubt) that an algorithm is correct.
- Prove that the algorithm produces correct output when it terminates: partial correctness.
- Prove that the algorithm will necessarily terminate: total correctness.
Techniques:
- Proof by construction
- Proof by induction
- Proof by contradiction

Loop Invariant
A logical expression with the following properties:
- Initialization: it holds true before the first iteration of the loop.
- Maintenance: if it is true before an iteration of the loop, it remains true before the next iteration.
- Termination: when the loop terminates, the invariant, along with the fact that the loop terminated, gives a useful property that helps show that the algorithm is correct.
Similar to mathematical induction.

Example: Insertion Sort – Loop Invariant
Invariant: at the start of each iteration of the for loop, A[1…j-1] consists of the elements originally in A[1…j-1], but in sorted order; all other elements are unchanged.

Example: Insertion Sort – Initialization
Initialization: when j = 2, the invariant trivially holds because A[1] is a (one-element) sorted array. ✓

Example: Insertion Sort – Maintenance
Maintenance: the inner while loop finds the position i with A[i] ≤ key and shifts A[j-1], A[j-2], …, A[i+1] right by one position. Then key, formerly known as A[j], is placed in position i+1, so that A[i] ≤ A[i+1] ≤ A[i+2].
A[1…j-1] sorted + A[j]  ⇒  A[1…j] sorted.

Example: Insertion Sort – Termination
Termination: the loop terminates when j = n+1. The invariant then states: "A[1…n] consists of the elements originally in A[1…n], but in sorted order." ✓

Running Time
- Depends on the input (e.g., already sorted vs. reverse sorted)
- Depends on the input size (5 elements vs. 500,000)
  – so we parameterize the running time in the input size, n
- We generally want upper bounds – a guarantee to the user

Analysis
- Worst-case (usually): T(n) = maximum time on any input of size n.
- Average-case (sometimes): T(n) = expected time over all inputs of size n (needs an assumption of …).
- Best-case.

Analyzing Algorithms
Assumptions:
- a generic one-processor, random-access machine
- we measure running time (other resources: memory, communication, etc.)
Worst-case running time: the longest time for any input of size n
- an upper bound on the running time for any input
- in some cases, like searching, this bound is close (the worst case occurs fairly often)
Average-case behavior: the expected performance averaged over all possible inputs
- generally better than worst-case behavior
- but sometimes roughly as bad as the worst case

A Simple Example – Linear Search
INPUT: a sequence of n numbers; a key to search for.
OUTPUT: true if key occurs in the sequence, false otherwise.

LinearSearch(A, key)
1  i ← 1
2  while i ≤ n and A[i] != key
3      do i++
4  if i ≤ n
5      then return true
6      else return false
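Not on the slide: an equivalent runnable Python sketch (0-indexed).

def linear_search(A, key):
    # Scan A from left to right; stop at the first match.
    i = 0
    while i < len(A) and A[i] != key:
        i += 1
    return i < len(A)   # True iff key occurs in A

print(linear_search([3, 1, 4, 1, 5], 4))   # True
print(linear_search([3, 1, 4, 1, 5], 9))   # False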

A Simple Example – Linear Search (cost per statement)

LinearSearch(A, key)                      cost   times
1  i ← 1                                  c1     1
2  while i ≤ n and A[i] != key            c2     x
3      do i++                             c3     x-1
4  if i ≤ n                               c4     1
5      then return true                   c5     1
6      else return false                  c6     1

x ranges between 1 and n+1. So, the running time ranges between
c1 + c2 + c4 + c5 (best case) and c1 + c2·(n+1) + c3·n + c4 + c6 (worst case).

A Simple Example – Linear Search (unit costs)
Assign a cost of 1 to all statement executions. Now, the running time ranges between
1 + 1 + 1 + 1 = 4 (best case) and 1 + (n+1) + n + 1 + 1 = 2n + 4 (worst case).

A Simple Example – Linear Search (average case)
If we assume that we search for a random item in the list, then on average statements 2 and 3 will each be executed n/2 times. The running times of the other statements are independent of the input. Hence, the average-case complexity is
1 + n/2 + n/2 + 1 + 1 = n + 3.

Correctness Proof of Linear Search
Use a loop invariant for the while loop:
- At the start of each iteration of the while loop, the search key is not in the subarray A[1…i-1].
To prove:
- If the algorithm terminates, then it produces the correct result.
  - Initialization.
  - Maintenance.
  - Termination.
- Argue that it terminates.

Worst-Case Time & Order of Growth
The measured time depends on the computer (software vs. hardware).
BIG IDEA – asymptotic analysis:
- Ignore machine-dependent constants.
- Look at the growth of T(n) as n → +inf.
- Θ-notation: we can ignore the lower-order terms, since they are relatively insignificant for very large n. We can also ignore the leading term's constant coefficient, since it is not as important for the rate of growth in computational efficiency for very large n.
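A quick numeric illustration (not on the slide): suppose an algorithm performs exactly T(n) = 3n² + 100n + 500 steps. At n = 1,000 the leading term contributes

    3n² = 3,000,000    versus    100n + 500 = 100,500  (about 3% of the total),

and the gap only widens as n grows, so we summarize the running time as T(n) = Θ(n²).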

Comparisons of Algorithms
Sorting:
- insertion sort: Θ(n²)
- merge sort: Θ(n log n)
For 10^6 numbers, insertion sort takes 5.56 hours on a supercomputer programmed in machine language, while merge sort on a PC programmed in C/C++ finishes in minutes.
Why does order of growth matter, when computer speeds double every two years?
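These figures match the classic CLRS back-of-the-envelope example; assuming (as in that example, not stated on the slide) that the supercomputer executes 10^8 instructions per second running an insertion sort that takes 2n² instructions, while the PC executes 10^6 instructions per second running a merge sort that takes 50·n·lg n instructions:

    insertion sort:  2·(10^6)² / 10^8 = 20,000 s ≈ 5.56 hours
    merge sort:      50·10^6·lg(10^6) / 10^6 ≈ 50·20 = 1,000 s ≈ 16.7 minutes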

Effect of Faster Machines
The number of items that can be sorted in one second using an algorithm taking exactly n² time, as compared to one taking n lg n time, assuming 1 million and 2 million operations per second. Notice that, for the n lg n algorithm, doubling the speed almost doubles the number of items that can be sorted. (Order of growth matters!)

Ops/sec:      1M        2M         Gain
n² alg        1,000     ~1,414     1.41
n lg n alg    ~62,700   ~118,600   1.89
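The table entries follow directly from the definition; a small Python sketch that recomputes them, assuming an algorithm costs exactly n² or n·lg n operations:

from math import log2

def max_items(cost, budget):
    # Largest n whose total cost fits within the operation budget
    # available in one second.
    n = 1
    while cost(n + 1) <= budget:
        n += 1
    return n

for budget in (1_000_000, 2_000_000):
    print(budget,
          max_items(lambda n: n * n, budget),        # ~1,000 then ~1,414
          max_items(lambda n: n * log2(n), budget))  # ~62,700 then ~118,600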