CS421 - Course Information: Website, Syllabus, Schedule, The Book.

What is an algorithm? Informally, an algorithm is any well-defined computational procedure that takes some value(s) as input and produces some value(s) as output. Goals for an algorithm: 1. Correct 2. Terminates 3. Efficient

What is CS421 about? We will engage in the theoretical study of the design and analysis of computer algorithms. Analysis: predict the cost of an algorithm in terms of performance and resources used. Design: design algorithms that minimize such costs.

Machine Model Assumptions. Random Access Machine (RAM) Model: 1. Any memory cell can be accessed in 1 step. 2. Memory is not limited (unbounded). 3. Arbitrarily large integers can be stored in each memory cell. 4. Operations are executed sequentially. 5. Operators include: primitive arithmetic (+, -, *, /, modulo, etc.), logic (if..then, and, or, etc.), comparators (<, >, =, etc.), and function calls. 6. Each operation has a unit cost of 1.
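As a small illustration of the unit-cost assumption (a hypothetical example, not from the slides), each executed line of the following scan counts as one step, so the total cost is proportional to the length of the array:

def array_max(A):
    """Linear scan for the maximum; under the RAM model every executed line costs 1."""
    best = A[0]              # 1 step (memory access and assignment treated as unit cost)
    for x in A[1:]:          # loop control: executed len(A) - 1 times
        if x > best:         # comparison: 1 step per iteration
            best = x         # assignment: at most 1 step per iteration
    return best              # total: proportional to len(A) steps

print(array_max([8, 2, 4, 9, 3, 6]))  # 9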

An Example: Sorting. Input: A sequence of n numbers ⟨a_1, a_2, …, a_n⟩. Output: A permutation (reordering) ⟨a'_1, a'_2, …, a'_n⟩ of the input sequence such that a'_1 ≤ a'_2 ≤ … ≤ a'_n. Example: Input: … Output: …
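To make the specification concrete, here is a small illustrative checker in Python (the name is_valid_sort and the sample data are ours, not from the slides) that tests both conditions: the output is a permutation of the input, and it is in nondecreasing order.

from collections import Counter

def is_valid_sort(inp, out):
    """Check the sorting specification: out must be a permutation of inp
    arranged in nondecreasing order."""
    is_permutation = Counter(inp) == Counter(out)        # same multiset of values
    is_nondecreasing = all(out[i] <= out[i + 1]          # a'_1 <= a'_2 <= ... <= a'_n
                           for i in range(len(out) - 1))
    return is_permutation and is_nondecreasing

print(is_valid_sort([8, 2, 4, 9, 3, 6], [2, 3, 4, 6, 8, 9]))  # True
print(is_valid_sort([8, 2, 4, 9, 3, 6], [2, 3, 4, 6, 8, 8]))  # False: not a permutation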

Insertion Sort Pseudocode:

INSERTION-SORT(A)
1  for j ← 2 to length[A]
2      do key ← A[j]
3         i ← j - 1
4         while i > 0 and A[i] > key
5             do A[i+1] ← A[i]
6                i ← i - 1
7         A[i+1] ← key
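The same procedure as runnable Python, offered as a sketch only; Python lists are 0-indexed, so the outer loop starts at index 1 rather than 2.

def insertion_sort(A):
    """In-place insertion sort, mirroring the pseudocode above (0-indexed)."""
    for j in range(1, len(A)):        # pseudocode: for j <- 2 to length[A]
        key = A[j]
        i = j - 1
        # Shift elements of the sorted prefix A[0..j-1] that are greater
        # than key one position to the right.
        while i >= 0 and A[i] > key:
            A[i + 1] = A[i]
            i -= 1
        A[i + 1] = key
    return A

print(insertion_sort([8, 2, 4, 9, 3, 6]))  # [2, 3, 4, 6, 8, 9]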

Example of insertion sort: the input array 8 2 4 9 3 6 is sorted step by step to 2 3 4 6 8 9 (animation frames omitted).

How “Good” is Insertion Sort? Recall our goals for an algorithm: 1. Correct 2. Terminates 3. Efficient

Correctness Informally: At each step the “current card”, A[j], is inserted into the already sorted subarray A[1..j-1]. More formally: The loop invariant (a condition that holds at the start of every iteration) is that at the start of each iteration of the for-loop, the subarray A[1..j-1] consists of the elements originally in A[1..j-1], but in sorted order.

Correctness These properties must hold for a loop invariant: Initialization: It is true prior to the first iteration of the loop. Maintenance: If it is true before an iteration of the loop, it remains true before the next iteration. Termination: When the loop terminates, the invariant yields a useful property that helps show the algorithm is correct.

Correctness In the case of insertion sort: Initialization: When j=2, A[1..j-1] consists of the single element A[1], which is sorted by definition. Maintenance: Informally, the body of the for-loop works by moving A[j-1], A[j-2], and so on one position to the right until the correct position for the key (the old A[j]) is found, at which point the key is inserted. Since A[1..j] is then sorted, the invariant holds again when j is incremented. Termination: The loop terminates when j=n+1. By the maintenance property, A[1..j-1] = A[1..n] is then sorted. Hence the algorithm is correct.
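The invariant can also be checked mechanically while the algorithm runs. The following instrumented sketch (ours, not part of the slides; 0-indexed) asserts the invariant at the start of every outer iteration and the sortedness of the whole array at termination.

def insertion_sort_checked(A):
    """Insertion sort with a runtime check of the loop invariant:
    at the start of each outer iteration the prefix A[0..j-1] is sorted."""
    for j in range(1, len(A)):
        # Loop invariant (0-indexed): A[0..j-1] is sorted.
        assert all(A[k] <= A[k + 1] for k in range(j - 1)), "invariant violated"
        key = A[j]
        i = j - 1
        while i >= 0 and A[i] > key:
            A[i + 1] = A[i]
            i -= 1
        A[i + 1] = key
    # Termination: the loop ends with j = len(A), so the whole array is sorted.
    assert all(A[k] <= A[k + 1] for k in range(len(A) - 1))
    return A

print(insertion_sort_checked([8, 2, 4, 9, 3, 6]))  # [2, 3, 4, 6, 8, 9]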

Termination As we will see later in the semester, determining whether an arbitrary program halts is undecidable (the halting problem for Turing machines). For the specific case of Insertion-Sort, however, it is easy to see that each of the two loops iterates at most n times, so the algorithm does not run indefinitely and always terminates.

Efficiency Running time is a measure of how many steps, or primitive operations, are performed. We have already stated that in the RAM model each operation has cost 1, but let's assume instead that line i of our algorithm has cost c_i.

Kinds of Analysis Worst-case: T(n) = maximum time of the algorithm on any input of size n. Average-case: T(n) = expected time of the algorithm over all inputs of size n. Requires an assumption about the statistical distribution of inputs. Best-case: T(n) = minimum time of the algorithm on any input of size n. Problematic because a generally slow algorithm may work fast on some particular input.

Running Time Let's analyze our algorithm once more, assigning to line i a cost c_i and counting how many times each line executes:

INSERTION-SORT(A)                      cost   times
1  for j ← 2 to length[A]              c_1    n
2      do key ← A[j]                   c_2    n-1
3         i ← j - 1                    c_3    n-1
4         while i > 0 and A[i] > key   c_4    Σ_{j=2..n} t_j
5             do A[i+1] ← A[i]         c_5    Σ_{j=2..n} (t_j - 1)
6                i ← i - 1             c_6    Σ_{j=2..n} (t_j - 1)
7         A[i+1] ← key                 c_7    n-1

Here t_j is the number of times the while-loop test on line 4 executes for that value of j.

Best Case The best case occurs when the input is already sorted: the while-loop test fails immediately, so t_j = 1 for every j. Then T(n) = (c_1+c_2+c_3+c_4+c_7)n - (c_2+c_3+c_4+c_7), a linear function of n.

Worst Case The worst case occurs when the input is in reverse sorted order: each key is compared with every element of the sorted prefix, so t_j = j. Then T(n) = (c_4+c_5+c_6)n²/2 + (c_1+c_2+c_3+c_4/2-c_5/2-c_6/2+c_7)n - (c_2+c_3+c_4+c_7), a quadratic function of n.

Average Case Assume t_j = j/2, i.e. on average only about half of the already sorted prefix A[1..j-1] is greater than the key. Then T(n) ≈ (c_4+c_5+c_6)n²/4 + (c_1+c_2+c_3+c_4/4-c_5/4-c_6/4+c_7)n - (c_2+c_3+c_4+c_7): roughly half the worst case, but still quadratic in n. A small empirical check of all three cases is sketched below.
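These three cases can be observed empirically by counting how many times the inner while-loop body runs (the quantity Σ_j (t_j - 1) above). A small sketch; the instrumented function and the test inputs are ours, not from the slides.

import random

def inner_loop_count(A):
    """Run insertion sort on a copy of A and count inner while-loop iterations."""
    A = list(A)
    count = 0
    for j in range(1, len(A)):
        key = A[j]
        i = j - 1
        while i >= 0 and A[i] > key:
            A[i + 1] = A[i]
            i -= 1
            count += 1
        A[i + 1] = key
    return count

n = 1000
print("best (already sorted):", inner_loop_count(range(n)))                     # 0
print("worst (reverse sorted):", inner_loop_count(range(n, 0, -1)))             # n(n-1)/2 = 499500
print("average (random):", inner_loop_count(random.sample(range(n), n)))        # roughly n^2/4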

Machine Independent Analysis As the size of the input becomes large, the constants c_i matter far less than the exponents and logarithmic factors. The constants also depend on the particular machine, which makes machine-independent analysis impossible if we keep them. So: ignore the constants, and examine the growth of T(n) as n → ∞. This is asymptotic analysis.

Order of Growth The rate of growth is of primary interest, so we consider only the leading term of the running time and drop its constant coefficient (e.g. keeping n² from (c_4+c_5+c_6)n²/2). Thus, the worst-case running time of Insertion Sort is Θ(n²): quadratic time. We will define this notation more precisely later.

Design Approach: Divide and Conquer Divide the problem into a number of subproblems. Conquer the subproblems recursively. Combine the subproblem solutions into the solution for the original problem. Recursion: when an algorithm calls itself.

Merge Sort Divide: Divide an n-element array into two subsequences of n/2 elements each. Conquer: Sort the two subsequences recursively with merge sort. Combine: Merge the two sorted arrays to produce the sorted sequence. Special Case: If the sequence has only one element the recursion “bottoms out” as the sequence is sorted by definition.

Merge Sort

MERGE-SORT(A[1..n])
1. If n = 1, return A.
2. L = A[1..⌈n/2⌉]
3. R = A[⌈n/2⌉+1..n]
4. L = Merge-Sort(L)
5. R = Merge-Sort(R)
6. Return Merge(L, R)
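The same scheme as runnable Python, offered as a sketch only (0-indexed, with integer division standing in for the explicit ⌈n/2⌉ split; the function names are ours, not the slides').

def merge(L, R):
    """Merge two already sorted lists into one sorted list in linear time."""
    out = []
    i = j = 0
    while i < len(L) and j < len(R):
        if L[i] <= R[j]:
            out.append(L[i])
            i += 1
        else:
            out.append(R[j])
            j += 1
    out.extend(L[i:])   # at most one of these two extends is non-empty
    out.extend(R[j:])
    return out

def merge_sort(A):
    """Divide, conquer recursively, and combine, as in the pseudocode above."""
    if len(A) <= 1:                 # base case: a 0- or 1-element list is already sorted
        return list(A)
    mid = len(A) // 2               # split point
    return merge(merge_sort(A[:mid]), merge_sort(A[mid:]))

print(merge_sort([8, 2, 4, 9, 3, 6]))  # [2, 3, 4, 6, 8, 9]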

Merging two sorted arrays (step-by-step animation omitted: at each step the smaller of the two front elements is moved to the output). Time = Θ(n) to merge a total of n elements (linear time).

Analyzing merge sort

MERGE-SORT(A[1..n])                                     T(n)
1. If n = 1, done.                                      Θ(1)
2. Recursively sort A[1..⌈n/2⌉] and A[⌈n/2⌉+1..n].      2T(n/2)
3. “Merge” the 2 sorted lists.                          Θ(n)

Sloppiness: Should be T(⌈n/2⌉) + T(⌊n/2⌋), but it turns out not to matter asymptotically.

Recurrence for merge sort: T(n) = Θ(1) if n = 1; T(n) = 2T(n/2) + Θ(n) if n > 1. We shall usually omit stating the base case when T(n) = Θ(1) for sufficiently small n, but only when it has no effect on the asymptotic solution to the recurrence.

Recursion tree Solve T(n) = 2T(n/2) + cn, where c > 0 is constant. Expand the recurrence into a tree: the root costs cn, its two children cost cn/2 each, the four grandchildren cost cn/4 each, and so on down to Θ(1) leaves. The height is h = lg n, each level contributes a total of cn, and there are n leaves contributing Θ(n) in all. Total: cn · lg n + Θ(n) = Θ(n lg n). (Tree diagrams omitted.)
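As an illustrative cross-check (not part of the slides), the recurrence can be evaluated exactly, with the floors and ceilings restored, and compared against n lg n; the ratio settles toward a constant, consistent with the Θ(n lg n) total from the recursion tree.

from functools import lru_cache
from math import log2

@lru_cache(maxsize=None)
def T(n):
    """Exact value of T(n) = T(ceil(n/2)) + T(floor(n/2)) + n, with T(1) = 1 (c = 1)."""
    if n <= 1:
        return 1
    return T((n + 1) // 2) + T(n // 2) + n

for n in (2**10, 2**15, 2**20):
    print(n, round(T(n) / (n * log2(n)), 3))   # ratio tends to a constant, so T(n) = Theta(n lg n)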

Conclusions Θ(n lg n) grows more slowly than Θ(n²). Therefore, merge sort asymptotically beats insertion sort in the worst case.
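Finally, a rough timing sketch (illustrative only, reusing the insertion_sort and merge_sort functions defined in the sketches above; absolute numbers depend on the machine, and for very small n the hidden constants can actually favor insertion sort).

import random
import time

def time_sort(sort_fn, data):
    """Time one run of sort_fn on a fresh copy of data."""
    start = time.perf_counter()
    sort_fn(list(data))
    return time.perf_counter() - start

for n in (1000, 4000, 16000):
    data = random.sample(range(n), n)
    # merge sort's Theta(n lg n) overtakes insertion sort's Theta(n^2) as n grows
    print(f"n={n:6d}  insertion: {time_sort(insertion_sort, data):.3f}s"
          f"  merge: {time_sort(merge_sort, data):.3f}s")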