CMPT 438 Algorithms

Why Study Algorithms?
• Necessary in any computer programming problem
  ▫ Improve algorithm efficiency: run faster, process more data, do something that would otherwise be impossible
  ▫ Solve problems of significantly large size
  ▫ Technology only improves things by a constant factor
  ▫ Compare algorithms
• Algorithms as a field of study
  ▫ Learn about a standard set of algorithms
  ▫ New discoveries arise
  ▫ Numerous application areas
  ▫ Learn techniques of algorithm design and analysis

What are Algorithms?
• An algorithm is a sequence of computational steps that transform the input into the output.
• An algorithm is also a tool for solving a well-specified computational problem.
  ▫ E.g., the sorting problem: <31, 26, 41, 59, 58> is an instance of the sorting problem.

An algorithm is correct if, for every input instance, it halts with the correct output.

Analyzing Algorithms
• Predict the amount of resources required:
  ▫ memory: how much space is needed?
  ▫ computational time: how fast does the algorithm run?
• FACT: running time grows with the size of the input
• Input size (number of elements in the input)
  ▫ size of an array, # of elements in a matrix, # of bits in the binary representation of the input, # of vertices and edges in a graph
• Def: Running time = the number of primitive operations (steps) executed before termination
  ▫ arithmetic operations (+, -, *), data movement, control, decision making (if, while), comparison
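An illustrative sketch (not from the slides): the following Python snippet counts one kind of primitive operation, element comparisons, while scanning an array for its maximum. The count is a function of the input size n, which is exactly what a running-time analysis measures.

def max_with_count(A):
    """Return (maximum of A, number of element comparisons performed)."""
    comparisons = 0
    best = A[0]
    for x in A[1:]:
        comparisons += 1          # one comparison per remaining element
        if x > best:
            best = x
    return best, comparisons

# An array of n elements always triggers exactly n - 1 comparisons:
print(max_with_count([31, 26, 41, 59, 58]))   # (59, 4)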

Algorithm Efficiency vs. Speed
E.g.: sorting n numbers (n = 10^6)
• Friend's computer = 10^9 instructions/second
• Friend's algorithm = 2n^2 instructions (insertion sort)
• Your computer = 10^7 instructions/second
• Your algorithm = 50 n lg n instructions (merge sort)

Algorithm Efficiency vs. Speed
To sort 100 million numbers:
• Insertion sort takes more than 230 days
• Merge sort takes under 4 hours
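For reference (arithmetic not shown on the slide, assuming the machines and instruction counts from the previous slide): insertion sort needs about 2·(10^8)^2 / 10^9 = 2·10^7 seconds ≈ 230 days, while merge sort needs about 50·10^8·lg(10^8) / 10^7 ≈ 1.3·10^4 seconds ≈ 3.7 hours.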

Typical Running Time Functions
• 1 (constant running time): instructions are executed once or a few times
• log N (logarithmic): a big problem is solved by cutting the original problem into smaller sizes, by a constant fraction at each step
• N (linear): a small amount of processing is done on each input element
• N log N: a problem is solved by dividing it into smaller problems, solving them independently, and combining the solutions

Typical Running Time Functions
• N^2 (quadratic): typical for algorithms that process all pairs of data items (double nested loops)
• N^3 (cubic): processing of triples of data (triple nested loops)
• N^k (polynomial)
• 2^N (exponential): few exponential algorithms are appropriate for practical use

Why Faster Algorithms?

Insertion Sort
• Idea: like sorting a hand of playing cards
  ▫ Remove one card at a time from the table, and insert it into the correct position in the left hand
  ▫ Compare it with each of the cards already in the hand, from right to left

Example of insertion sort: 5 2 4 6 1 3

INSERTION-SORT (A, n)        ⊳ A[1 . . n]
  for j ← 2 to n
      do key ← A[j]
         i ← j − 1
         while i > 0 and A[i] > key
             do A[i+1] ← A[i]
                i ← i − 1
         A[i+1] ← key
Insertion sort sorts the elements in place.
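A runnable Python version of the same procedure (an illustrative sketch, using 0-based indexing instead of the pseudocode's 1-based A[1 . . n]):

def insertion_sort(A):
    """Sort the list A in place, mirroring INSERTION-SORT(A, n)."""
    for j in range(1, len(A)):            # pseudocode's j = 2 .. n
        key = A[j]
        i = j - 1
        # shift elements of the sorted prefix that are greater than key
        while i >= 0 and A[i] > key:
            A[i + 1] = A[i]
            i -= 1
        A[i + 1] = key                    # insert key into its slot
    return A

print(insertion_sort([5, 2, 4, 6, 1, 3]))   # [1, 2, 3, 4, 5, 6]

As in the pseudocode, the sort is in place: only a constant amount of extra storage (key, i, j) is used.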

Analysis of Insertion Sort

Running time
• Parameterize the running time by the size of the input, since short sequences are easier to sort than long ones.

Kinds of analyses
• Worst-case: T(n) = maximum time of algorithm on any input of size n.
• Average-case: T(n) = expected time of algorithm over all inputs of size n.
  ▫ Need assumption of statistical distribution of inputs.
• Best-case: Cheat with a slow algorithm that works fast on some input.

Machine-independent time
What is insertion sort's worst-case time?
• It depends on the speed of our computer.
BIG IDEA:
• Ignore machine-dependent constants.
• Look at growth of T(n) as n → ∞.
"Asymptotic Analysis"

Θ-notation
Math:
Θ(g(n)) = { f(n) : there exist positive constants c1, c2, and n0 such that 0 ≤ c1 g(n) ≤ f(n) ≤ c2 g(n) for all n ≥ n0 }

Θ-notation
Engineering:
• Drop low-order terms; ignore leading constants.
• Example: 3n^3 + 90n^2 − 5n + 6046 = Θ(n^3)
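For reference (an illustrative choice of constants, not shown on the slide), the definition above is satisfied with c1 = 3, c2 = 6139, and n0 = 1: for all n ≥ 1, 90n^2 − 5n + 6046 ≥ 0 gives 3n^3 ≤ 3n^3 + 90n^2 − 5n + 6046, and 3n^3 + 90n^2 − 5n + 6046 ≤ 3n^3 + 90n^3 + 6046n^3 = 6139n^3.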

Best Case Analysis
• The array is already sorted
  ▫ A[i] ≤ key upon the first time the while loop test is run (when i = j − 1)
  ▫ t_j = 1

Worst Case Analysis
• The array is in reverse sorted order
  ▫ Always A[i] > key in the while loop test
  ▫ Have to compare key with all elements to the left of the j-th position: compare with j − 1 elements
  ▫ t_j = j
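For reference (a standard calculation the slide leaves implicit): with t_j = j, the total work of the while loop is proportional to the sum of j for j = 2 to n, which equals n(n+1)/2 − 1, so the worst-case running time is Θ(n^2); in the best case t_j = 1 for every j and the total is Θ(n).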

Average Case? All permutations equally likely.

Insertion Sort Summary
• Advantages
  ▫ Good running time for "almost sorted" arrays: Θ(n)
• Disadvantages
  ▫ Θ(n^2) running time in worst and average case
• Is insertion sort a fast sorting algorithm?
  ▫ Moderately so, for small n.
  ▫ Not at all, for large n.

Worst-Case and Average-Case
• We usually concentrate on finding only the worst-case running time
  ▫ an upper bound on the running time
• For some algorithms, the worst case occurs often.
  ▫ E.g., searching when the information is not present in the DB
• The average case is often as bad as the worst case.

Merge Sort

Merge Sort
MERGE-SORT A[1 . . n]
1. If n = 1, done.
2. Recursively sort A[1 . . ⌈n/2⌉] and A[⌈n/2⌉+1 . . n].
3. "Merge" the 2 sorted lists.

Example

Divide-and-Conquer
• Divide the problem into a number of subproblems
  ▫ Similar sub-problems of smaller size
• Conquer the sub-problems
  ▫ Solve the sub-problems recursively
  ▫ If the sub-problem sizes are small enough, solve the problems in a straightforward manner
• Combine the solutions to the sub-problems
  ▫ Obtain the solution for the original problem

Merge Sort Approach
To sort an array A[p . . r]:
• Divide
  ▫ Divide the n-element sequence to be sorted into two subsequences of n/2 elements each
• Conquer
  ▫ Sort the subsequences recursively using merge sort
  ▫ When the size of the sequences is 1 there is nothing more to do
• Combine
  ▫ Merge the two sorted subsequences
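A runnable Python sketch of this approach (illustrative only: it returns a new sorted list instead of sorting A[p . . r] in place, and it uses the linear-time merge described on the following slides):

def merge(left, right):
    """Merge two sorted lists into one sorted list in linear time."""
    out = []
    i = j = 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:            # take the smaller front element
            out.append(left[i]); i += 1
        else:
            out.append(right[j]); j += 1
    out.extend(left[i:])                   # at most one of these is non-empty
    out.extend(right[j:])
    return out

def merge_sort(A):
    """Divide, conquer recursively, combine."""
    if len(A) <= 1:                        # size 1: nothing more to do
        return A
    mid = len(A) // 2                      # divide into two halves
    return merge(merge_sort(A[:mid]), merge_sort(A[mid:]))   # conquer + combine

print(merge_sort([20, 13, 7, 2, 12, 11, 9, 1]))   # [1, 2, 7, 9, 11, 12, 13, 20]

The slices and the output list account for the Θ(n) extra space noted on the conclusions slide.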

Analyzing merge sort
MERGE-SORT A[1 . . n]                                        T(n)
1. If n = 1, done.                                           Θ(1)
2. Recursively sort A[1 . . ⌈n/2⌉] and A[⌈n/2⌉+1 . . n].     2T(n/2)
3. "Merge" the 2 sorted lists.                               Θ(n)
Sloppiness: Should be T(⌈n/2⌉) + T(⌊n/2⌋), but it turns out not to matter asymptotically.

Merging two sorted arrays
• Example (figure): merging the sorted arrays 2, 7, 13, 20 and 1, 9, 11, 12 by repeatedly taking the smaller of the two front elements yields 1, 2, 7, 9, 11, 12, 13, 20.
• Time? In place sort?
• Time = Θ(n) to merge a total of n elements (linear time).

Analyzing Divide and Conquer Algorithms
T(n) = aT(n/b) + D(n) + C(n)
The recurrence is based on the three steps of the paradigm:
• T(n) – running time on a problem of size n
• Divide the problem into a subproblems, each of size n/b: takes D(n)
• Conquer (solve) the subproblems: takes aT(n/b)
• Combine the solutions: takes C(n)

MERGE-SORT Running Time
T(n) = 2T(n/2) + Θ(n) if n > 1
• Divide: compute q as the average of p and r: D(n) = Θ(1)
• Conquer: recursively solve 2 subproblems, each of size n/2: 2T(n/2)
• Combine: MERGE on an n-element subarray takes Θ(n) time: C(n) = Θ(n)

Recurrence for merge sort
T(n) = Θ(1)             if n = 1;
       2T(n/2) + Θ(n)   if n > 1.

Recursion tree Solve T(n) = 2T(n/2) + cn, where c > 0 is constant.
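For reference, the recursion tree argument sketched on this slide works out as follows (standard reasoning, summarized here): the tree has lg n levels of internal nodes, each level contributing a total cost of cn (2^k subproblems of size n/2^k, each costing c·n/2^k), plus n leaves of constant cost, so T(n) = cn·lg n + Θ(n) = Θ(n lg n).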

Conclusions
• Θ(n lg n) is better than Θ(n^2).
• Therefore, merge sort asymptotically beats insertion sort in the worst case.
• Disadvantage: requires extra space Θ(n).

Divide-and-Conquer Example: Binary Search
Find an element in a sorted array:
1. Divide: Check the middle element.
2. Conquer: Recursively search 1 subarray.
3. Combine: Trivial.
Example: A[1 . . 8] = {1, 2, 3, 4, 5, 7, 9, 11}; find 7.

For an ordered array A, finds if x is in the array A[lo…hi]
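The slide's code is not reproduced in the transcript; a minimal recursive Python sketch of the idea (illustrative only, 0-based indexing, returning True/False rather than an index):

def binary_search(A, x, lo, hi):
    """Return True if x occurs in the sorted slice A[lo..hi] (inclusive)."""
    if lo > hi:                            # empty range: not found
        return False
    mid = (lo + hi) // 2                   # divide: check the middle element
    if A[mid] == x:
        return True
    if A[mid] < x:                         # conquer: search the right half...
        return binary_search(A, x, mid + 1, hi)
    return binary_search(A, x, lo, mid - 1)   # ...or the left half

A = [1, 2, 3, 4, 5, 7, 9, 11]
print(binary_search(A, 7, 0, len(A) - 1))   # True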

Analysis of Binary Search ?
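The slide leaves the analysis as a question; for reference, the standard answer: binary search satisfies T(n) = T(n/2) + Θ(1) (one subproblem of half the size, constant work to divide and combine), which solves to T(n) = Θ(lg n).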

Divide-and-Conquer Example: Powering a Number
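The slide's bullets are left as questions in the transcript; a hedged sketch of the standard divide-and-conquer answer (not necessarily the exact slide content): to compute a^n, square a^(n/2), giving the recurrence T(n) = T(n/2) + Θ(1) = Θ(lg n). A minimal Python version:

def power(a, n):
    """Compute a**n by repeated squaring: T(n) = T(n/2) + Θ(1) = Θ(lg n)."""
    if n == 0:
        return 1
    half = power(a, n // 2)                # one subproblem of half the size
    if n % 2 == 0:
        return half * half                 # a^n = a^(n/2) * a^(n/2)
    return half * half * a                 # a^n = a^((n-1)/2) * a^((n-1)/2) * a

print(power(3, 10))   # 59049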

Homework 1 Quiz