CSC 380: Design and Analysis of Algorithms


CSC 380: Design and Analysis of Algorithms Dr. Curry Guinn

Quick Info Dr. Curry Guinn CIS 2015 guinnc@uncw.edu www.uncw.edu/people/guinnc 962-7937 Office Hours: MTF 10:00am-11:00am and by appointment

Today
- Analysis of algorithms: the model, relative rates of growth, Big-O and its kin
- Homework 1 due Sunday, Jan 27
- Canvas Quiz 3 due by Tuesday, Jan 29, 11:59pm
- No late Canvas quizzes; no make-up quizzes

RAM model
We typically use a simple model for basic operation costs: the RAM (Random Access Machine) model.
- The RAM model has all the basic operations: +, -, *, /, =, comparisons
- Fixed-size integers (e.g., 32-bit)
- Infinite memory
- All basic operations take exactly one time unit (one CPU instruction) to execute
Analysis = determining run-time efficiency. Model = an estimate, not meant to represent everything in the real world.
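To see what "one time unit per basic operation" looks like in practice, here is a minimal step-count sketch (illustrative; the exact constants are my accounting, not from the slide):

def sum_array(a):
    total = 0        # 1 assignment
    for x in a:      # ~1 bookkeeping operation per iteration
        total += x   # 1 addition + 1 assignment per iteration
    return total     # 1 return

# Under the RAM model each basic operation costs exactly one time
# unit, so summing n elements costs roughly 3n + 2 units: the count
# is linear in n, and the exact constant rarely matters.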

Critique of the model
Strengths:
- simple
- easier to prove things about the model than about a real machine
- can estimate algorithm behavior on any hardware/software
Weaknesses:
- not all operations take the same amount of time on a real machine
- does not account for page faults, disk accesses, limited memory, floating-point math, etc.
The model is an approximation of the real world, but it can still predict the run time of an algorithm on a machine.

Relative rates of growth
- Most algorithms' runtime can be expressed as a function of the input size N.
- Rate of growth: a measure of how quickly the graph of a function rises.
- Goal: distinguish between fast-growing and slow-growing functions.
- We only care about very large input sizes (for small sizes, almost any algorithm is fast enough).
Motivation: we usually care about algorithm performance only when there is a large number of inputs, and we usually don't care about small changes in run-time performance (the inaccuracy of our estimates makes small changes less relevant). We consider algorithms with slow growth rates better than those with fast growth rates.

Growth rate example Consider these graphs of functions. Perhaps each one represents an algorithm: n^3 + 2n^2 versus 100n^2 + 1000. Which grows faster?

Growth rate example How about now?
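Where the crossover happens can be checked numerically; a quick sketch (added for illustration, not part of the original slides):

# Compare n^3 + 2n^2 against 100n^2 + 1000 at increasing n.
for n in [10, 50, 90, 100, 150]:
    f = n**3 + 2 * n**2
    g = 100 * n**2 + 1000
    print(f"n={n:4}  n^3+2n^2={f:>10,}  100n^2+1000={g:>10,}  {'f > g' if f > g else 'f <= g'}")

# Output shows f <= g up through n = 90 and f > g from n = 100 on:
# the cubic eventually dominates, matching the zoomed-out graph.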

“The fundamental law of computer science: As machines become more powerful, the efficiency of algorithms grows more important, not less.” — Nick Trefethen An algorithm (or function or technique …) that works well when used with large problems & large systems is said to be scalable. Or “it scales well”.

Big O Definition: T(N) = O(f(N)) if there exist positive constants c, n0 such that T(N) ≤ c · f(N) for all N ≥ n0. Idea: we are concerned with how the function grows when N is large; we are not picky about constant factors (coarse distinctions among functions). Lingo: "T(N) grows no faster than f(N)."
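As a worked instance of the definition (an illustration I am adding, not from the slide): take T(N) = 5N + 3. For all N ≥ 3,

\[ T(N) = 5N + 3 \le 5N + N = 6N = c \cdot N, \]

so the definition is satisfied with c = 6 and n_0 = 3, and therefore T(N) = O(N).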

Big O ∃ c, n0 > 0 such that f(N) ≤ c · g(N) when N ≥ n0: f(N) grows no faster than g(N) for "large" N.

Preferred big-O usage
Pick the tightest bound. If f(N) = 5N, then:
- f(N) = O(N^5)
- f(N) = O(N^3)
- f(N) = O(N log N)
- f(N) = O(N)  ← preferred
Ignore constant factors and low-order terms:
- T(N) = O(N), not T(N) = O(5N)
- T(N) = O(N^3), not T(N) = O(N^3 + N^2 + N log N)
Remove non-base-2 logarithms:
- f(N) = O(N log_6 N)
- f(N) = O(N log N)  ← preferred

Big-O of selected functions

Big Omega, Theta Defn: T(N) = Ω(g(N)) if there are positive constants c and n0 such that T(N) ≥ c · g(N) for all N ≥ n0. Lingo: "T(N) grows no slower than g(N)." Defn: T(N) = Θ(g(N)) if and only if T(N) = O(g(N)) and T(N) = Ω(g(N)). Big-O, Omega, and Theta establish a relative ordering among all functions of N.

Intuition about the notations
O (Big-O)      ≤
Ω (Big-Omega)  ≥
Θ (Big-Theta)  =
o (little-o)   <

Big-Omega ∃ c, n0 > 0 such that f(N) ≥ c · g(N) when N ≥ n0: f(N) grows no slower than g(N) for "large" N.

Big Theta: f(N) = Θ(g(N)) means the growth rate of f(N) is the same as the growth rate of g(N).
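A concrete example (added for illustration): 3N^2 + N = Θ(N^2), since for all N ≥ 1,

\[ 3N^2 \;\le\; 3N^2 + N \;\le\; 4N^2, \]

which gives 3N^2 + N = Ω(N^2) with c = 3 and 3N^2 + N = O(N^2) with c = 4.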

- An O(1) algorithm is constant time. The running time of such an algorithm is essentially independent of the input. Such algorithms are rare, since they cannot even read all of their input.
- An O(log_b n) algorithm (for some b) is logarithmic time. We do not care what b is.
- An O(n) algorithm is linear time. Such algorithms are not rare. This is as fast as an algorithm can be and still read all of its input.
- An O(n log_b n) algorithm (for some b) is log-linear time. This is about as slow as an algorithm can be and still be truly useful (scalable).
- An O(n^2) algorithm is quadratic time. These are usually too slow.
- An O(b^n) algorithm (for some b) is exponential time. These algorithms are much too slow to be useful.
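To make these classes concrete, a small sketch (the chosen input sizes are illustrative, not from the slide) tabulating rough operation counts:

import math

classes = [
    ("O(1)",       lambda n: 1),
    ("O(log n)",   lambda n: math.log2(n)),
    ("O(n)",       lambda n: n),
    ("O(n log n)", lambda n: n * math.log2(n)),
    ("O(n^2)",     lambda n: n ** 2),
]

for name, cost in classes:
    # Rough operation counts at n = 1,000 and n = 1,000,000.
    print(f"{name:11} {cost(10**3):>16,.0f} {cost(10**6):>20,.0f}")

# O(2^n) is omitted: at n = 1,000,000 the count would have over
# 300,000 digits, which is why exponential algorithms do not scale.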

Hammerin' the terminology
T(N) = O(f(N))
- f(N) is an upper bound on T(N)
- T(N) grows no faster than f(N)
T(N) = Ω(g(N))
- g(N) is a lower bound on T(N)
- T(N) grows at least as fast as g(N)
T(N) = o(h(N)) (little-o)
- T(N) grows strictly slower than h(N)

Notations
- Asymptotically less than or equal to: O (Big-O)
- Asymptotically greater than or equal to: Ω (Big-Omega)
- Asymptotically equal to: Θ (Big-Theta)
- Asymptotically strictly less than: o (little-o)

Facts about big-O If T(N) is a polynomial of degree k, then T(N) = Θ(N^k). Example: 17n^3 + 2n^2 + 4n + 1 = Θ(n^3).

Hierarchy of Big-O Functions, ranked in increasing order of growth: 1, log n, n, n log n, n^2, n^2 log n, n^3, ..., 2^n, n!, n^n

Various growth rates

Techniques for Determining Which Grows Faster
Evaluate lim (N→∞) f(N)/g(N):
- limit is 0:      f(N) = o(g(N))
- limit is c ≠ 0:  f(N) = Θ(g(N))
- limit is ∞:      g(N) = o(f(N))
- no limit:        no relation

Techniques, cont'd L'Hôpital's rule: if lim(N→∞) f(N) = ∞ and lim(N→∞) g(N) = ∞, then lim(N→∞) f(N)/g(N) = lim(N→∞) f′(N)/g′(N). Example: f(N) = N, g(N) = log N. Using L'Hôpital's rule: f′(N) = 1, g′(N) = 1/N, so lim f′(N)/g′(N) = lim N = ∞, and therefore g(N) = o(f(N)).
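The limit test can also be checked mechanically; a sketch using the sympy library (assuming it is installed):

import sympy

N = sympy.symbols('N', positive=True)

# lim N / log N = oo, so log N = o(N): N grows strictly faster.
print(sympy.limit(N / sympy.log(N), N, sympy.oo))               # oo

# lim (17N^3 + 2N^2) / N^3 = 17, a nonzero constant,
# so 17N^3 + 2N^2 = Theta(N^3).
print(sympy.limit((17 * N**3 + 2 * N**2) / N**3, N, sympy.oo))  # 17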

Program loop runtimes
Adding a constant to the loop counter means that the loop runtime grows linearly with its maximum value n. The loop executes its body exactly n / c times.

for (int i = 0; i < n; i += c)   // O(n)
    statement(s);

for j in range(0, n, c):   # O(n)
    statement(s)

Or, if myList contains n elements:

for item in myList:   # O(n)
    statement(s)

More loop runtimes
Nesting loops multiplies their runtimes.

for (int i = 0; i < n; i += c) {   // O(n^2)
    for (int j = 0; j < n; j += c) {
        statement;
    }
}

for j in range(0, n, c):   # O(n^2)
    for k in range(0, n, c):
        statement(s)

Or, if myList contains n elements:

for item in myList:   # O(n^2)
    for other in myList:
        statement(s)

The loop maximum is n^2, so the runtime is quadratic. The loop executes its body exactly n^2 / c times.

for (int i = 0; i < n * n; i += c)   // O(n^2)
    statement(s);

for j in range(0, n*n, c):   # O(n^2)
    statement(s)

for (int i = 1; i <= n; i *= c)   // O(log n)
    statement(s);

Multiplying the loop counter means that the maximum value n must grow exponentially for the loop runtime to increase linearly; therefore, the runtime is logarithmic. The loop executes its body exactly log_c n times.

j = 1
while j <= n:   # O(log n)
    statement(s)
    j *= c

for (int i = n; i >= 1; i /= c)   // O(log n)
    statement(s);

Dividing the loop counter has the same effect: the maximum value n must grow exponentially for the loop runtime to increase linearly; therefore, the runtime is logarithmic. The loop executes its body exactly log_c n times.

while n >= 1:   # O(log n)
    statement(s)
    n //= c     # integer division, mirroring the C-style i /= c

Loops in sequence add their runtimes together, which means the loop with the larger runtime dominates.

for (int i = 0; i < n; i += c) {   // O(n)
    statement;
}

for (int i = 0; i < n; i += c) {   // O(n log n)
    for (int j = 1; j <= n; j *= c) {
        statement;
    }
}
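These closed-form iteration counts are easy to verify empirically; a quick sketch (the function names are mine, not from the slides):

import math

def count_linear(n, c=1):
    count, i = 0, 0
    while i < n:        # i += c each pass: executes n / c times
        count += 1
        i += c
    return count

def count_log(n, c=2):
    count, i = 0, 1
    while i <= n:       # i *= c each pass: executes ~log_c(n) times
        count += 1
        i *= c
    return count

n = 1024
print(count_linear(n))        # 1024 = n
print(count_log(n))           # 11 = floor(log2(1024)) + 1
print(int(math.log2(n)) + 1)  # 11, matching the closed form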

Types of runtime analysis
Express the running time as f(N), where N is the size of the input.
- Worst case: your enemy gets to pick the input.
- Average case: need to assume a probability distribution over the inputs.
Note that even for a fixed input size N, the cost of an algorithm can vary across different inputs.

Example: Sequential search — worst case, best case, and average case. (A. Levitin, "Introduction to the Design & Analysis of Algorithms," 3rd ed., Ch. 2, ©2012 Pearson Education, Inc., Upper Saddle River, NJ. All Rights Reserved.)
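A minimal sequential search (a sketch of my own, not Levitin's code) with the case analysis in comments:

def sequential_search(a, key):
    # Best case: key is at a[0] -- 1 comparison, O(1).
    # Worst case: key is last or absent -- n comparisons, O(n).
    # Average case (key present, positions equally likely):
    #   about n/2 comparisons, which is still O(n).
    for i, x in enumerate(a):
        if x == key:
            return i
    return -1

print(sequential_search([4, 8, 15, 16, 23, 42], 15))  # 2 (found)
print(sequential_search([4, 8, 15, 16, 23, 42], 7))   # -1 (absent)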

Some rules
When considering the growth rate of a function using Big-O:
- Ignore the lower-order terms and the coefficient of the highest-order term.
- There is no need to specify the base of a logarithm: changing the base from one constant to another changes the value of the logarithm by only a constant factor.
- If T1(N) = O(f(N)) and T2(N) = O(g(N)), then
  T1(N) + T2(N) = max(O(f(N)), O(g(N))) and
  T1(N) * T2(N) = O(f(N) * g(N)).
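Why the sum rule holds (a one-step justification added for completeness): if T1(N) ≤ c1·f(N) and T2(N) ≤ c2·g(N) for N ≥ n0, then

\[ T_1(N) + T_2(N) \;\le\; c_1 f(N) + c_2 g(N) \;\le\; (c_1 + c_2)\,\max\bigl(f(N),\, g(N)\bigr), \]

so T1(N) + T2(N) = O(max(f(N), g(N))).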

For Next Class, Friday
- Homework 1 is due Sunday, Jan 27, 11:59pm
- Canvas Quiz 3 is due by Tuesday, Jan 29, 11:59pm
- No late Canvas quizzes
- No make-up quizzes