Chapter 2 Complexity

Program Performance
Typically the performance of a program depends on the size of the input data.
Performance: the memory and time required.
  Performance analysis: analytical.
  Performance measurement: experimental.
Space complexity (usually less important):
  Space may be limited.
  Can be used to determine the largest problem size we can solve.
Time complexity:
  Real-time constraints.
  May need to interact with the user.
Components of space complexity:
  Instruction space – needed to store the compiled version of the program.
  Data space – variables and constants.
  Environment stack space – for recursion and return values.
At your seats, draw a picture of how recursion works, space-wise (a small sketch follows).
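As a concrete illustration of environment stack space, here is a minimal Java sketch (the class name, method, and values are ours, not the slide's): each pending recursive call keeps its own stack frame until the base case is reached.

    // Hypothetical example: each call to sum(n) pushes a new frame
    // (parameter n, return address) onto the runtime stack, so a call
    // to sum(1000) holds about 1000 frames at its deepest point.
    public class StackSpaceDemo {
        static int sum(int n) {
            if (n == 0) return 0;      // base case: frames start popping here
            return n + sum(n - 1);     // this frame stays live until the inner call returns
        }
        public static void main(String[] args) {
            System.out.println(sum(1000));   // 500500; stack depth is about 1000
        }
    }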

Log Review
log₂32 =     log₃27 =     log₅1 =     logₓx =     a^(logₐx) =
For a balanced binary tree with 63 nodes, how many levels are there?
If I have an array of size 246, how many times can I split the array in half?
Components of time complexity:
  Amount of time spent in each operation – difficult to measure.
  Estimate of the number of times a key operation is performed.
  In small programs, actual elapsed time is difficult to measure accurately.
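A quick way to answer the array-splitting question above is simply to count the halvings; a small sketch (assuming integer halving, as in binary search):

    // How many times can an array of size 246 be split in half
    // before only one element remains? The count is floor(log2 246) = 7.
    public class HalvingDemo {
        public static void main(String[] args) {
            int n = 246, splits = 0;
            while (n > 1) {
                n /= 2;        // 246 -> 123 -> 61 -> 30 -> 15 -> 7 -> 3 -> 1
                splits++;
            }
            System.out.println(splits);   // prints 7
        }
    }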

Operation Counts
Consider two sorts you may know about: bubble sort and quicksort. How do they work?
For bubble sort, if you charge $10 to sort n items, how much should you charge to sort twice as many?
We say bubble sort's running time is proportional to n².
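A hedged sketch of the idea (this is not the slide's own code): bubble sort with a comparison counter, showing that doubling the input roughly quadruples the work, so $10 for n items becomes about $40 for 2n.

    // Bubble sort with a comparison counter. For n items the inner
    // comparisons total n(n-1)/2, so doubling n roughly quadruples the work.
    public class BubbleCount {
        static long comparisons;
        static void bubbleSort(int[] a) {
            for (int i = 0; i < a.length - 1; i++)
                for (int j = 0; j < a.length - 1 - i; j++) {
                    comparisons++;
                    if (a[j] > a[j + 1]) {           // swap out-of-order neighbors
                        int t = a[j]; a[j] = a[j + 1]; a[j + 1] = t;
                    }
                }
        }
        public static void main(String[] args) {
            for (int n : new int[] {1000, 2000}) {
                comparisons = 0;
                int[] a = new int[n];
                for (int i = 0; i < n; i++) a[i] = n - i;   // reverse order input
                bubbleSort(a);
                System.out.println(n + " items: " + comparisons + " comparisons");
            }   // 499500 vs 1999000 -- about 4x the work for 2x the data
        }
    }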

Asymptotics
What is an asymptote?
Asymptotics – the study of functions of n as n grows large (without bound).
If the running time of an algorithm is constant, doubling n doesn't change the time.
If the running time is proportional to n, doubling n doubles the running time.

Asymptotics
What if something is proportional to n² (running time is cn²)? When n doubles, how does the time differ?

Asymptotics
What if something is proportional to n² (running time is cn²)? When n doubles, how does the time differ?
Original time: cn². After doubling: c(2n)² = c(4n²) = 4(cn²), so the time goes up by a factor of 4.
Suppose something is proportional to n³ (running time is cn³). When n doubles, how does the time differ?

Asymptotics
If the running time is proportional to log n (c log n), when we double n, what happens to the running time?

Asymptotics
Original time: c log n.
After doubling: c log(2n) = c(log 2 + log n) = c log 2 + c log n = c + c log n (since log₂2 = 1).
Doubling n only increases the time by the constant c.
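A small sketch pulling the doubling observations together (the helper log2 and the chosen n are ours, not from the slides):

    // Step counts when n doubles, for the growth rates discussed above:
    // log n gains a constant, n doubles, n^2 quadruples, n^3 goes up 8x.
    public class DoublingDemo {
        public static void main(String[] args) {
            int n = 1024;
            System.out.printf("log n : %d -> %d%n", log2(n), log2(2 * n));          // 10 -> 11
            System.out.printf("n     : %d -> %d%n", n, 2 * n);                      // 1024 -> 2048
            System.out.printf("n^2   : %d -> %d%n", n * n, 4 * n * n);              // 4x
            System.out.printf("n^3   : %d -> %d%n", (long) n * n * n, 8L * n * n * n); // 8x
        }
        static int log2(int x) { return 31 - Integer.numberOfLeadingZeros(x); }
    }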

Asymptotics – examples
Examples of something being proportional to something else:
  Earnings are proportional to hours worked.
  Grades may seem proportional to the log of the work, or even constant.
  For some, desirability is proportional to cost².
  The area of a pentagon is proportional to its side length squared.
  The volume of a cube is proportional to its side length cubed.

DEFINITION: O Notation (Big Oh)
We want to give an upper bound on the amount of time an algorithm takes to solve a problem.
Definition: t(n) = O(f(n)) if there exist constants c and n₀ such that t(n) ≤ c·f(n) whenever n ≥ n₀.
The growth rate of the execution time t(n) is less than or equal to that of the bounding function f(n).
Pick a single c that works for every n ≥ n₀.
This growth rate is termed the complexity.

(Graph from the CSILM site.)

Note the role of n₀ and c.

Complexity Classes
  O(1) (constant): determining odd/even
  O(log n) (logarithmic): binary search (sketched below)
  O(n) (linear): finding an item in a list
  O(n log n): quicksort
  O(n²) (quadratic): everyone in a group shaking hands with everyone else
  O(n³) (cubic): matrix multiplication
  O(2ⁿ) (exponential): knapsack – intractable
  O(n!) (factorial): finding all paths
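As an illustration of the O(log n) entry, a minimal binary search sketch (the array contents and class name are ours):

    // Binary search: each probe halves the remaining range,
    // so about log2(n) probes suffice.
    public class BinarySearchDemo {
        static int binarySearch(int[] a, int key) {
            int lo = 0, hi = a.length - 1;
            while (lo <= hi) {
                int mid = (lo + hi) >>> 1;          // midpoint without overflow
                if (a[mid] == key) return mid;
                if (a[mid] < key) lo = mid + 1;     // discard the left half
                else hi = mid - 1;                  // discard the right half
            }
            return -1;                              // not found
        }
        public static void main(String[] args) {
            int[] a = {2, 3, 5, 7, 11, 13, 17, 19};
            System.out.println(binarySearch(a, 11));   // 4
            System.out.println(binarySearch(a, 4));    // -1
        }
    }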

Factorial time – O(n!)
Problems that need factorial time – O(n!) – are even worse than O(kⁿ) problems.
The brute-force algorithm for the Travelling Salesman Problem (TSP) – look at all possible routes to find the shortest – is O(n!).

Factorial time – O(n!)
It is possible to solve the TSP in exponential time rather than factorial time using a different method, dynamic programming. But this still does not yield solutions in a reasonable amount of time for anything except the smallest instances of the problem.

Measuring Complexity
Additive property – if two statements follow one another, the complexity of the pair is the larger of the two complexities.
If/else – for
    if (cond) S1 else S2
the complexity is the cost of evaluating cond plus the larger of the complexities of S1 and S2.
Multiplicative property (for loops) – the complexity of a for loop is at most the complexity of the statements inside the loop times the number of iterations executed. However, if iterations differ in execution length (due to conditional execution), it is more accurate to sum the individual iteration costs rather than just multiply. (These rules are annotated in the sketch below.)
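A small sketch (not from the slides; x, n, and cond are placeholders) annotating these rules in code:

    public class RulesDemo {
        static int rulesDemo(int n, boolean cond) {
            int x = 0;
            for (int i = 0; i < n; i++)          // O(n) loop
                x++;
            for (int i = 0; i < n; i++)          // O(n^2): n iterations times an O(n) body
                for (int j = 0; j < n; j++)
                    x++;
            // Additive rule: O(n) + O(n^2) so far = O(n^2), the larger term.
            if (cond) {
                x = 0;                           // O(1) branch
            } else {
                for (int i = 0; i < n; i++)      // O(n) branch
                    x++;
            }
            // If/else rule: cost of the test plus the larger branch = O(n).
            return x;                            // whole method: O(n^2)
        }
        public static void main(String[] args) {
            System.out.println(rulesDemo(10, true));
        }
    }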

Examples
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++)
            x++;

Examples
    for (int i = 0; i < m; i++)
        for (int j = 0; j < n; j++)
            x++;

Example
    for (i = 1; i < n; i++)
        for (j = i; j < n; j++)
            x++;

Example
    for (i = 1; i < n; i++)
        for (j = i; j < n; j++)
            x++;
About n²/2 iterations, so O(n²).

Arithmetic Progression
Example: S = 3 + 5 + 7 + 9 + … + (2n-1) + (2n+1)
Writing the same sum backwards: S = (2n+1) + (2n-1) + … + 9 + 7 + 5 + 3
Adding the two copies of S term by term gives n pairs, each summing to 2n+4:
2S = (2n+4) + (2n+4) + … + (2n+4) = n(2n+4)
S = n(n+2)

Example
    while (n > 1) {
        for (int j = 0; j < n; j++)
            x++;
        n /= 2;
    }
The inner loop runs n + n/2 + n/4 + … + 1 times – a geometric progression. What is the total?

Example (continued)
    while (n > 1) {
        for (int j = 0; j < n; j++)
            x++;
        n /= 2;
    }
n + n/2 + n/4 + … + 1 is a geometric progression; the total is less than 2n, so the loop is O(n).

Geometric Progression
Example: S = 2 + 4 + 8 + … + 2ⁿ
Multiply both sides by the base: 2S = 4 + 8 + 16 + … + 2ⁿ + 2ⁿ⁺¹
Subtracting the first equation from the second: S = 2ⁿ⁺¹ - 2

Recursive Examples
    void doit(int n) {
        if (n <= 1) return;
        for (int i = 0; i < n; i++)
            x = x + 1;
        doit(n/2);
        doit(n/2);
    }

Recursive Examples
    void doit(int n) {
        if (n <= 1) return;
        for (int i = 0; i < n; i++)
            x = x + 1;
        doit(n/2);
        doit(n/2);
    }
O(n log n)
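One way to see this bound, assuming the routine makes two half-size calls plus O(n) work and that n is a power of 2, is to expand the recurrence:

    T(n) = 2T(n/2) + cn
         = 4T(n/4) + 2cn
         = 8T(n/8) + 3cn
         = ...
         = 2^k T(n/2^k) + k*c*n        (with k = log2 n)
         = n*T(1) + c*n*log2 n  =  O(n log n)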

Recursive Examples
    void doit(int n) {
        if (n <= 1) return;
        for (int i = 0; i < n; i++)
            x = x + 1;
        doit(n/2);
    }

Recursive Examples
    void doit(int n) {
        if (n <= 1) return;
        for (int i = 0; i < n; i++)
            x = x + 1;
        doit(n/2);
    }
A single half-size call with O(n) work: the total is n + n/2 + n/4 + … ≈ 2n, so O(n).

Recursive Examples
    void doit(int n) {
        if (n == 1) return;
        x++;
        doit(n/2);
    }

Recursive Examples
    void doit(int n) {
        if (n == 1) return;
        x++;
        doit(n/2);
    }
Constant work per call and the problem size halves each time: about log₂ n calls, so O(log n).

Recursive Examples
    void doit(int n) {
        if (n <= 1) return;
        x++;
        doit(n/2);
        doit(n/2);
    }

Recursive Examples
    void doit(int n) {
        if (n <= 1) return;
        x++;
        doit(n/2);
        doit(n/2);
    }
Work done in every call is constant, but the number of calls doubles at each level: 1 + 2 + 4 + … ≈ 2n calls, so O(n).

What is the complexity?
    for (i = 0; i < n; i++)
        a[i] = 0;
    for (i = 0; i < n; i++)
        for (j = 0; j < n; j++)
            a[i] += a[j] + i + j;

What is the complexity?
    if (zeroOut)
        for (i = 0; i < n; i++)
            a[i] = 0;
    else
        for (i = 0; i < n; i++)
            for (j = 0; j < n; j++)
                a[i] += a[j] + i + j;

In big-O notation, constant factors and lower-order terms are absorbed into the constant c, so we do not write them.
Which of the following are properly simplified complexities?
  O(n² + n)   O(max(n, m))   O(n²/2)   o(n log n)

Other Bounds
  Big-Oh (O):     growth of T(n) is ≤ growth of F(n)  – less than or equal (upper bound)
  Big-Omega (Ω):  growth of T(n) is ≥ growth of F(n)  – greater than or equal (lower bound)
  Big-Theta (Θ):  growth of T(n) is = growth of F(n)  – equal to (tight bound)
  Little-oh (o):  growth of T(n) is < growth of F(n)  – T(n) is negligible compared to F(n)
Why don't we just use Big-Theta? Isn't a tight bound best?
Definition: a lower bound of a problem is the least time complexity required by any algorithm that can solve the problem (over all inputs). It is difficult to prove that no better algorithm can be found.

Announcements
We are in Chapter 2 of the text; do read it.
Check Canvas for quizzes – several are due soon. They can be done with others, and they can be repeated.
Make sure you can run the applets. You may need to adjust your browser's Java security settings.

Programming Thoughts
A student asked, "Why don't we care about factor-of-two improvements?"
Answer: WE DEFINITELY DO. But you don't need a class to tell you how to make those improvements.

Look at the style guidelines.
They make a huge difference when reading your code or someone else's.
You will write better code. You will write code faster. They will help you (and the tutors) debug better.

Short-Term Memory Quiz (30 seconds to look at list) How many can you recall? eggs drawing rock apple focus mission favor brain flag trial partner house life chair ice

Use Abstractions
Don't worry about everything at once; you are not as efficient (or as effective) when you try.
For example, ignore the inner workings of the queue when writing your "solve" routine.
For example, write the board class independently so you can think in higher-level concepts ("move up if you can and if you want to") rather than lower-level ones ("move up").

Even Better Yet
Idea: any prefix subsequence whose sum is negative cannot begin the maximum subsequence (you would be better off removing it).
    sum = 0; bestSum = 0;
    for (int beg = 0, end = 0; end < n; end++) {
        sum += a[end];
        if (sum > bestSum) {
            bestSum = sum; low = beg; high = end;
        } else if (sum < 0) {
            beg = end + 1; sum = 0;
        }
    }
What is the complexity now?
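A self-contained, runnable version of the scan above (the class name and test data are ours; sum, bestSum, low, and high follow the slide). One pass over the array means the answer to the question is O(n):

    public class MaxSubsequence {
        public static void main(String[] args) {
            int[] a = {4, -3, 5, -2, -1, 2, 6, -2};
            int sum = 0, bestSum = 0, low = 0, high = -1;
            for (int beg = 0, end = 0; end < a.length; end++) {
                sum += a[end];                       // extend the current window
                if (sum > bestSum) {                 // best window seen so far
                    bestSum = sum; low = beg; high = end;
                } else if (sum < 0) {                // a negative prefix can't start the answer
                    beg = end + 1; sum = 0;          // restart just past it
                }
            }
            System.out.println(bestSum + " from index " + low + " to " + high);
            // prints "11 from index 0 to 6"; one pass over the array, so O(n)
        }
    }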

Recursive Examples
    void doit(int n) {
        if (n <= 1) return;
        x++;
        doit(n - 1);
        doit(n - 1);
    }

Recursive Examples
    void doit(int n) {
        if (n <= 1) return;
        x++;
        doit(n - 1);
        doit(n - 1);
    }
The number of calls per level is 1 + 2 + 4 + 8 + … + 2ⁿ, so the running time is O(2ⁿ).

When recursive problems are regular in nature, we can use a formula approach (the Master Theorem).
Theorem: assume T(n) = a·T(n/b) + O(nᵏ) is the running time of the function.
  If a > bᵏ, the complexity is O(n^(log_b a)).
  If a = bᵏ, the complexity is O(nᵏ log n).
  If a < bᵏ, the complexity is O(nᵏ).
Here a is the number of recursive calls made at one level, b is the factor by which the size is divided between calls, and k is the exponent on the non-recursive work (roughly the number of nested for loops).

Recursive Examples
Identify a, b, k and give the complexity:
    void doit(int n) {
        if (n <= 1) return;
        for (int i = 0; i < n; i++)
            x = x + 1;
        doit(n/2);
        doit(n/2);
    }

Recursive Examples
Identify a, b, k and give the complexity:
    void doit(int n) {
        if (n <= 1) return;
        for (int i = 0; i < n; i++)
            x = x + 1;
        doit(n/2);
        doit(n/2);
    }
a = 2, b = 2, k = 1; a = bᵏ, so the complexity is O(n¹ log n) = O(n log n).

Recursive Examples
Identify a, b, k and give the complexity:
    void doit(int n) {
        if (n <= 1) return;
        for (int i = 0; i < n; i++)
            x = x + 1;
        doit(n/2);
    }

Recursive Examples
Identify a, b, k and give the complexity:
    void doit(int n) {
        if (n <= 1) return;
        for (int i = 0; i < n; i++)
            x = x + 1;
        doit(n/2);
    }
a = 1, b = 2, k = 1; a < bᵏ, so the complexity is O(n).

Recursive Examples
Identify a, b, k and give the complexity:
    void doit(int n) {
        if (n <= 1) return;
        x++;
        doit(n/2);
    }

Recursive Examples
Identify a, b, k and give the complexity:
    void doit(int n) {
        if (n <= 1) return;
        x++;
        doit(n/2);
    }
a = 1, b = 2, k = 0; a = bᵏ, so the complexity is O(n⁰ log n) = O(log n).

Recursive Examples
Identify a, b, k and give the complexity:
    void doit(int n) {
        if (n <= 1) return;
        x++;
        doit(n/2);
        doit(n/2);
    }

Recursive Examples
Identify a, b, k and give the complexity:
    void doit(int n) {
        if (n <= 1) return;
        x++;
        doit(n/2);
        doit(n/2);
    }
a = 2, b = 2, k = 0; a > bᵏ, so the complexity is O(n^(log_b a)) = O(n^(log₂ 2)) = O(n).

Study the table below, which compares various complexities. Note that even for small n (1000), time is measured in years for 2ⁿ. Complexity 2ⁿ is termed intractable.

  log n   n    n log n   n²      n³       2ⁿ
  1       2    2         4       8        4
  2       4    8         16      64       16
  3       8    24        64      512      256
  4       16   64        256     4,096    65,536
  5       32   160       1,024   32,768   4,294,967,296

Determining Complexity from Experimental Evidence
Data set 1:
  n    : 2   4   8   16  32
  T(n) : 10  17  32  66  130
Data set 2:
  n    : 2   4   8   16  32
  T(n) : 10  …   …   11  …

Determining Complexity from Experimental Evidence
Data set 1:
  n    : 2   4   8   16  32
  T(n) : 10  17  32  66  130
  T(n) grows in step with n: linear complexity, c ≈ 4, O(n).
Data set 2:
  n    : 2   4   8   16  32
  T(n) : 10  …   …   11  …
  T(n) barely changes as n grows: constant complexity, c = 12, O(1).

Determining Complexity from Experimental Evidence
Data set 3:
  n    : 2   4   8    16      32
  T(n) : 8   32  500  65,600  4,294,967,300
Data set 4:
  n    : 2   4   8   16  32
  T(n) : 10  …   15  20  25

Determining Complexity from Experimental Evidence
  n    : 2  4  8   16  32
  T(n) : 4  8  25  64  161

Another way to check whether a conjectured complexity is correct is to divide the experimental time by the candidate function and see which ratio settles down to a constant.

  n    Actual time   n³        time/n³   n²      time/n²   n log n   time/(n log n)
  4    75            64        1.17      16      4.7       8         9.4
  8    219           512       0.43      64      3.4       24        9.1
  16   1104          4096      0.27      256     4.3       64        17.2
  32   4398          32768     0.13      1024    4.3       160       27.5
  64   17632         262144    0.07      4096    4.3       384       45.9
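A small sketch of the same ratio check in code (the class name and formatting are ours; the times are the measured values from the table). The column that levels off to a constant, here time/n² at about 4.3, names the likely complexity:

    public class RatioCheck {
        public static void main(String[] args) {
            int[]    n = {4, 8, 16, 32, 64};
            double[] t = {75, 219, 1104, 4398, 17632};   // measured times from the table
            for (int i = 0; i < n.length; i++) {
                double nn = n[i];
                double logn = Math.log(nn) / Math.log(2);
                System.out.printf("n=%-3d  t/n^3=%.2f  t/n^2=%.1f  t/(n log n)=%.1f%n",
                        n[i], t[i] / (nn * nn * nn), t[i] / (nn * nn), t[i] / (nn * logn));
            }
        }
    }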

One problem that concerns us is: "Is n² always worse than n log n, no matter what the constants?"

  n        n log n     10 n log n   100 n log n   n²
  2        2           20           200           4
  4        8           80           800           16
  8        24          240          2,400         64
  16       64          640          6,400         256
  32       160         1,600       16,000         1,024
  64       384         3,840       38,400         4,096
  128      896         8,960       89,600         16,384
  256      2,048       20,480      204,800        65,536
  512      4,608       46,080      460,800        262,144
  1,024    10,240      102,400     1,024,000      1,048,576
  2,048    22,528      225,280     2,252,800      4,194,304
  4,096    49,152      491,520     4,915,200      16,777,216
  8,192    106,496     1,064,960   10,649,600     67,108,864
  16,384   229,376     2,293,760   22,937,600     268,435,456
  32,768   491,520     4,915,200   49,152,000     1,073,741,824
  65,536   1,048,576   10,485,760  104,857,600    4,294,967,296

Even with a constant factor of 100, n log n eventually wins: by n = 1,024 the 100·n log n column is already below n².