1 BIG-O --- Algorithms
- Purpose: Be able to evaluate the relative efficiency of various algorithms that are used to process data
- We need to be able to evaluate all of the major sort and search algorithms as well as the various implementations of the Abstract Data Types

2 Resources: Java Methods: Data Structures, Chapter 8, p. 191

3
- Intro: As we began to discuss in the lecture on Algorithms, we need to be able to evaluate processes and algorithms based on uniform criteria.
- Big-O provides us with a method for such evaluations.

4 BIG-O --- Algorithms
- Searching (locating an element with a target value in a list of values) and Sorting are two tasks used to illustrate the concept of an algorithm
- Algorithms typically use iteration or recursion; this property differentiates them from straightforward code

5
- Algorithms are also generic in nature. The same algorithm applies to a whole set of initial states and produces corresponding final states for each of them.
- A sorting algorithm, for example, must apply to any list regardless of the values of its elements

6
- Furthermore, an algorithm must be independent of the SIZE of the task (n)

7
- Analyze the time efficiency and space requirements of algorithms in an abstract way, without tying the analysis to specific data types and other implementation details

8
- Criteria to evaluate an algorithm:
  - Space required
  - Amount of time
  - Complexity

9
- SPACE:
  - Number and size of simple variables
  - Number and total size of components of compound variables
  - Is space dependent on the size of the input?

10
- TIME:
  - Not necessarily measured in real clock time
  - Look for a characteristic operation (e.g., a comparison)
  - Express the time required in terms of this characteristic operation

11
- COMPLEXITY:
  - Has little to do with how complex an algorithm “looks”
  - A function of the size (number) of input values
  - Average time: based on the probability of the occurrence of inputs
  - Worst time: based on the most unfavorable input

12 We will focus on time efficiency

13
- Efficiency of TIME:
  - Number of comparisons (e.g., if a > b)
  - Number of assignments (e.g., a = c)
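
For illustration (not part of the original slides), here is a hypothetical Java snippet that counts these characteristic operations while scanning an array for its largest value; all names are made up for the example:

public class OperationCounter {
    public static void main(String[] args) {
        int[] data = {7, 3, 9, 4, 9, 1};
        int comparisons = 0;   // counts "if (a > b)"-style operations
        int assignments = 1;   // the initial assignment to max below

        int max = data[0];
        for (int i = 1; i < data.length; i++) {
            comparisons++;                 // one comparison per pass through the loop
            if (data[i] > max) {
                max = data[i];
                assignments++;             // an assignment only when a new max is found
            }
        }
        // For n elements there are always n - 1 comparisons;
        // the number of assignments depends on the order of the input.
        System.out.println("max=" + max + ", comparisons=" + comparisons + ", assignments=" + assignments);
    }
}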

14
- We will also disregard the specifics of the hardware and the programming language, so:
  - we will not be measuring time efficiency in real time
  - we will not measure in terms of the number of required program instructions/statements

15
- We will discuss performance in terms of abstract “steps” necessary to complete a task, and we assume that each step takes the same amount of time
- We can then compare different algorithms that accomplish the same task

16
- Because we can predict the long-term behavior of functions without knowing the exact constants used in describing them, we can ignore constant factors in the analysis of execution time

17 BIG-O --- Big-O Notation
- We can evaluate algorithms in terms of Best Case, Worst Case and Average Case
- We assume that the size of the task, n, is a large number
- Analyze loops, especially nested loops (a step-counting sketch follows below)
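
As a rough illustration (hypothetical code, not from the slides), counting how many times the loop bodies execute shows why a single loop over n items is linear while a nested loop over the same items is quadratic:

public class LoopSteps {
    public static void main(String[] args) {
        int n = 20;

        long singleLoopSteps = 0;
        for (int i = 0; i < n; i++) {
            singleLoopSteps++;            // body runs n times: linear
        }

        long nestedLoopSteps = 0;
        for (int i = 0; i < n; i++) {
            for (int j = 0; j < n; j++) {
                nestedLoopSteps++;        // body runs n * n times: quadratic
            }
        }

        System.out.println(singleLoopSteps + " vs. " + nestedLoopSteps);   // prints 20 vs. 400
    }
}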

18
- Sequential Search algorithms grow linearly with the size (number) of the elements
- We match the target value against each array value until a match is found or the entire array has been scanned

19
- Sequential Search
- Worst Case: the target value is located in the last element (n comparisons)
- Best Case: the target value is found on the first attempt
- Average Case: the target value is found in the middle of the array (about n/2 comparisons)
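
A minimal sequential search sketch in Java (the method name and array type are just for illustration):

public static int sequentialSearch(int[] a, int target) {
    for (int i = 0; i < a.length; i++) {
        if (a[i] == target) {
            return i;          // best case: found at index 0 after one comparison
        }
    }
    return -1;                 // worst case: all n elements compared, no match
}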

20
- Binary Search algorithms (applied to the same task as the sequential search) grow logarithmically with the size (number) of the elements
- The elements must be ordered
- We compare the target value against the middle element of the array and proceed left (if smaller) or right (if larger) until a match is found

21
- Binary Search: for example, where n = 7, we try a[3] first, then proceed left or right

22
- Binary Search
- Worst Case and Average Case: we locate the target element in about log2 n comparisons (3 when n = 7); the average case is only about one comparison less than the worst case
- Best Case: the target value is found on the first attempt
- The execution time of a Binary Search is approximately proportional to the log of n
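
A minimal iterative binary search sketch in Java, assuming an int array sorted in ascending order (names are illustrative):

public static int binarySearch(int[] a, int target) {
    int low = 0;
    int high = a.length - 1;
    while (low <= high) {
        int mid = low + (high - low) / 2;   // middle element, e.g. a[3] when n = 7
        if (a[mid] == target) {
            return mid;                      // best case: found on the first probe
        } else if (target < a[mid]) {
            high = mid - 1;                  // proceed left
        } else {
            low = mid + 1;                   // proceed right
        }
    }
    return -1;                               // about log2(n) probes in the worst case
}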

23
[Graph: curve (A) linear growth vs. curve (B) logarithmic growth]
- Logarithmic Growth is SLOWER than Linear Growth
- Asymptotically, a Binary Search is FASTER than a Sequential Search, as Linear Time eventually surpasses Logarithmic Time

24
- Big-O represents the Order of Growth for an Algorithm

25
- Growth Rate Functions: reference functions used to compare the rates of growth of algorithms

26 BIG-O --- Big-O Notation: Growth Rate Functions
- O(1) Constant Time --- the time required to process one set of steps
- The algorithm requires the same fixed number of steps regardless of the size of the task:
  - Push and Pop stack operations
  - Insert and Remove queue operations
  - Finding the median value in a sorted array
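
A tiny illustrative sketch (not from the slides) of the last example: the median of a sorted array is reached with one index calculation, no matter how large the array is.

public static int medianOfSorted(int[] sorted) {
    // one step whether n is 10 or 10 million
    // (for an even n this returns the upper of the two middle values)
    return sorted[sorted.length / 2];
}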

27
- O(n) Linear Time --- the time grows in direct proportion to n (the number of tasks)
- The algorithm requires a number of steps proportional to the size of the task
- For n = 20: 20 steps

28
- O(n) Examples:
  - Traversal of a list
  - Finding the min or max element in a list
  - Finding the min or max element with a sequential search of unsorted elements
  - Traversing a tree with n nodes
  - Calculating n-factorial, iteratively (see the sketch below)
  - Calculating the nth Fibonacci number, iteratively
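
A linear-time sketch (illustrative only) of the iterative factorial example: one multiplication per value from 2 to n, so the work grows in direct proportion to n.

public static long factorial(int n) {
    long result = 1;
    for (int i = 2; i <= n; i++) {
        result *= i;           // one multiplication per pass: about n steps
    }
    return result;             // note: overflows long for n > 20, acceptable for a sketch
}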

29
- O(n^2) Quadratic Time
- The number of operations is proportional to the size of the task SQUARED
- For n = 20: 400 steps

30
- O(n^2) Examples:
  - Simplistic sorting algorithms, such as a Selection Sort of n elements (see the sketch below)
  - Comparing two 2-dimensional arrays (matrices) of size n by n
  - Finding duplicates in an unsorted list of n elements (implemented with 2 nested loops)
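
A minimal selection sort sketch in Java: the two nested loops over n elements perform roughly n^2 / 2 comparisons, which is why it is quadratic (the method name is illustrative):

public static void selectionSort(int[] a) {
    for (int i = 0; i < a.length - 1; i++) {
        int minIndex = i;
        for (int j = i + 1; j < a.length; j++) {
            if (a[j] < a[minIndex]) {   // this comparison runs about n^2 / 2 times in total
                minIndex = j;
            }
        }
        int temp = a[i];                // swap the smallest remaining value into position i
        a[i] = a[minIndex];
        a[minIndex] = temp;
    }
}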

31
- O(log n) Logarithmic Time --- the log of n is significantly lower than n
- For n = 20: log2 20 ≈ 4.3 steps
- Used in many “divide and conquer” algorithms, like binary search, and is the basis for using binary search trees and heaps

32
- O(log n)
- For example, a (balanced) binary search tree of one million elements would take at most about 20 steps to locate a target, since 2^20 ≈ 1,000,000
  - Binary Search in a sorted list of n elements
  - Insert and Find operations for a Binary Search Tree with n nodes
  - Insert and Find operations for a Heap with n nodes

33
- O(n log n) “n log n” Time
- For n = 20: 20 × log2 20 ≈ 86.4 steps
- More advanced sorting algorithms like Quicksort and Mergesort (which will be discussed in detail in the next chapter); a mergesort sketch appears below
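
As a preview (a minimal sketch, not the version from the next chapter), a recursive mergesort: the array is halved about log2 n times and each level does about n work merging, giving O(n log n) overall. Method names are illustrative; call mergeSort(a, 0, a.length - 1) to sort the whole array.

public static void mergeSort(int[] a, int from, int to) {   // sorts a[from..to]
    if (from >= to) {
        return;                        // zero or one element: already sorted
    }
    int mid = (from + to) / 2;
    mergeSort(a, from, mid);           // divide: sort the left half
    mergeSort(a, mid + 1, to);         // divide: sort the right half
    merge(a, from, mid, to);           // conquer: merge the two sorted halves
}

private static void merge(int[] a, int from, int mid, int to) {
    int[] temp = new int[to - from + 1];
    int i = from, j = mid + 1, k = 0;
    while (i <= mid && j <= to) {
        temp[k++] = (a[i] <= a[j]) ? a[i++] : a[j++];   // take the smaller front element
    }
    while (i <= mid) temp[k++] = a[i++];                // copy leftovers from the left half
    while (j <= to) temp[k++] = a[j++];                 // copy leftovers from the right half
    System.arraycopy(temp, 0, a, from, temp.length);    // copy the merged run back
}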

34
- O(a^n) (a > 1) Exponential Time
- For n = 20: a^20 is already a very large number (with a = 2, 2^20 = 1,048,576)
  - Recursive Fibonacci implementation (a > 3/2), see the sketch below
  - Towers of Hanoi (a = 2)
  - Generating all permutations of n symbols
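
An exponential-time sketch (illustrative): the naive recursive Fibonacci makes two recursive calls per level, so the number of calls grows faster than (3/2)^n, matching the a > 3/2 noted above.

public static long fib(int n) {
    if (n <= 1) {
        return n;                      // base cases: fib(0) = 0, fib(1) = 1
    }
    return fib(n - 1) + fib(n - 2);    // two recursive calls per level: exponential blow-up
}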

35
- The best time among the preceding growth rate functions is constant time, O(1)
- The worst time among the preceding growth rate functions is exponential time, O(a^n), which quickly overwhelms even the fastest computers, even for a relatively small n

36
- Polynomial Growth: linear, quadratic, cubic… The highest power of n dominates the polynomial (see illustration, Lambert SDT p. 42, top)
- Polynomial growth is considered manageable as compared to exponential growth

37
[Graph: running time t versus n for Constant O(1), Log n, Linear, N log n, Quadratic, and Exponential growth]

38
- Log n has a slower asymptotic growth rate than linear growth: a thousand-fold increase in the size of the task, n, results in a fixed, moderate increase in the number of operations required (about 10 more, since 2^10 ≈ 1,000).

39
- For a given ALGORITHM, you can see how it falls on the following grid to determine its “Order of Growth” or time efficiency.
- Use this grid as a “Rule of Thumb” when evaluating the BIG-O of an algorithm.

40
- Let this grid help narrow down possible answers, but make sure you “memorize” the other charts and use them when attacking an order-of-growth problem
- The BIG-O Analysis handout has an example where the “rule of thumb” will result in an incorrect assumption

41 BIG-O --- Big-O Notation: Rule of Thumb

ALGORITHM                                | SINGLE LOOP           | NESTED LOOP(S)
-----------------------------------------|-----------------------|-------------------------------
Straightforward (sequential) processing  | Linear: O(n)          | Quadratic: O(n^2), O(n^3), ...
Divide-and-conquer processing            | Logarithmic: O(log n) | n log n: O(n log n)

42 Sample Test Times; N = 50,000

FUNCTION                | Running Time  | Typical algorithms
------------------------|---------------|--------------------------------------------------
Log N (Log Time)        | 15.6          | Binary Search
N (Linear Time)         | 50,000        | Linked List or Tree Traversal, Sequential Search
N Log N                 | 780,482       | Quicksort, Mergesort
N^2 (Quadratic Time)    | 2.5 * 10^9    | Selection Sort, matrices, 2 nested loops
a^N (Exponential)       | 3.8 * 10^21   | recursion, Fibonacci, permutations

43
- REVIEW 3 EXAMPLES IN THE HANDOUT: Lambert p., Examples 1.7, 1.8 & 1.9

44
- PROJECTS:
  - BIG-O Exercises 1 through 5
  - Workbook problems 12 through 20
  - Multiple Choice Problems

45 TEST IS THE DAY AFTER THE LABS ARE DUE