CMSC 100 Efficiency of Algorithms Guest Lecturers: Clay Alberty and Kellie LaFlamme Professor Marie desJardins Tuesday, October 2, 2012 Some material adapted from instructor slides for Schneider & Gerstung



2 Overview
- What makes a good algorithm? Correctness, ease of understanding, elegance, efficiency
- Computational efficiency: the order of magnitude of running time
  - Polynomial: time increases (reasonably) slowly as problem size increases → tractable (solvable in reasonable time)
  - Exponential (or worse!): time increases explosively as problem size increases → intractable (can't be solved in practice for big problems)
- (We are also sometimes interested in memory or space efficiency)

3 Introduction
- There are many solutions to any given problem. How can we judge and compare algorithms?
- Analogy: purchasing a car. We weigh safety, ease of handling, style, fuel efficiency
- Evaluating an algorithm: correctness, ease of understanding, elegance, time/space efficiency

4 Attributes of Algorithms
- Attributes of interest: correctness, ease of understanding, elegance, and efficiency
- Correctness: Is the problem specified correctly? Does the algorithm produce the correct result?
- Example: pattern matching
  - Problem specification: "Given pattern p and text t, determine the location, if any, of pattern p occurring in text t"
  - Correctness: does the algorithm always work? If p is in t, will it say so? If the algorithm says p is in t, is it?

5 Attributes of Algorithms (continued)
- Ease of understanding, useful for: checking correctness, program maintenance
- Elegance: using a clever or non-obvious approach
  - Example: Gauss's summing of …
- Attributes may conflict: elegance often conflicts with ease of understanding
- Attributes may reinforce each other: ease of understanding supports correctness
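The Gauss anecdote illustrates elegance well: a loop and a closed-form formula compute the same sum, but the formula does it in a single step. A minimal sketch in Python (the function names are ours, for illustration):

```python
def sum_by_loop(n):
    """Obvious approach: add 1 + 2 + ... + n one term at a time (n steps)."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

def sum_by_formula(n):
    """Gauss's elegant approach: pair the terms to get n * (n + 1) / 2 (one step)."""
    return n * (n + 1) // 2

# Both return 5050 for n = 100, but the formula's cost doesn't grow with n.
```

The elegant version is also harder to see at a glance, which is exactly the conflict with ease of understanding mentioned above.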

6 Attributes of Algorithms (continued)
- Efficiency: an algorithm's use of time and space resources. We'll focus on computational efficiency (time)
- Timing an algorithm with a clock is not always useful; confounding factors include machine speed and size of input
- Benchmarking: timing an algorithm on standard data sets
  - Tests hardware, operating system, etc.
  - Tests real-world performance limits
- Analysis of algorithms: the study of the efficiency of algorithms
  - Order of magnitude Θ() or just O() ("big O"): the class of functions that describes how time increases as a function of problem size (more on this later)

Sequential Search Analysis

8 Measuring Efficiency: Sequential Search
- Searching: the task of finding a specific value in a list of values, or deciding that it is not there
- Solution #1: sequential search (from Ch. 2): "Given a target value and a random list of values, find the location of the target in the list, if it occurs, by checking each value in the list in turn"

9 Sequential Search Algorithm

get (NameList, PhoneList, Name)
  i = 1
  N = length(NameList)
  Found = FALSE
  while ( (not Found) and (i <= N) ) {
    if ( Name == NameList[i] ) {
      print (Name, "'s phone number is ", PhoneList[i])
      Found = TRUE
    }
    i = i + 1
  }
  if ( not Found ) {
    print (Name, "'s phone number not found!")
  }
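The pseudocode above translates almost directly into Python; a minimal sketch (returning the phone number instead of printing, which is our simplification):

```python
def sequential_search(name_list, phone_list, name):
    """Scan the list front to back; return the matching phone number or None."""
    for i in range(len(name_list)):      # up to N iterations
        if name_list[i] == name:         # the comparison: the central unit of work
            return phone_list[i]
    return None                          # target is not in the list

# Best case: target is first (1 comparison); worst case: absent (N comparisons)
```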

10 Measuring Efficiency: Sequential Search (continued)
- Central unit of work: the operation that occurs most frequently
- In sequential search: the comparison of the target Name to each name in the list (plus adding 1 to i)
- Typical iteration: two steps (one comparison, one addition)
- Given a large input list:
  - Best case: the smallest amount of work the algorithm must do
  - Worst case: the greatest amount of work the algorithm must do
  - Average case: depends on the likelihood of different scenarios occurring
- What are the best, worst, and average cases for sequential search?

11 Measuring Efficiency: Sequential Search (continued)
- Best case: target found with the first comparison (1 iteration)
- Worst case: target is the last value, or is never found (N iterations)
- Average case: if each value is equally likely to be searched for, the work varies from 1 to N iterations, on average N/2
- Most often we will consider the worst case:
  - The best case is too lucky; we can't count on it
  - The average case is much harder to compute for many problems (it is hard to know the distribution of possible inputs)

12 Measuring Efficiency: Order of Magnitude, Order N
- Sequential search's worst case grows linearly in the size of the problem: 2N steps (one comparison and one addition per loop)
- There are also some initialization steps, a possible print on the last iteration, and a test and possible print after the loop
- To simplify analysis, disregard these "negligible" steps (they don't happen as often), and ignore the coefficient in 2N; pay attention only to the dominant term (N)
- Order of magnitude O(N): the class of all linear functions (any algorithm that takes C1·N + C2 steps, for any constants C1 and C2)

13 Binary Search
- Solution #2: binary search
- Assume the list is sorted
- Split the list in half on each iteration
- On each iteration:
  - If we've run out of things to look at, quit
  - Is the centerpoint of the list the name we're looking for? If so, we're done!
  - If not, check whether the name is alphabetically before or after the centerpoint of the list
    - If it's after, take the second half of the list and continue looping
    - If it's before, take the first half of the list and continue looping

14 Binary Search: Algorithm

get (NameList, PhoneList, Name)
  N = length(NameList)
  upper = N
  lower = 1
  Found = FALSE
  while ( (not Found) and (lower <= upper) ) {
    mid = (lower + upper) / 2
    if ( Name == NameList[mid] ) {
      print (Name, "'s phone number is ", PhoneList[mid])
      Found = TRUE
    }
    else if ( Name < NameList[mid] ) {
      upper = mid - 1    // keep looking BEFORE center
    }
    else {
      lower = mid + 1    // keep looking AFTER center
    }
  }
  if ( not Found ) {
    print (Name, "'s phone number not found!")
  }
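The same idea as runnable Python (assuming, as the pseudocode does, that name_list is sorted alphabetically; returning the phone number is our simplification):

```python
def binary_search(name_list, phone_list, name):
    """Repeatedly halve a sorted list; return the matching phone number or None."""
    lower, upper = 0, len(name_list) - 1
    while lower <= upper:                # stop when we've run out of things to look at
        mid = (lower + upper) // 2
        if name_list[mid] == name:
            return phone_list[mid]       # found at the centerpoint
        elif name < name_list[mid]:
            upper = mid - 1              # keep looking BEFORE center
        else:
            lower = mid + 1              # keep looking AFTER center
    return None                          # target is not in the list
```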

15 Measuring Efficiency: Binary Search
- Best case: target found with the first comparison (1 iteration)
- Worst case: target is the last value examined, or is never found (split the list in half repeatedly until only one item remains to examine)
  - For a list of length 2, test twice
  - For a list of length 4, test three times (split twice)
  - For a list of length 8, test only four times! (split three times)
  - For a list of length 2^k, how many times do we test?
  - For a list of length N, how many times do we test?
- Average case: harder to analyze than you would think... and surprisingly, the average case is only about one step better than the worst case
  - Why? Hint: try drawing a tree of all the cases (left side, right side after each time the list is split in half)

16 Orders of Magnitude: log N
- Binary search has order of magnitude O(log2 N): it grows very slowly
- Here log2 means log base 2, but other bases behave similarly (they differ only by a constant factor)
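The halving pattern from the previous slide can be checked with a short simulation (our own sketch): each pass of a worst-case binary search examines one centerpoint and then keeps, at most, half of the remaining items.

```python
def worst_case_tests(n):
    """How many elements binary search examines, in the worst case,
    before a list of size n is exhausted (target absent or found last)."""
    tests = 0
    while n > 0:
        tests += 1     # examine the centerpoint
        n = n // 2     # at worst, half the list remains
    return tests

# Matches the slide: length 2 -> 2 tests, length 4 -> 3, length 8 -> 4,
# and in general length 2^k -> k + 1 tests, i.e., about log2(N) for length N.
```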

Sorting Algorithms and Analysis

18 Measuring Efficiency: Selection Sort
- Sorting: the task of putting a list of values into numeric or alphabetical order
- Selection sort:
  - Pass repeatedly over the unsorted portion of the list
  - On each pass, select the largest remaining value
  - Move that value immediately after the other unsorted values
  - Accumulate the largest values, in reverse order, at the end of the list
- After each iteration, the number of sorted values grows by one and the number of unsorted values shrinks by one

19 How long does the "find largest" step take? (Hint: do we use sequential search or binary search?)

20 Measuring Efficiency: Selection Sort (continued)
- Central unit of work: hidden in the "find largest" step
- The work done to find the largest value shrinks along with the unsorted portion:
  (N-1) + (N-2) + ... + 1 = N(N-1)/2
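A sketch that makes the N(N-1)/2 count concrete: a selection sort that tallies the comparisons hidden in the "find largest" step (the counter is ours, added for illustration):

```python
def selection_sort(values):
    """Sort the list in place; return the number of comparisons performed."""
    comparisons = 0
    n = len(values)
    for unsorted_end in range(n - 1, 0, -1):   # boundary of the unsorted portion
        largest = 0
        for i in range(1, unsorted_end + 1):   # find largest among unsorted values
            comparisons += 1
            if values[i] > values[largest]:
                largest = i
        # move the largest value to the end of the unsorted portion
        values[largest], values[unsorted_end] = values[unsorted_end], values[largest]
    return comparisons

# For n = 5 the passes cost 4 + 3 + 2 + 1 = 10 comparisons = 5 * 4 / 2.
```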

21 Measuring Efficiency: Order of Magnitude, Order N^2
- Selection sort takes N(N-1)/2 steps = (1/2)N^2 - N/2
- Order of magnitude: ignore all but the dominant (highest-order) term; ignore the coefficient
- O(N^2): the set of functions whose growth is on the order of N^2

22 Measuring Efficiency: Order of Magnitude, Order N^2 (continued)
- Eventually, every function of order N^2 takes on greater values than any function of order N
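"Eventually" is the key word: an N^2 algorithm with a tiny constant still loses, past some input size, to a linear algorithm with a huge one. A small sketch (the coefficients below are made up for illustration):

```python
def crossover(quad_coeff, lin_coeff):
    """Smallest N at which quad_coeff * N^2 exceeds lin_coeff * N."""
    n = 1
    while quad_coeff * n * n <= lin_coeff * n:
        n += 1
    return n

# crossover(1, 100) == 101: N^2 overtakes 100 * N just past N = 100.
```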


24 Quicksort
- Quicksort is a faster sorting algorithm
- Divide-and-conquer approach:
  - Pick a point in the list (the pivot)
  - Toss everything smaller than the pivot to the left and everything larger to the right
  - Separately sort those two sublists (using quicksort! This is an example of a recursive algorithm, which we'll talk about more later in the semester...)
- Quicksort is O(N log N) on average (but can be O(N^2) in the worst case...)
- O(N log N) is slower than linear but faster than quadratic
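The steps above can be sketched in a few lines of Python. This list-building version is the easiest to read (production quicksorts partition in place, which is part of what makes them fast in practice):

```python
def quicksort(values):
    """Divide and conquer: partition around a pivot, then sort each side."""
    if len(values) <= 1:
        return values                                 # nothing left to sort
    pivot = values[len(values) // 2]                  # pick a point in the list
    smaller = [v for v in values if v < pivot]        # tossed to the left
    equal = [v for v in values if v == pivot]
    larger = [v for v in values if v > pivot]         # tossed to the right
    return quicksort(smaller) + equal + quicksort(larger)  # recursive calls
```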

25 Getting Out of Control
- Polynomially bounded: an algorithm that does work on the order of O(N^k) or less
  - linear, log, quadratic, ..., degree-k polynomial
- Most common problems are polynomially bounded (in "P")
- Hamiltonian circuit (closely related to the "traveling salesman problem"): no known polynomial-time solution; it is in "NP"
  - Given a graph, find a path that passes through each vertex exactly once and returns to its starting point

26 The number of possible circuits grows exponentially in the number of cities → it takes a long time to find the best one!
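The blow-up is easy to quantify. In a complete graph (every city connected to every other), the number of distinct circuits from a fixed starting city is (n-1)!/2, which outgrows any polynomial. A quick sketch:

```python
import math

def num_circuits(n_cities):
    """Distinct tours of n cities in a complete graph: (n - 1)! / 2.
    Fix the starting city; divide by 2 since each tour is counted in both directions."""
    return math.factorial(n_cities - 1) // 2

# num_circuits(5) == 12, but num_circuits(20) is already about 6 * 10^16,
# far too many to check one by one.
```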


28 Summary
- We must evaluate the quality of algorithms and compare competing algorithms to each other
- Attributes: correctness, efficiency, elegance, and ease of understanding
- Compare competing algorithms for time and space efficiency (time/space tradeoffs are common)
- Orders of magnitude capture work as a function of input size: O(log N), O(N), O(N^2), O(2^N)
- Problems with only exponential algorithms are intractable

29 Summary of Complexity Classes