Introduction to Computer Science Theory



How complex is a program (or algorithm)?

Algorithm analysis is the study of:
- What algorithms can be used to solve common problems?
- How fast/slow are those algorithms?

For example, the searching problem:
- Binary search on an array of size m takes about log2(m) steps.
- Linear search on an array of size m takes about m steps.

Also, the sorting problem:
- BubbleSort on an array of size m takes about m^2 steps.
- MergeSort on an array of size m takes about m*log2(m) steps.

How complex is a problem?

Computational Complexity is the study of how hard a (computational) problem is. In other words: how fast/slow is the fastest algorithm that solves the problem?

For example, the searching problem:
- Searching an unsorted array: the best known algorithm is linear search, which takes about m steps.
- Searching a sorted array: the best known algorithm is binary search, which takes about log2(m) steps.

Also, the sorting problem:
- The best known algorithms are MergeSort, HeapSort, and others, which all take about m*log2(m) steps.

What do you mean, "about"?

Computer scientists study how fast algorithms are approximately, in the worst case. For example, for an array of size m:
- Algorithm 1 takes 2m steps in the worst case.
- Algorithm 2 takes 3m + 40 steps in the worst case.
Then we say Algorithms 1 and 2 are both about the same!

Tangent: Moore's Law

Manufacturers double the speed of computers roughly every 1.5 years, and this has held true since the first computers (~1950). So if you're trying to decide on an algorithm, and the constant factor for one seems high, wait 1.5 years and it will be cut in half.

Defining "approximately"

We say f(m) = O(g(m)) if there are constants a and M such that f(m) <= a * g(m) for all m >= M. This is called "Big-O" notation.
- e.g., 3m + 40 = O(m) (read: "Big-O of m" or "O of m")
- Likewise, 2m = O(m)
- So, 3m + 40 and 2m are basically the same
- 3m^2 + 7m + 2 is not O(m), but 3m^2 + 7m + 2 = O(m^2).
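To make the definition concrete, the bound can be checked empirically in Python. This is a sketch: the constants a = 4 and M = 40 are one valid choice of witnesses for 3m + 40 = O(m), not the only one, and checking finitely many m is an illustration rather than a proof.

```python
def is_bounded(f, g, a, M, upto=10_000):
    """Empirically check that f(m) <= a * g(m) for all M <= m <= upto."""
    return all(f(m) <= a * g(m) for m in range(M, upto + 1))

f = lambda m: 3 * m + 40   # f(m) = 3m + 40
g = lambda m: m            # g(m) = m

# With a = 4 and M = 40: 3m + 40 <= 4m whenever m >= 40,
# which witnesses 3m + 40 = O(m).
print(is_bounded(f, g, a=4, M=40))  # True
```

Note that a = 3 would fail: 3m + 40 <= 3m never holds, which is why the constant factor a is needed in the definition.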

Recall: Linear Search

# Input: Array D, integer key
# Output: first index of key in D, or -1 if not found
for i := 0 to end of D:
    if D[i] equals key:
        return i
return -1
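The pseudocode above translates almost line for line into Python (a sketch, not part of the original slides):

```python
def linear_search(D, key):
    """Return the first index of key in D, or -1 if not found."""
    for i in range(len(D)):
        if D[i] == key:
            return i
    return -1

print(linear_search([5, 9, 2, 6, 3, 4, 8], 3))  # 4
print(linear_search([5, 9, 2, 6, 3, 4, 8], 7))  # -1
```

In the worst case (key absent, or in the last slot) the loop inspects all m elements, which is where the O(m) bound comes from.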

Recall: Binary Search Algorithm

# Input: Sorted array D, integer key
# Output: first index of key in D, or -1 if not found
left = 0, right = index of last element
while left <= right:
    middle = index halfway between left, right
    if D[middle] matches key:
        return middle
    else if key comes before D[middle]:   // D is sorted
        right = middle - 1
    else:
        left = middle + 1
return -1
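Here is the same algorithm in Python (a sketch; note that with duplicate keys this version returns *an* index of the key, not necessarily the first):

```python
def binary_search(D, key):
    """Return an index of key in sorted D, or -1 if not found."""
    left, right = 0, len(D) - 1
    while left <= right:
        middle = (left + right) // 2
        if D[middle] == key:
            return middle
        elif key < D[middle]:      # D is sorted, so key must be to the left
            right = middle - 1
        else:                      # key must be to the right
            left = middle + 1
    return -1

print(binary_search([2, 3, 4, 5, 6, 8, 9], 8))  # 5
print(binary_search([2, 3, 4, 5, 6, 8, 9], 7))  # -1
```

Each iteration halves the remaining search range, so the loop runs at most about log2(m) + 1 times.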

How much faster is binary search?

Way, way faster (assuming the array is already sorted). But precisely how much?

Array size       Linear search might visit    Binary search might visit
2^4  = 16        16 elements                  4 + 1  = log2(16) + 1
2^8  = 256       256 elements                 8 + 1  = log2(256) + 1
2^12 = 4096      4096 elements                12 + 1 = log2(4096) + 1
2^n  = m         m elements                   n + 1  = log2(m) + 1

Recall: BubbleSort

# Input: an array of elements called D
# Output: D is sorted
performedSwap = true
while we performed at least one swap:
    performedSwap = false
    for i goes from 0 to the end of D, less 1:
        if D[i] > D[i+1]:
            swap D[i] and D[i+1]
            performedSwap = true
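A direct Python rendering of this pseudocode (a sketch; the in-place list is also returned for convenience):

```python
def bubble_sort(D):
    """Sort D in place by repeatedly swapping adjacent out-of-order pairs."""
    performed_swap = True
    while performed_swap:           # stop once a full pass makes no swaps
        performed_swap = False
        for i in range(len(D) - 1):
            if D[i] > D[i + 1]:
                D[i], D[i + 1] = D[i + 1], D[i]
                performed_swap = True
    return D

print(bubble_sort([5, 9, 2, 6, 3]))  # [2, 3, 5, 6, 9]
```

In the worst case (a reverse-sorted array) the outer loop runs about m times and each pass does about m comparisons, giving the ~m^2 behavior discussed below.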

Recall: The MergeSort Algorithm

# Input: array D; output: sorted D
mergeSize = 1
while mergeSize < length of D:
    i = 0
    while i < length of D:
        middle = i + mergeSize
        right = the smaller of i + 2*mergeSize, length of D
        merge(D, i, middle-1, middle, right-1)
        i = right
    mergeSize = mergeSize * 2

Recall: The Merge Algorithm

# Input: array D, with two sorted sub-arrays,
#   from left1 to right1 and from left2 to right2
# Output: array D, with one sorted sub-array
#   from left1 to right2
function merge(D, left1, right1, left2, right2):
    Temp = an array big enough to hold both subarrays
    i1 = left1, i2 = left2, iTemp = 0
    while i1 <= right1 and i2 <= right2:
        if D[i1] < D[i2]:
            Temp[iTemp] = D[i1]
            increment i1
        else:
            Temp[iTemp] = D[i2]
            increment i2
        increment iTemp
    copy any leftover elements of either subarray into Temp
    copy Temp back to the right positions of D
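The two slides above fit together as a bottom-up (iterative) MergeSort. Here is a Python sketch of both, mirroring the pseudocode; the guard `middle < right` handles the leftover run when the array length is not a power of two:

```python
def merge(D, left1, right1, left2, right2):
    """Merge the sorted runs D[left1..right1] and D[left2..right2] in place."""
    temp = []
    i1, i2 = left1, left2
    while i1 <= right1 and i2 <= right2:
        if D[i1] < D[i2]:
            temp.append(D[i1])
            i1 += 1
        else:
            temp.append(D[i2])
            i2 += 1
    temp.extend(D[i1:right1 + 1])   # leftover elements of the first run
    temp.extend(D[i2:right2 + 1])   # leftover elements of the second run
    D[left1:right2 + 1] = temp      # copy back to the right positions of D

def merge_sort(D):
    """Bottom-up MergeSort: merge runs of size 1, then 2, then 4, ..."""
    merge_size = 1
    while merge_size < len(D):
        i = 0
        while i < len(D):
            middle = i + merge_size
            right = min(i + 2 * merge_size, len(D))
            if middle < right:      # only merge if a second run exists
                merge(D, i, middle - 1, middle, right - 1)
            i = right
        merge_size *= 2
    return D

print(merge_sort([5, 9, 2, 6, 3, 4, 8, 1]))  # [1, 2, 3, 4, 5, 6, 8, 9]
```

The outer loop doubles `merge_size` about log2(m) times, and each full pass of merges touches all m elements, giving the ~m*log2(m) cost.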

How much faster is MergeSort?

Way, way faster (on big arrays). But precisely how much?

Array size       BubbleSort comparisons       MergeSort comparisons      Speedup
2^4  = 16        ~ 16^2   = 256               ~ 16 * 4    = 64           x 4
2^8  = 256       ~ 256^2  = 65,536            ~ 256 * 8   = 2,048        x 32
2^12 = 4096      ~ 4096^2 = 16,777,216        ~ 4096 * 12 = 49,152       x 341
2^n  = m         ~ m^2                        ~ m * log2(m)              x (m / log2(m))

Complexity Classes

The Big-O notation helps us group algorithms into classes with similar speed. For example, MergeSort and QuickSort both belong to the class with speed O(m*log2(m)).

Common classes of algorithms:

Speed            Example
O(log2(m))       Binary search
O(m)             Linear search
O(m*log2(m))     MergeSort, QuickSort (expected case)
O(m^2)           BubbleSort, QuickSort (worst case)
O(m^3)           Naïve matrix multiplication
O(2^m)           Finding the shortest route to m different cities

Exercise

Let's say you have implementations of linearSearch, binarySearch, and mergeSort with the following worst-case complexities:
- linearSearch: m
- binarySearch: log2(m)
- mergeSort: 2m*log2(m) + 5

What's faster (worst case) on an array with 1000 elements:
1. Running linearSearch once, or running mergeSort followed by binarySearch?
2. Running linearSearch 20 times, or running mergeSort followed by binarySearch 20 times?
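One way to attack the exercise is simply to plug m = 1000 into the given formulas. A sketch (note how close the twenty-search case is, so the exact value of log2(1000) matters):

```python
import math

m = 1000
linear = m                              # linearSearch: m steps
binary = math.log2(m)                   # binarySearch: log2(m) steps
msort  = 2 * m * math.log2(m) + 5       # mergeSort: 2m*log2(m) + 5 steps

# One search: linearSearch once vs. mergeSort then binarySearch once
print(linear, msort + binary)           # 1000 vs. about 19,946

# Twenty searches: 20 linearSearches vs. mergeSort then 20 binarySearches
print(20 * linear, msort + 20 * binary) # 20,000 vs. about 20,136
```

The lesson is that sorting first only pays off once the one-time sorting cost is amortized over enough searches.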

Remember: Algorithmic Thinking (aka Computational Thinking)

- Programming is a skill you will use in all computer science courses.
- Programming != Computer Science.
- Algorithmic Thinking is a collection of different ways that computer scientists think about problems.
- It's what you mostly learn about in computer science classes, once you're done learning core programming techniques.

What you should know

- When comparing algorithms, computer scientists analyze approximate worst-case performance.
- Big-O notation groups algorithms into complexity classes (e.g., O(m), O(m^2)).
- Remember the complexity classes for: linear search, binary search, BubbleSort, MergeSort.