CS50 SECTION: WEEK 3 Kenny Yu

Announcements  Watch Problem Set 3’s walkthrough online if you are having trouble.  Problem Set 1’s feedback has been returned  Expect Problem Set 2’s feedback and scores by Friday, and future problem sets’ feedback on Fridays thereafter.  Take advantage of resources online, like the scribe notes!  Example (Week 3, Monday):  All my resources from section will be posted online here:   Please answer these weekly polls to help me improve section!   My office hours: Tuesdays 9pm-12am, Leverett Dining Hall

Agenda  Recursion  Asymptotic Notation  Search  Linear  Binary  Sort  Insertion  Selection  Bubble  Merge  GDB

What is recursion?

 Recursion – a method of defining functions in which the function being defined is applied within its own definition  A recursive function is a function that calls itself

Components of a recursive function  Recursive Call – the part of the function that calls the same function again.  Base Case – the part of the function responsible for halting the recursion; this prevents the function from recursing forever

Factorial  Factorial Definition: n! = 1 n * (n-1)! n == 1 otherwise

Factorial

int factorial(int num)
{
    // base case!
    if (num <= 1)
        return 1;
    // recursive call!
    else
        return num * factorial(num - 1);
}

Factorial calls itself with a smaller input.

Recursive vs. Iterative

RECURSIVE WAY:

int factorial(int num)
{
    if (num <= 1)
        return 1;
    else
        return num * factorial(num - 1);
}

ITERATIVE WAY:

int factorial(int num)
{
    int product = 1;
    for (int i = num; i > 0; i--)
        product *= i;
    return product;
}

Call Stack Revisited  Each function call pushes a stack frame  So recursive functions repeatedly push on stack frames  Functions higher on the stack must return before functions lower on the stack can return (call stack from bottom to top: main(), func1(), func2(), func3())

Animation See Animations.ppt Slides 2-3

Recursion vs. Iterative  When should you use recursion?  Sometimes, it may be really hard to do something iteratively (example: descending a binary search tree)  It simplifies your code  When you are already given the recursive definition of a function mathematically  When should you not?  Can potentially lead to a stack overflow (running out of memory on the stack)  But we can get around this by using tail recursion: no extra stack frames are made (learn more about this in CS 51!)
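
As an illustration of that last point, here is a rough sketch (not from the slides) of a tail-recursive factorial in C. The helper factorial_tail and its accumulator parameter acc are names introduced just for this example; with optimization enabled, a compiler can reuse the same stack frame for the tail call instead of pushing a new one.

// Hypothetical sketch: tail-recursive factorial using an accumulator.
// factorial_tail and acc are illustrative names, not from the slides.
int factorial_tail(int num, int acc)
{
    // base case: the accumulator already holds the final product
    if (num <= 1)
        return acc;

    // tail call: nothing is left to do after it returns,
    // so the current stack frame can be reused
    return factorial_tail(num - 1, acc * num);
}

int factorial(int num)
{
    return factorial_tail(num, 1);
}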

Asymptotic Notation  A way to evaluate efficiency  Every program causes machine instructions to run  Asymptotic notation tells us how many machine instructions have to be run (and therefore how long a program will run) based on the size of the input to the program

Asymptotic Notation  Big Oh notation  O(n) – Worst Case: upper bound on the runtime  Ω (n) – Best Case: lower bound on the runtime  Θ (n) – Average Case: usual runtime  We usually only care about O(n): worst case

Asymptotic Notation O(1) – Constant time O(log n) – Logarithmic time (log base two) O(n) – Linear time O(n log n) - Linear * Logarithmic O(n^2) – Quadratic O(n^p) – Polynomial O(2^n) – Exponential O(n!) - Factorial

Big O In general: (A > B means A is faster than B) Constant > Logarithmic > Linear > Linear * Logarithmic > Quadratic > Polynomial > Exponential > Factorial

Runtime (x is size of input, y is time)

An example  What is the big O for this function with respect to the length of the array?

int sum(int array[], int n)
{
    int current_sum = 0;
    for (int i = 0; i < n; i++)
        current_sum += array[i];
    return current_sum;
}

An example  What is the big O for this function with respect to the length of the array?

int sum(int array[], int n)
{
    int current_sum = 0;
    for (int i = 0; i < n; i++)
        current_sum += array[i];
    return current_sum;
}

Linear time ( O(n) )! We execute n iterations of the loop.

An example  What is the big O for this function with respect to the length of the array?

#include <stdio.h>

void print_pairs(int array[], int n)
{
    for (int i = 0; i < n; i++)
    {
        for (int j = 0; j < n; j++)
        {
            printf("(%d,%d)\n", array[i], array[j]);
        }
    }
}

An example  What is the big O for this function with respect to the length of the array?

#include <stdio.h>

void print_pairs(int array[], int n)
{
    for (int i = 0; i < n; i++)
    {
        for (int j = 0; j < n; j++)
        {
            printf("(%d,%d)\n", array[i], array[j]);
        }
    }
}

Quadratic time ( O(n^2) )! We execute n^2 iterations of the loop.

An example  What is the big O for this function with respect to the length of the array?

#include <stdio.h>

void print_stuff(int array[], int n)
{
    for (int i = 0; i < 10; i++)
    {
        for (int j = 0; j < 10; j++)
        {
            printf("(%d,%d)\n", i, j);
        }
    }
}

An example  What is the big O for this function with respect to the length of the array?

#include <stdio.h>

void print_stuff(int array[], int n)
{
    for (int i = 0; i < 10; i++)
    {
        for (int j = 0; j < 10; j++)
        {
            printf("(%d,%d)\n", i, j);
        }
    }
}

Constant time ( O(1) )! We execute 100 iterations of the loop, independent of n.

General Heuristics  A single for or while loop usually indicates linear time O(n)  Two nested loops usually indicates quadratic time O(n^2)  Dividing in half without merging the results of both halves is usually O(log n)  Dividing in half, and then merging the results of the two halves is usually O(n log n)
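
As a small, hypothetical sketch of the third heuristic (not from the slides): a loop that halves its input on every iteration runs in O(log n) time, because n can only be halved about log2(n) times before it reaches 1.

// Illustrative sketch: count how many times n can be halved before reaching 1.
// The loop body runs roughly log2(n) times, so the function is O(log n).
int count_halvings(int n)
{
    int count = 0;
    while (n > 1)
    {
        n /= 2;
        count++;
    }
    return count;
}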

Linear Search  We iterate through the array from beginning to end, checking whether each element is the element we are looking for  [ 1, 2, 3, 9, 10, 15, 19, 22, 56, 78, 99, 100 ]  What is the big O, with respect to the length of the list?

Linear Search  We iterate through the array from beginning to end, checking whether each element is the element we are looking for  What is the big O, with respect to the length of the list?  O(n): worst case, the element we are looking for is at the end of the list  Ω (1): best case, the element we are looking for is at the beginning of the list
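
A minimal linear search sketch in C (the function name and the convention of returning -1 when the target is missing are assumptions for illustration, not from the slides):

// Hypothetical sketch of linear search: check every element in order.
// Returns the index of target, or -1 if it is not in the array.
int linear_search(int array[], int n, int target)
{
    for (int i = 0; i < n; i++)
    {
        if (array[i] == target)
            return i;   // found it immediately in the best case: Ω(1)
    }
    return -1;          // checked all n elements in the worst case: O(n)
}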

Binary Search: Divide and Conquer  Like searching through a phonebook  We check the middle element; if it is not the element we are looking for, check either the right half or the left half, but not both  Is 78 in our list?  [ 1, 2, 3, 9, 10, 15, 19, 22, 56, 78, 99, 100 ]

Binary Search: Divide and Conquer  Like searching through a phonebook  We check the middle element; if it is not the element we are looking for, check either the right half or the left half, but not both  What is the big O, with respect to the length of the list?

Binary Search: Divide and Conquer  Like searching through a phonebook  We check the middle element; if it is not the element we are looking for, check either the right half or the left half, but not both  What is the big O, with respect to the length of the list?  O(log n): worst case, we divide in half every time  Ω (1): best case, the element we are looking for is the first one we check  NOTE: The array must be sorted before you do a binary search!!
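
A minimal iterative binary search sketch in C, assuming the array is already sorted in ascending order (the function name and the -1 "not found" convention are illustrative). Each iteration discards half of the remaining range, which is where the O(log n) bound comes from.

// Hypothetical sketch of iterative binary search on a sorted array.
// Returns the index of target, or -1 if it is not present.
int binary_search(int array[], int n, int target)
{
    int low = 0;
    int high = n - 1;
    while (low <= high)
    {
        int mid = low + (high - low) / 2;   // middle of the current range
        if (array[mid] == target)
            return mid;
        else if (array[mid] < target)
            low = mid + 1;                  // keep only the right half
        else
            high = mid - 1;                 // keep only the left half
    }
    return -1;
}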

Sorts How can we efficiently place things in order?

Bubble Sort   Made by my former CS50 TF!  The larger elements “bubble” up to the end of the array  O(n^2)  Ω (n) – if you keep track of the number of swaps and stop once a full pass makes none  Move through the array, left to right  If the current element is greater than the element to its right, swap them
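
A sketch of bubble sort in C under the assumptions above (ascending order, tracking whether a pass made any swaps); stopping after a pass with zero swaps is what gives the Ω(n) best case on an already-sorted array.

#include <stdbool.h>

// Hypothetical sketch of bubble sort (ascending). Larger values "bubble"
// toward the end of the array on each pass.
void bubble_sort(int array[], int n)
{
    bool swapped = true;
    while (swapped)
    {
        swapped = false;
        for (int i = 0; i < n - 1; i++)
        {
            if (array[i] > array[i + 1])    // out of order: swap the pair
            {
                int temp = array[i];
                array[i] = array[i + 1];
                array[i + 1] = temp;
                swapped = true;
            }
        }
        // If a full pass makes no swaps, the array is sorted and we stop.
    }
}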

Insertion Sort   It’s how you sort a hand of cards  Take the next card from the unsorted part  Insert it into the correct position in the currently sorted portion of the hand  O(n^2)
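
A sketch of insertion sort in C (ascending order assumed): each new element is shifted left into its place within the already-sorted prefix.

// Hypothetical sketch of insertion sort (ascending): take the next unsorted
// element and shift it left into its correct spot in the sorted prefix.
void insertion_sort(int array[], int n)
{
    for (int i = 1; i < n; i++)
    {
        int value = array[i];
        int j = i - 1;
        while (j >= 0 && array[j] > value)
        {
            array[j + 1] = array[j];   // shift larger elements one slot right
            j--;
        }
        array[j + 1] = value;          // insert into the gap that opened up
    }
}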

Selection Sort   O(n^2)  You find the minimum of the unsorted part of the array  Swap the minimum into its correct position in the array
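
A sketch of selection sort in C (ascending order assumed), following the description above.

// Hypothetical sketch of selection sort (ascending): repeatedly find the
// minimum of the unsorted part and swap it into place.
void selection_sort(int array[], int n)
{
    for (int i = 0; i < n - 1; i++)
    {
        int min_index = i;
        for (int j = i + 1; j < n; j++)
        {
            if (array[j] < array[min_index])
                min_index = j;
        }
        int temp = array[i];            // swap the minimum into position i
        array[i] = array[min_index];
        array[min_index] = temp;
    }
}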

Merge Sort – Divide and Conquer  Split the array in half  Recursively call merge sort on the left half  Recursively call merge sort on the right half  Merge the two halves together; we can easily merge two sorted arrays in linear time  O(n log n)  See Animations.ppt Slide 1
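
A sketch of merge sort in C; the inclusive low/high bounds and the temporary malloc'd buffer are implementation choices for this illustration, not prescribed by the slides. Call it as merge_sort(array, 0, n - 1).

#include <stdlib.h>

// Hypothetical sketch of merge sort on array[low..high] (inclusive bounds).
// Splits in half, sorts each half recursively, then merges in linear time.
void merge_sort(int array[], int low, int high)
{
    if (low >= high)
        return;                         // base case: 0 or 1 element

    int mid = low + (high - low) / 2;
    merge_sort(array, low, mid);        // sort the left half
    merge_sort(array, mid + 1, high);   // sort the right half

    // merge the two sorted halves into a temporary buffer
    int size = high - low + 1;
    int *temp = malloc(size * sizeof(int));
    if (temp == NULL)
        return;                         // allocation failed; bail out of the sketch

    int i = low, j = mid + 1, k = 0;
    while (i <= mid && j <= high)
        temp[k++] = (array[i] <= array[j]) ? array[i++] : array[j++];
    while (i <= mid)
        temp[k++] = array[i++];
    while (j <= high)
        temp[k++] = array[j++];

    for (k = 0; k < size; k++)          // copy the merged result back
        array[low + k] = temp[k];
    free(temp);
}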

GNU Debugger  Especially useful when:

jharvard$ ./my_c_program
Segmentation Fault

WTF is going on here?!?!

GDB – useful commands
break – tell the program to ‘pause’ at a certain point (either a function or a line number)
step – ‘step’ to the next executed statement
next – moves to the next statement WITHOUT ‘stepping into’ called functions
continue – move ahead to the next breakpoint
print – display some variable’s value
backtrace – trace back up function calls
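
A typical session might look like the sketch below; my_c_program and some_variable are placeholder names. Compile with debugging symbols (for example, gcc’s -ggdb3 flag) so gdb can map the running program back to your source lines.

jharvard$ gcc -ggdb3 -o my_c_program my_c_program.c
jharvard$ gdb ./my_c_program
(gdb) break main
(gdb) run
(gdb) next
(gdb) step
(gdb) print some_variable
(gdb) continue
(gdb) backtrace
(gdb) quit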

Fun Fun Fun Go to this link: k3.c