Recitation 10 Prelim Review.


Big O
See the Study Habits Note @282 on the course Piazza. It contains a 2-page PDF that explains how to learn what you need to know for O-notation.

Big O definition
f(n) is O(g(n)) iff there is a positive constant c and a real number N such that f(n) ≤ c·g(n) for all n ≥ N.
Here c·g(n) is our upper bound, and c can be any positive real number; this is why constant factors don't matter.
Note: we don't write f(n) = O(g(n)), because it is not an equality.
Is merge sort O(n³)? Yes, technically, but that is not the tightest upper bound; when big O is used, it usually refers to the tightest bound.
(Figure: a plot of f(n) lying below c·g(n) for all n ≥ N.)
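To make the definition concrete, here is a small illustrative check (the class name, functions, and witnesses are made up, not from the slides): it verifies numerically that f(n) = 3n + 20 is O(n) using the witnesses c = 4 and N = 20.

```java
/** Illustrative check of the big-O definition for f(n) = 3n + 20 with witnesses c = 4, N = 20. */
public class BigODemo {
    static long f(long n) { return 3 * n + 20; }   // hypothetical statement count
    static long g(long n) { return n; }            // candidate bounding function

    public static void main(String[] args) {
        long c = 4, N = 20;
        for (long n = N; n <= 1_000_000; n++) {
            if (f(n) > c * g(n)) throw new AssertionError("bound fails at n = " + n);
        }
        System.out.println("3n + 20 <= 4n for every tested n >= 20, so 3n + 20 is O(n).");
    }
}
```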

Review: Big O
Big O is used to classify algorithms by how they respond to changes in input size n. Let f(n) and g(n) be two functions that tell how many statements two algorithms execute when running on input of size n, with f(n) ≥ 0 and g(n) ≥ 0.
Important vocabulary, with an example of each:
Constant time, O(1): adding two numbers
Logarithmic time, O(log n): binary search
Linear time, O(n): linear search
Quadratic time, O(n²): selection sort

Review: Informal Big O rules
1. Usually: O(f(n)) × O(g(n)) = O(f(n) × g(n))
   – e.g., something that takes g(n) time for each of f(n) repetitions (a loop within a loop); see the sketch after this list.
2. Usually: O(f(n)) + O(g(n)) = O(max(f(n), g(n)))
   – "max" is whichever term dominates as n approaches infinity.
   – Example: O((n² − n)/2) = O((1/2)n² + (−1/2)n) = O((1/2)n²) = O(n²)
3. Why don't logarithm bases matter?
   – For constants x, y: O(log_x(n)) = O((log_x(y))·(log_y(n)))
   – Since log_x(y) is a constant, O(log_x(n)) = O(log_y(n))
The test will not require understanding such rules for logarithms.
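The sketch promised above for rule 1 (an illustrative LoopCount class, not from the slides): the O(1) body runs g(n) = n times for each of f(n) = n outer iterations, so the nest executes n² statements and is O(n²).

```java
/** Counts how many times the innermost statement runs in a doubly nested loop. */
public class LoopCount {
    public static void main(String[] args) {
        int n = 1000;
        long count = 0;
        for (int i = 0; i < n; i++) {        // f(n) = n repetitions
            for (int j = 0; j < n; j++) {    // g(n) = n units of work per repetition
                count++;                     // O(1) body
            }
        }
        System.out.println(count + " = n * n, so the loop nest is O(n^2)");
    }
}
```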

Review: Big O
1. log(n) + 20 is O(log n) (logarithmic)
2. n + log(n) is O(n) (linear)
3. n/2 and 3n are O(n)
4. n·log(n) + n is O(n log n)
5. n² + 2n + 6 is O(n²) (quadratic)
6. n³ + n² is O(n³) (cubic)
7. 2ⁿ + n⁵ is O(2ⁿ) (exponential)

Review: Big O examples
What is the runtime of an algorithm that runs insertion sort, O(n²), on an array and then runs binary search, O(log n), on that now-sorted array?
What is the runtime of finding and removing the fifth element from a linked list?
What if, in the middle of that remove operation, we swapped two integers exactly 100000 times; what is the runtime now?
What is the runtime of running merge sort 4 times? n times?

Binary Search Trees

Binary search tree invariant: the left child is always smaller than its parent; the right child is always larger than its parent.
(Figure: a BST containing 12, 56, 79, 80, 89, 94, 98; without one of these nodes, the tree would be complete.)
How would you change the tree to add the value 37?
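As a refresher, here is a minimal BST insertion sketch (hypothetical Node/BST classes; duplicates and rebalancing are ignored). Adding 37 to the slide's tree just follows these comparisons from the root down to an empty child slot.

```java
/** Minimal BST insertion sketch: smaller keys go left, larger keys go right. */
public class BST {
    static class Node {
        int value;
        Node left, right;
        Node(int value) { this.value = value; }
    }

    Node root;

    /** Insert value, walking left for smaller keys and right for larger ones. */
    public void insert(int value) {
        root = insert(root, value);
    }

    private Node insert(Node node, int value) {
        if (node == null) return new Node(value);                  // empty spot found: new leaf
        if (value < node.value) node.left = insert(node.left, value);
        else if (value > node.value) node.right = insert(node.right, value);
        return node;                                               // equal keys are ignored
    }
}
```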

Heaps

Array Representation of Binary Heap
(Figure: a small min heap containing 1, 2, 4, and 99, shown next to its array representation; the array stores the nodes in level order.)
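A minimal sketch of the usual index arithmetic, assuming 0-based indexing (the class name and sample array are illustrative); with 1-based indexing the children of i are 2i and 2i + 1 and the parent is i/2 instead.

```java
/** Index arithmetic for a binary heap stored in an array, assuming 0-based indexing. */
public class HeapIndexing {
    static int parent(int i)     { return (i - 1) / 2; }
    static int leftChild(int i)  { return 2 * i + 1; }
    static int rightChild(int i) { return 2 * i + 2; }

    public static void main(String[] args) {
        int[] heap = {1, 2, 99, 4};            // level-order storage of a small min heap
        int i = 1;                             // the node holding 2
        System.out.println("parent of heap[1]     = " + heap[parent(i)]);     // 1
        System.out.println("left child of heap[1] = " + heap[leftChild(i)]);  // 4
    }
}
```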

Review: Binary heap / PriorityQueue
Maintains the max or min of a collection (no duplicates).
Follows the heap-order invariant at every level.
Always balanced!
Worst-case runtimes: O(log n) insert, O(log n) update, O(1) peek, O(log n) removal.
(Figure: a min heap and a max heap built from the values 1, 2, 3, 4, 99.)

Review: Binary heap
How do we insert element 0 into the min heap?
After we remove the root node, what is the resulting heap?
How are heaps usually represented?
If we want the right child of index i, how do we access it?
(Figure: a min heap containing 1, 2, 3, 4, 99.)
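A minimal array-backed min-heap insert sketch (hypothetical MinHeap class, 0-based indexing): the new element is appended after the last leaf and then bubbled up while it is smaller than its parent, so inserting 0 bubbles it all the way to the root in O(log n) swaps.

```java
import java.util.Arrays;

/** Minimal array-backed min heap supporting insert with bubble-up (0-based indexing). */
public class MinHeap {
    private int[] a = new int[16];
    private int size = 0;

    public void insert(int x) {
        if (size == a.length) a = Arrays.copyOf(a, 2 * size);
        a[size] = x;                               // place x in the first free slot
        int i = size++;
        while (i > 0 && a[i] < a[(i - 1) / 2]) {   // bubble up while smaller than parent
            int p = (i - 1) / 2;
            int tmp = a[i]; a[i] = a[p]; a[p] = tmp;
            i = p;
        }
    }

    public int peek() { return a[0]; }             // O(1): the minimum is always at the root

    public static void main(String[] args) {
        MinHeap h = new MinHeap();
        for (int x : new int[]{1, 2, 99, 4, 3}) h.insert(x);
        h.insert(0);
        System.out.println(h.peek());              // 0: the new element bubbled up to the root
    }
}
```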

Hashing

Review: Hashing (HashSet<String>)
Load factor, for open addressing: (number of non-null entries) / (size of array).
Load factor, for chaining: (size of set) / (size of array).
If the load factor becomes > 1/2, create an array twice the size, rehash every element of the set into it, and use the new array.
(Figure: a HashSet<String> backed by a small array containing MA, NY, CA.)
Runtimes: add, contains, and remove are all expected O(1); the worst case for each is O(n), due to a possibly poor hash function.
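A toy chaining hash set sketch (the class and method names are made up; this is not the course's implementation) that applies the slide's rule: when the load factor would exceed 1/2, double the number of buckets and rehash every element.

```java
import java.util.ArrayList;
import java.util.List;

/** Toy chaining hash set (illustrative only) showing load factor and rehashing. */
public class ToyHashSet {
    private List<List<String>> buckets = newBuckets(8);   // the backing "array" b
    private int size = 0;                                 // number of elements in the set

    private static List<List<String>> newBuckets(int n) {
        List<List<String>> b = new ArrayList<>();
        for (int i = 0; i < n; i++) b.add(new ArrayList<>());
        return b;
    }

    private int indexFor(String s, int length) {
        return Math.floorMod(s.hashCode(), length);       // hash, then wrap into 0..length-1
    }

    public boolean contains(String s) {
        return buckets.get(indexFor(s, buckets.size())).contains(s);
    }

    public void add(String s) {
        if (contains(s)) return;
        if (size + 1 > buckets.size() / 2) rehash();      // keep the load factor at most 1/2
        buckets.get(indexFor(s, buckets.size())).add(s);
        size++;
    }

    /** Double the number of buckets and reinsert every element, since indices change. */
    private void rehash() {
        List<List<String>> old = buckets;
        buckets = newBuckets(2 * old.size());
        for (List<String> chain : old)
            for (String s : chain)
                buckets.get(indexFor(s, buckets.size())).add(s);
    }

    public static void main(String[] args) {
        ToyHashSet set = new ToyHashSet();
        for (String s : new String[]{"MA", "NY", "CA", "VA", "SC"}) set.add(s);
        System.out.println(set.contains("MA") + " " + set.contains("TX"));   // true false
    }
}
```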

Review: Hashing (HashSet<String> and HashMap<String,Integer>)
A HashMap works the same way as a HashSet, except each entry stores a value along with its key.
(Figure: the HashSet<String> from the previous slide next to a HashMap<String,Integer> counting the words of "to be or not to be that is the question": to→2, be→2, or→1, not→1, that→1, is→1, the→1, question→1.)
The runtimes are the same: add, contains, and remove are expected O(1); the worst case for each is O(n), due to a possibly poor hash function.

Review: Hashing
Idea: finding an element in an array takes constant time when you know which index it is stored in. A hash function maps a value to an int, which is then turned into an index into the backing array b. The hash function should be O(1) to reap the benefits of hashing; this is where the magic is that gets our O(n) operations down to amortized O(1).
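A minimal sketch of that idea (ignoring collisions; the class name, array size, and value are illustrative): Java's built-in hashCode() supplies the int, and Math.floorMod wraps it into a valid, non-negative index.

```java
/** The core hashing idea: map a value to an array index in O(1). */
public class HashIdea {
    public static void main(String[] args) {
        String[] b = new String[6];                              // the backing array
        String value = "MA";
        int index = Math.floorMod(value.hashCode(), b.length);   // hash, then wrap into 0..b.length-1
        b[index] = value;                                        // add: one array write
        System.out.println(b[Math.floorMod("MA".hashCode(), b.length)]);   // contains: one array read
    }
}
```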

Collision resolution
Two ways of handling collisions:
1. Chaining: each array slot holds a list (chain) of all the elements that hash to it.
2. Open addressing: if a slot is already taken, probe other slots (e.g., linear probing) until an empty one is found.

Load factor: b's saturation
(Figure: add("MA") sends MA through the hash function to its index in the backing array b, which also holds NY and VA.)

Question: Hashing
Using linear probing to resolve collisions:
1. Add element SC (hashes to 9).
2. Remove VA (hashes to 3).
3. Check to see whether MA (hashes to 21) is in the set.
What should we do if we override equals()?
(Figure: the backing array b, currently containing MA, NY, VA.)
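On the last question: whenever equals() is overridden, hashCode() must be overridden consistently, or equal objects may land in different buckets and contains() will miss them. A hypothetical Point class as a sketch:

```java
import java.util.HashSet;
import java.util.Objects;

/** If two objects are equals(), they must return the same hashCode(), or hashing breaks. */
public class Point {
    private final int x, y;
    public Point(int x, int y) { this.x = x; this.y = y; }

    @Override public boolean equals(Object o) {
        if (!(o instanceof Point)) return false;
        Point p = (Point) o;
        return x == p.x && y == p.y;
    }

    @Override public int hashCode() {
        return Objects.hash(x, y);        // equal points hash identically
    }

    public static void main(String[] args) {
        HashSet<Point> set = new HashSet<>();
        set.add(new Point(1, 2));
        System.out.println(set.contains(new Point(1, 2)));   // true only because hashCode matches equals
    }
}
```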

Graphs

Question: What are BFS and DFS?
(Figure: a small graph whose nodes include A, C, F, and Z.)
Starting from node A, run BFS and DFS to find node Z. In what order are the nodes processed? Visit neighbors in alphabetical order.
What is the difference between DFS and BFS?
Which algorithm would be better to use if our graph were near-infinite and the target node were nearby?
Is Dijkstra's algorithm more like DFS or BFS? Why?
Can you run topological sort on this graph?
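A minimal BFS sketch over a hypothetical adjacency-list graph (the node names and edges are made up, not the slide's figure). Keeping each neighbor list alphabetical gives the visiting order the question asks about; replacing the FIFO queue with a stack (or recursion) turns this into DFS.

```java
import java.util.*;

/** Breadth-first search over an adjacency-list graph, visiting neighbors in list order. */
public class BfsDemo {
    public static List<String> bfs(Map<String, List<String>> graph, String start) {
        List<String> order = new ArrayList<>();
        Set<String> visited = new HashSet<>();
        Queue<String> queue = new ArrayDeque<>();
        queue.add(start);
        visited.add(start);
        while (!queue.isEmpty()) {
            String node = queue.remove();                     // FIFO: closest nodes come out first
            order.add(node);
            for (String next : graph.getOrDefault(node, List.of())) {
                if (visited.add(next)) queue.add(next);       // enqueue each neighbor at most once
            }
        }
        return order;
    }

    public static void main(String[] args) {
        // Hypothetical graph; neighbor lists are kept in alphabetical order.
        Map<String, List<String>> g = Map.of(
                "A", List.of("B", "C"),
                "B", List.of("D"),
                "C", List.of("D"),
                "D", List.of());
        System.out.println(bfs(g, "A"));                      // [A, B, C, D]
    }
}
```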

Topological ordering
(Figure: a directed graph on the nodes 1 through 6.)
All edges go from a smaller-numbered node to a larger-numbered node. How can this be useful?

Dijkstra's Algorithm
(Figure: a weighted graph whose nodes are numbered 1 through 7 in the order they are visited if we start at node 1.)
Why are the nodes visited in this order?
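To see why, here is a compact sketch (hypothetical class and a made-up graph, not the slide's figure): the priority queue always hands back the unsettled node with the smallest tentative distance, so nodes are settled in nondecreasing order of their distance from the start.

```java
import java.util.*;

/** Dijkstra's algorithm: settle nodes in nondecreasing order of distance from the start. */
public class DijkstraDemo {
    public static Map<Integer, Integer> shortestDistances(Map<Integer, Map<Integer, Integer>> graph, int start) {
        Map<Integer, Integer> dist = new HashMap<>();
        // Priority queue of {node, tentative distance}, ordered by distance.
        PriorityQueue<int[]> pq = new PriorityQueue<>(Comparator.comparingInt((int[] e) -> e[1]));
        pq.add(new int[]{start, 0});
        while (!pq.isEmpty()) {
            int[] top = pq.remove();
            int node = top[0], d = top[1];
            if (dist.containsKey(node)) continue;             // already settled with a smaller distance
            dist.put(node, d);                                // settle: d is the final shortest distance
            for (Map.Entry<Integer, Integer> e : graph.getOrDefault(node, Map.of()).entrySet()) {
                if (!dist.containsKey(e.getKey())) pq.add(new int[]{e.getKey(), d + e.getValue()});
            }
        }
        return dist;
    }

    public static void main(String[] args) {
        // Hypothetical weighted graph: node -> (neighbor -> edge weight).
        Map<Integer, Map<Integer, Integer>> g = Map.of(
                1, Map.of(2, 5, 3, 1),
                3, Map.of(2, 2, 4, 7),
                2, Map.of(4, 1));
        System.out.println(shortestDistances(g, 1));          // distances: 1->0, 2->3, 3->1, 4->4
    }
}
```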