Practicum 2: Asymptotics, List and Tree Structures. 15-211 Fundamental Data Structures and Algorithms. Klaus Sutner, Feb. 5, 2004.

Today This is really easy, just a gentle review of things you already know. - Asymptotic notation, yet again. - Implementing nested lists and trees. Hidden agenda: inductive thinking.

Fudging It Running time analysis very often leads to more or less intractable problems: counting steps even in very simple programs is just hopelessly complicated. Trying to get precise answers is also quite useless in most cases. It is better to ignore details and focus on the big picture.

Upper And Lower Bounds
- f(n) = O( g(n) ) Big-Oh: f(n) ≤ c g(n) for some constant c > 0 and almost all n
- f(n) = Ω( g(n) ) Big-Omega: f(n) ≥ c g(n) for some constant c > 0 and almost all n
- f(n) = Θ( g(n) ) Theta: f(n) = O( g(n) ) and f(n) = Ω( g(n) )

Upper And Lower Bounds
- f(n) = O( g(n) ) Big-Oh: can only be used for upper bounds
- f(n) = Ω( g(n) ) Big-Omega: can only be used for lower bounds
- f(n) = Θ( g(n) ) Theta: pins down the running time exactly (up to a multiplicative constant)
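A quick worked instance of these definitions (the witnesses c = 4 with n ≥ 10, and c = 3, are one valid choice among many):

```latex
3n^2 + 10n \;\le\; 4n^2 \quad \text{for all } n \ge 10, \text{ so } 3n^2 + 10n = O(n^2)
3n^2 + 10n \;\ge\; 3n^2 \quad \text{for all } n \ge 1,  \text{ so } 3n^2 + 10n = \Omega(n^2)
```

Both bounds together give 3n^2 + 10n = Θ(n^2).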

Important Classes
- O( 1 ) constant
- O( n ) linear
- O( n log n ) n-log-n
- O( n^2 ) quadratic
- O( n^3 ) cubic
- O( 2^(cn) ) exponential (some authors differ)
- O( n! ) factorial

Holy Grail O( n^k ) polynomial versus O( 2^(cn) ) exponential (some authors differ). Anything that can be done in polynomial time is tractable, feasible, doable. But note that there are constants lurking in the dark. Empirical fact: they don't seem to matter much.

Comparison f(n) = o( g(n) ) means lim f(n)/g(n) = 0. Interpretation: g is significantly worse than f. Example: f(n) = o( n^2 ) means f is sub-quadratic. It is often a big challenge to find an algorithm that is sub-quadratic, sub-cubic, and so on.

Similarity f(n) ~ g(n) means lim f(n)/g(n) = 1. Interpretation: f and g are essentially indistinguishable. This is much stronger than Θ. Example: n ~ n + 1/n.

Approximations Used mostly to assert the quality of an approximation. For example, H_n ~ ln n + γ, where H_n is the n-th harmonic number and γ ≈ 0.5772 is the Euler-Mascheroni constant.

More Recursive Data Structures Inductive thinking is often the best way to tackle complicated data structures. Plain linked lists are a cheap example, but not convincing: everybody knows how to hack linked lists, induction be damned. Let's try something more ambitious: Nested Lists and Binary Trees.

Nested Lists In an ordinary list, only atomic elements such as integers can be stored. How about allowing lists of lists of lists... of integers? ( 1, 2, ( 3, 4 ), ((5)), 6 ) How hard would it be to design this type of data structure? What basic operations should we use? How does it compare to other structures such as trees?

Basic Operations 1. How do we construct such a nested list? What is the inductive structure here? 2. Suppose we already have built such a nested list. What are the access operations we need to get at the pieces? 3. How do we deal with operations such as flattening?

Basic Operations How do we implement this using Java's language features? MAW: null is not such a great idea. In the OO framework everything should be a class, even an empty list. The root concept List appears in several incarnations: empty, with leading int, with leading list.

Basic Structure There are three cases to consider:
- nil: EmptyList
- ( 12345, (...) ): IntList
- ( (...), (...) ): NestList
Use a small hierarchy.
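A minimal sketch of this hierarchy in Java. The class names follow the slide's three cases; the field names and the isEmpty() method are my own illustrative additions.

```java
// Root concept: a nested list is empty, starts with an int, or starts with a list.
abstract class NestedList {
    abstract boolean isEmpty();
}

// nil: even the empty list is a full-fledged object (no nulls).
class EmptyList extends NestedList {
    boolean isEmpty() { return true; }
}

// ( 12345, (...) ): an integer followed by the rest of the list.
class IntList extends NestedList {
    final int first;
    final NestedList rest;
    IntList(int first, NestedList rest) { this.first = first; this.rest = rest; }
    boolean isEmpty() { return false; }
}

// ( (...), (...) ): a nested list followed by the rest of the list.
class NestList extends NestedList {
    final NestedList first;
    final NestedList rest;
    NestList(NestedList first, NestedList rest) { this.first = first; this.rest = rest; }
    boolean isEmpty() { return false; }
}
```

With this encoding, the list ( 1, (2) ) would be built as new IntList(1, new NestList(new IntList(2, new EmptyList()), new EmptyList())).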

Access We only need the standard first/rest machinery. Every position in a nested list can be accessed by a sequence of first/rest operations. Note that simple iterators don't quite work here: we need to be able to go forward or down: ( (5,6), 2, 3, 4 )

Flattening Intuitively, the flatten operation is easy: ( (5,6), 2, ((3)), 4 ) --> (5,6,2,3,4) Domain: nested lists, codomain: simple lists. We may assume we have the usual operations on simple lists available. So how does flatten act on a nested list?

Flattening ( (5,6), 2, ((3)), 4 ) --> ( 5, 6, 2, 3, 4 )
flatten( nil ) = nil
flatten( (x,R) ) = prepend( flatten(R), x )
flatten( (L,R) ) = join( flatten(L), flatten(R) )
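The three equations translate almost line by line into Java. This is a sketch: the compact classes below stand in for the nested-list hierarchy, and java.util.List<Integer> plays the role of the simple-list codomain (so prepend and join become add and addAll).

```java
import java.util.ArrayList;
import java.util.List;

// Nested lists: nil, ( x, R ), ( L, R ).
abstract class NL {}
class Nil extends NL {}
class IntCell extends NL {
    final int x; final NL rest;
    IntCell(int x, NL rest) { this.x = x; this.rest = rest; }
}
class ListCell extends NL {
    final NL first; final NL rest;
    ListCell(NL first, NL rest) { this.first = first; this.rest = rest; }
}

class Flatten {
    // flatten( nil )   = nil
    // flatten( (x,R) ) = prepend( flatten(R), x )
    // flatten( (L,R) ) = join( flatten(L), flatten(R) )
    static List<Integer> flatten(NL l) {
        List<Integer> out = new ArrayList<>();
        if (l instanceof IntCell) {
            IntCell c = (IntCell) l;
            out.add(c.x);                 // x, followed by ...
            out.addAll(flatten(c.rest));  // ... the flattened rest (prepend)
        } else if (l instanceof ListCell) {
            ListCell c = (ListCell) l;
            out.addAll(flatten(c.first)); // join the two flattened halves
            out.addAll(flatten(c.rest));
        }
        return out;                       // the Nil case: the empty list
    }
}
```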

Binary Trees Really ordered binary trees: every child is either left or right (even when the other child is missing). Information can be stored at all nodes (for simplicity, let's just say an integer can be stored). Intuitively, it should be clear that this DS is more “powerful” than just linked lists. Right?

Pretty Pictures [figure: example binary trees, with missing children drawn as nil]

Basic Choices Note that this is really quite similar to the nested list construction. Again we build a small class hierarchy. The root concept Tree appears in two incarnations:
- nil: the empty tree
- T(a,L,R): an interior node
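A sketch of the two incarnations in Java, in the same style as the nested-list hierarchy. The cases follow the slide (nil and T(a,L,R)); the size() helper is my own addition to show the inductive pattern.

```java
// Root concept: a tree is either empty or an interior node T(a, L, R).
abstract class Tree {}

// nil: the empty tree.
class NilTree extends Tree {}

// T(a, L, R): a stored integer plus left and right subtrees.
// Every child is either left or right, even when the other one is nil.
class Node extends Tree {
    final int a;
    final Tree left, right;
    Node(int a, Tree left, Tree right) {
        this.a = a; this.left = left; this.right = right;
    }
}

class TreeOps {
    // The inductive pattern: one case per incarnation.
    static int size(Tree t) {
        if (t instanceof NilTree) return 0;
        Node n = (Node) t;
        return 1 + size(n.left) + size(n.right);
    }
}
```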

Lists as Trees There is a natural way to represent flat lists as trees.

Lists as Trees There is a natural way to represent nested lists as trees.

How To Convert? How would a conversion function work?
l2t( nil ) = nil
l2t( (x,R) ) = T( -, T(x,nil,nil), l2t(R) )
l2t( (L,R) ) = T( -, l2t(L), l2t(R) )
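A sketch of this first conversion in Java. Since the "-" in T(-, ...) marks an internal node with no stored value, the node's value field is an Integer that is null exactly at those nodes; all class names here are illustrative choices, not from the slides.

```java
// Nested lists: nil, ( x, R ), ( L, R ).
abstract class NList {}
class NNil extends NList {}
class NInt extends NList {
    final int x; final NList rest;
    NInt(int x, NList rest) { this.x = x; this.rest = rest; }
}
class NNest extends NList {
    final NList first; final NList rest;
    NNest(NList first, NList rest) { this.first = first; this.rest = rest; }
}

// Trees: nil and T(a, L, R); a == null plays the role of "-".
abstract class BTree {}
class BNil extends BTree {}
class BNode extends BTree {
    final Integer a; final BTree left, right;
    BNode(Integer a, BTree left, BTree right) {
        this.a = a; this.left = left; this.right = right;
    }
}

class Convert {
    // l2t( nil )   = nil
    // l2t( (x,R) ) = T( -, T(x,nil,nil), l2t(R) )
    // l2t( (L,R) ) = T( -, l2t(L), l2t(R) )
    static BTree l2t(NList l) {
        if (l instanceof NNil) return new BNil();
        if (l instanceof NInt) {
            NInt c = (NInt) l;
            return new BNode(null,
                    new BNode(c.x, new BNil(), new BNil()),  // the int becomes a leaf
                    l2t(c.rest));
        }
        NNest c = (NNest) l;
        return new BNode(null, l2t(c.first), l2t(c.rest));
    }
}
```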

How To Convert? Looks a bit inelegant. Better would be
l2t( nil ) = nil
l2t( (x,R) ) = T( x, l2t(R) )
l2t( (L,R) ) = T( l2t(L), l2t(R) )
What does this require in terms of the trees?

Proper Trees Note that not every tree can be the result of converting a nested list. How do we check whether a tree is obtained by converting a list? Let's call these trees proper. We want a function proper : trees --> {true,false}
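One possible proper check, assuming the first l2t encoding from a few slides back: internal nodes carry no value (represented here by a null Integer) and the ints sit in leaves T(x,nil,nil). This is a sketch under that assumption, not the only way to define properness.

```java
// Trees: nil and T(a, L, R), with a == null at value-free internal nodes.
abstract class PTree {}
class PNil extends PTree {}
class PNode extends PTree {
    final Integer a; final PTree left, right;
    PNode(Integer a, PTree left, PTree right) {
        this.a = a; this.left = left; this.right = right;
    }
}

class ProperCheck {
    // An int leaf T(x, nil, nil) is what l2t makes out of a list element x.
    static boolean isIntLeaf(PTree t) {
        if (!(t instanceof PNode)) return false;
        PNode n = (PNode) t;
        return n.a != null && n.left instanceof PNil && n.right instanceof PNil;
    }

    // proper: is t the image of some nested list under l2t?
    static boolean proper(PTree t) {
        if (t instanceof PNil) return true;      // image of nil
        PNode n = (PNode) t;
        if (n.a != null) return false;           // images store values only at leaves
        boolean leftOk = isIntLeaf(n.left)       // came from ( x, R )
                      || proper(n.left);         // or from ( L, R )
        return leftOk && proper(n.right);
    }
}
```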

Who's More Powerful? It looks like trees are more powerful than nested lists. Are they really? Suppose Fritz Hackmann has this absolutely fantastic implementation of nested lists. Francoise Haquer needs a tree implementation, and fast. Could she somehow use Fritz's code?

What Does This Mean? At the very least, we need to be able to convert trees into nested lists. There has to be an injective map t2l : trees --> lists. Let's not worry about efficiency at this point. [figure: an example tree converted to the nested list ( (a,b), c, d, (e) )]

Converting Back
t2l( nil ) = nil
t2l( T( -, T(x,nil,nil), R ) ) = ( x, t2l(R) )
t2l( T( -, L, R ) ) = ( ( t2l(L) ), t2l(R) )
Take this with a grain of salt: there should be list operations on the right.

One More Picture A 3 by 3 matrix as a tree. Should add() be destructive (choice 1) or not (choice 2)? Does it matter? Why or why not?