CSCI 62 Data Structures Dr. Joshua Stough September 11, 2008

Today
- Generics
- Asymptotics
  - Big-O
  - Time/Space
  - Fermi Solutions
  - Recursion/Induction

Generics
- Recognize errors sooner rather than later.
- generic – a class parameterized by the type of data it holds.
- Java tutorial, generics example.
- Vector vs. Vector<E>
- Class def.: public class Association<K,V> (a sketch follows this slide)
- Instantiation: Association<String,Integer> personAttribute = new Association<String,Integer>("Age", 34);
- autoboxed type parameters
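
A minimal sketch of a generic key-value Association like the one named on this slide; the field and accessor names here are assumptions for illustration, not necessarily the structure5 library's exact API.

public class Association<K, V> {
    private K key;      // the key, fixed at construction
    private V value;    // the value associated with the key

    public Association(K key, V value) {
        this.key = key;
        this.value = value;
    }

    public K getKey()   { return key; }
    public V getValue() { return value; }

    public static void main(String[] args) {
        // Autoboxing turns the int literal 34 into an Integer type argument.
        Association<String, Integer> personAttribute =
            new Association<String, Integer>("Age", 34);
        System.out.println(personAttribute.getKey() + " = " + personAttribute.getValue());
    }
}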

Box Example – too general

public class Box {
    private Object object;

    public void add(Object a) { object = a; }
    public Object get() { return object; }
}

Box Example – forces type

public class Box<T> {
    private T t;   // T stands for "Type"

    public void add(T t) { this.t = t; }
    public T get() { return t; }
}

Asymptotics
- People make different design decisions.
- Aspects of good design:
  - Already know: Encapsulation. Extreme: Design Patterns, Software Engineering.
  - Now: Algorithmic "niceness" – space and time efficiency.
- Tools:
  - Big-O
  - Recursion/Induction

Asymptotics
- Determining performance:
  - Counting operations:
    - Assignments/swaps
    - Multiplications
    - Conditionals (if's)
  - Bailey says comparing operation counts is inappropriate between architectures, but in general: a conditional costs more than a multiplication, which costs more than an assignment.
  - Min, Max, Both? (in-class; a counting sketch follows this slide)
  - Check against variable input:
    - Problem size
    - Crafted devilish input – best/worst/average
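
A sketch of the in-class "Min, Max, Both?" counting exercise (my own illustration, not the lecture's code): finding the minimum alone makes n-1 comparisons on an array of length n, and the straightforward min-and-max version makes 2(n-1).

public class MinMaxCount {
    // Minimum of a non-empty array: n-1 comparisons.
    public static int min(int[] a) {
        int best = a[0];
        for (int i = 1; i < a.length; i++) {
            if (a[i] < best) best = a[i];   // one comparison per remaining element
        }
        return best;
    }

    // Minimum and maximum together, naively: 2(n-1) comparisons.
    public static int[] minMax(int[] a) {
        int min = a[0], max = a[0];
        for (int i = 1; i < a.length; i++) {
            if (a[i] < min) min = a[i];     // first comparison
            if (a[i] > max) max = a[i];     // second comparison
        }
        return new int[] { min, max };
    }

    public static void main(String[] args) {
        int[] data = { 4, 9, 1, 7 };
        int[] mm = minMax(data);
        System.out.println("min=" + min(data) + "  min,max=" + mm[0] + "," + mm[1]);
    }
}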

Asymptotics
Best/Worst/Average
- Key search on an unordered list (see the sketch after this slide):
  - Best case?
  - Worst case?
  - Average case?
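
A minimal linear-search sketch (my example, not the slide's code) to make the cases concrete: best case the key is in the first position (1 comparison), worst case it is last or absent (n comparisons), and on average about n/2 comparisons when the key is present.

public class LinearSearch {
    // Returns the index of key in a, or -1 if absent.
    public static int indexOf(int[] a, int key) {
        for (int i = 0; i < a.length; i++) {
            if (a[i] == key) return i;   // best case: i == 0, a single comparison
        }
        return -1;                       // worst case: n comparisons, key absent
    }

    public static void main(String[] args) {
        int[] data = { 7, 3, 9, 4 };
        System.out.println(indexOf(data, 7));   // best case: found immediately
        System.out.println(indexOf(data, 5));   // worst case: not found
    }
}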

Asymptotics
Define Big-O
- A function f(n) is O(g(n)) (read "order g" or "big-O of g") if and only if there exist two positive constants, c and n0, such that |f(n)| <= c · g(n) for all n >= n0. (A worked example follows this slide.)
- O(1), O(n), O(n^2), O(n^c), O(k^n), O(n!)
- Constant, linear, quadratic, polynomial, exponential, factorial.
- Examples: assign an array element, sum an array, matrix multiplication, etc.
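
A short worked example of applying the definition (mine, not from the slide): take f(n) = 3n^2 + 5n + 2. Then

\[
f(n) = 3n^2 + 5n + 2 \le 3n^2 + 5n^2 + 2n^2 = 10n^2 \quad \text{for all } n \ge 1,
\]

so the definition is satisfied with c = 10 and n0 = 1, and f(n) is O(n^2).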

Asymptotics
Difference table – complexity grows with area.

public static void diffTable(int n)
// pre: n >= 0
// post: print difference table of width n
{
    for (int row = 1; row <= n; row++)          // 1
    {
        for (int col = 1; col <= n; col++)      // 2
        {
            System.out.print(row-col+" ");      // 3
        }
        System.out.println();                   // 4
    }
}

Asymptotics
Multiplication Table – grows with area.

public static void multTable(int n)
// pre: n >= 0
// post: print multiplication table
{
    for (int row = 1; row <= n; row++)          // 1
    {
        for (int col = 1; col <= row; col++)    // 2
        {
            System.out.print(row*col+" ");      // 3
        }
        System.out.println();                   // 4
    }
}

Asymptotics
Make a list – O(n)

public static Vector<Integer> buildVector1(int n)
// pre: n >= 0
// post: construct a vector of size n of 1..n
{
    Vector<Integer> v = new Vector<Integer>(n);   // 1
    for (int i = n-1; i >= 0; i--)                // 2
    {
        v.add(i);                                 // 3
    }
    return v;                                     // 4
}

Asymptotics
Make a list – O(n^2), though it appears linear. Remember to consider the cost of each line: v.add(0,i) shifts every element already in the Vector, so line 3 is not constant time.

public static Vector<Integer> buildVector2(int n)
// pre: n >= 0
// post: construct a vector of size n of 1..n
{
    Vector<Integer> v = new Vector<Integer>(n);   // 1
    for (int i = 0; i < n; i++)                   // 2
    {
        v.add(0,i);                               // 3
    }
    return v;                                     // 4
}

Asymptotics
Factor Table – how to make it faster? Usually just by a constant (one variant is sketched after this slide).

public static Vector<Vector<Integer>> factTable(int n)
// pre: n > 0
// post: returns a table of factors of values 1 through n
{
    Vector<Vector<Integer>> table = new Vector<Vector<Integer>>();
    for (int i = 1; i <= n; i++)
    {
        Vector<Integer> factors = new Vector<Integer>();
        for (int f = 1; f <= i; f++)
        {
            if ((i % f) == 0)
            {
                factors.add(f);
            }
        }
        table.add(factors);
    }
    return table;
}
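
One possible speedup, consistent with the slide's point that it usually buys only a constant factor (this sketch is mine, not from the lecture): every proper factor of i is at most i/2, so the inner loop can stop at i/2 and then add i itself, roughly halving the work while remaining O(n^2).

import java.util.Vector;

// Assumed variant of factTable: inner loop stops at i/2, then i is added directly.
public class FactTable2 {
    public static Vector<Vector<Integer>> factTable2(int n)
    // pre: n > 0
    // post: returns a table of factors of values 1 through n
    {
        Vector<Vector<Integer>> table = new Vector<Vector<Integer>>();
        for (int i = 1; i <= n; i++)
        {
            Vector<Integer> factors = new Vector<Integer>();
            for (int f = 1; f <= i / 2; f++)
            {
                if ((i % f) == 0) factors.add(f);
            }
            factors.add(i);      // i is always a factor of itself
            table.add(factors);
        }
        return table;
    }

    public static void main(String[] args) {
        // Prints [[1], [1, 2], [1, 3], [1, 2, 4], [1, 5], [1, 2, 3, 6]]
        System.out.println(factTable2(6));
    }
}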

Asymptotics
Interesting Java tidbits:
- It takes between 1 and 10 nanoseconds (ns) to store a value in Java. Basic math operations take a similar length of time.
- An array assignment is approximately twice as slow as a regular assignment.
- A Vector assignment is approximately 50 times slower than a regular assignment.

Recursion
3 parts:
- Base case
- Self-reference
- Progress toward the base case
Examples:
- Trivial: min/max/key, sum (a sum sketch follows this slide)
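
A minimal recursive sum over an array (my sketch of the "trivial" example, not the lecture's code), annotated with the three parts:

public class RecursiveSum {
    // Sum of a[from..a.length-1].
    public static int sum(int[] a, int from) {
        if (from == a.length) return 0;        // base case: nothing left to add
        return a[from] + sum(a, from + 1);     // self-reference; progress: from moves toward a.length
    }

    public static void main(String[] args) {
        System.out.println(sum(new int[] { 1, 2, 3, 4 }, 0));   // prints 10
    }
}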

Recursion
Fibonacci - recursive

public static long recursiveFibNum(int n)
{
    if (n < 3)
        return 1;
    else
        return (recursiveFibNum(n - 1) + recursiveFibNum(n - 2));
}

Recursion
Fibonacci - dynamic

public static long dynamicFibNum(int n)
{
    long[] holder = new long[n + 1];
    for (int i = 0; i < holder.length; i++)
        holder[i] = 0;
    return dynamicRecFibNum(n, holder);
}

public static long dynamicRecFibNum(int n, long answer[])
{
    if (n < 3) return 1;
    if (answer[n] != 0) return answer[n];
    else
    {
        answer[n] = dynamicRecFibNum(n - 1, answer)
                  + dynamicRecFibNum(n - 2, answer);
        return answer[n];
    }
}

Recursion
Fibonacci - iterative

public static long easyIterFibNum(int n)
{
    int i;
    long[] nums = new long[3];
    nums[0] = nums[1] = 1;
    for (i = 2; n > 1; i = (i + 1) % 3, n--)
    {
        nums[i] = nums[(i + 1) % 3] + nums[(i + 2) % 3];
    }
    return nums[(i + 1) % 3];
}
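
A small self-contained driver (my sketch, not part of the lecture) that re-declares the recursive and memoized versions from the slides above so it compiles on its own and prints rough timings; the recursive version's running time grows roughly exponentially in n, while the memoized version stays linear.

public class FibCompare {
    // Copies of the slide methods so this sketch compiles on its own.
    public static long recursiveFibNum(int n) {
        if (n < 3) return 1;
        return recursiveFibNum(n - 1) + recursiveFibNum(n - 2);
    }

    public static long dynamicFibNum(int n) {
        long[] holder = new long[n + 1];   // Java zero-initializes the array
        return dynamicRecFibNum(n, holder);
    }

    private static long dynamicRecFibNum(int n, long[] answer) {
        if (n < 3) return 1;
        if (answer[n] != 0) return answer[n];
        answer[n] = dynamicRecFibNum(n - 1, answer) + dynamicRecFibNum(n - 2, answer);
        return answer[n];
    }

    public static void main(String[] args) {
        int n = 40;
        long t0 = System.nanoTime();
        long slow = recursiveFibNum(n);    // exponential number of calls
        long t1 = System.nanoTime();
        long fast = dynamicFibNum(n);      // each value computed once
        long t2 = System.nanoTime();
        System.out.println("recursive: " + slow + " in " + (t1 - t0) / 1e6 + " ms");
        System.out.println("dynamic:   " + fast + " in " + (t2 - t1) / 1e6 + " ms");
    }
}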

Induction
- Binary trees
- Recursive Fibonacci
- Inserting to the front of a Vector (worked out below)
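
As a hedged worked example of induction applied to one of these (my own sketch): the total number of element shifts performed by buildVector2 above, which inserts at the front of a Vector n times. Claim: after n front insertions the total shift count is

\[
S(n) = \sum_{i=0}^{n-1} i = \frac{n(n-1)}{2}.
\]

Base case: S(1) = 0 = 1·0/2, since the first insertion shifts nothing. Inductive step: assume S(n) = n(n-1)/2; the (n+1)-st insertion shifts the n elements already present, so S(n+1) = n(n-1)/2 + n = n(n+1)/2. Hence building the list by front insertion is O(n^2), matching the earlier slide.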

Symmetry and Friction
Symmetry: if there's a set method, there should be a get method (Square, Box).
1. Compare methods that extend the structure with methods that trim the structure. Do they have similar approaches? Are they similar in number?
2. Consider methods that read and write values. Can the input methods read what is written by the output methods? Can the writing methods write all values that can be read?
3. Are procedures that consume parameters matched by functions that deliver values?
4. Can points of potential garbage collection be equally balanced by new invocations?
5. In linked structures, does unlinking a value from the structure appear to be the reverse of linking a new value into the structure?