CSE 326: Data Structures Lecture #2 Analysis of Algorithms I (And A Little More About Linked Lists) Henry Kautz Winter 2002.

Assignment #1
Goals of this assignment:
- Introduce the ADTs (abstract data types) for lists and sparse vectors, motivated by an application to information retrieval.
- Show the connection between the empirical runtime scaling of an algorithm and its formal asymptotic complexity.
- Gain experience with the Unix tools g++, make, gnuplot, csh, and awk.
- Learn how to use templates in C++. We will use the new g++ version 3.0 compiler, which does templates right!

Implementing Linked Lists in C
The list (a b c), optionally with a header node:  L -> a -> b -> c -> NULL

    struct node {
        Object element;
        struct node * next;
    };

Everything else is a pointer to a node!

    typedef struct node * List;
    typedef struct node * Position;
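As a concrete sketch of this representation (assuming Object is int; the sum helper is purely illustrative), the struct and typedefs compile as written:

```cpp
#include <cstddef>

// Minimal sketch of the slide's C-style list; Object is assumed to be int.
typedef int Object;

struct node {
    Object element;
    struct node * next;
};

typedef struct node * List;      // a List is just a pointer to the first node
typedef struct node * Position;  // so is a Position

// Walk a list such as (1 2 3) and total its elements.
int sum(List L) {
    int total = 0;
    for (Position p = L; p != NULL; p = p->next)
        total += p->element;
    return total;
}
```

Everything here is bare pointer manipulation; there is no separate List object, which is exactly the pro and con weighed on the next slide.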

Implementing in C++
The list (a b c), with an optional header. Create separate classes for:
- Node
- List (contains a pointer to the first node)
- List Iterator (specifies a position in a list; basically, just a pointer to a node)
Pro: syntactically distinguishes the different uses of node pointers.
Con: a lot of verbiage! Also, is a position in a list really distinct from a list?

Structure Sharing
L = (a b c), M = (b c): M points into the middle of L, so the shared tail (b c) is stored only once.
- An important technique for conserving memory in large lists with repeated structure
- Used in many recursive algorithms on lists

Implementing Linked Lists Using Arrays
"Cursor implementation," Ch 3.2.8
[Figure: an array of 10 cells; the Data row holds the characters F O A R N R T and the Next row holds the indices 3 8 6 4 -1 10 5, with First = 2. Each Next entry is the index of the cell that follows; -1 marks the end of the list.]
Can use the same array to manage a second list of unused cells (a free list).
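A minimal sketch of the cursor idea (the cells used below are illustrative, not the slide's exact figure): array indices play the role of pointers, with -1 as the end marker.

```cpp
#include <string>

// Cursor-implementation sketch: Next[i] holds the index of the cell after i,
// and -1 marks the end of the list. The specific cells used are hypothetical.
const int CAP = 10;
char Data[CAP + 1];   // cell 0 unused, matching the slide's 1-based cells
int  Next[CAP + 1];

// Follow the index chain starting at 'first', collecting stored characters.
std::string collect(int first) {
    std::string out;
    for (int i = first; i != -1; i = Next[i])
        out += Data[i];
    return out;
}
```

A second chain through the same arrays can thread together the unused cells, giving the free list mentioned on the slide.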

Problem? Using the List ADT for a Polynomial ADT
A_i is the coefficient of the x^(i-1) term:
5 + 2x + 3x^2   ( 5 2 3 )
7 + 8x          ( 7 8 )
3 + x^2         ( 3 0 2 )
Here's an application of the list abstract data type as a _data structure_ for another abstract data type. Is there a problem here? Why?

4 + 3x^2001 as a plain coefficient list: ( 4 0 0 ... 0 3 ), i.e. a 4, then 2000 zeroes, then the 3.
What is it about lists that makes this a problem here and not in stacks and queues? (Answer: kth(int)!) Is there a solution? Will we get anything but zeroes overwhelming this data structure?

Sparse List Data Structure: 4 + 3x^2001 stored as (<4 0> <3 2001>), each pair giving a nonzero coefficient and its exponent. This slide is made possible in part by the sparse list data structure. Now, two questions: 1) Is a sparse list really a data structure or an abstract data type? (Answer: it depends, but I lean toward data structure. YOUR ANSWER MUST HAVE JUSTIFICATION!) 2) Which list data structure should we use to implement it: linked lists or arrays?
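As a sketch, the sparse representation can be a list of (coefficient, exponent) pairs; std::vector stands in here for whichever list data structure the class settles on, and eval is an illustrative helper.

```cpp
#include <vector>
#include <utility>
#include <cstddef>
#include <cmath>

// Sparse polynomial: only the nonzero terms, as (coefficient, exponent).
// 4 + 3x^2001 becomes just two pairs instead of 2002 array slots.
typedef std::vector< std::pair<long long, int> > Poly;

double eval(const Poly& p, double x) {
    double s = 0.0;
    for (std::size_t i = 0; i < p.size(); ++i)
        s += p[i].first * std::pow(x, double(p[i].second));
    return s;
}
```

The zeroes never appear at all, so storage is proportional to the number of nonzero terms rather than to the degree.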

Addition of Two Polynomials?
p = 15 + 10x^50 + 3x^1200, stored as (<15 0> <10 50> <3 1200>)
q = 5 + 30x^50 + 4x^100, stored as (<5 0> <30 50> <4 100>)

Addition of Two Polynomials
Similar to merging two sorted lists: one pass!
p = 15 + 10x^50 + 3x^1200   (<15 0> <10 50> <3 1200>)
q = 5 + 30x^50 + 4x^100     (<5 0> <30 50> <4 100>)
r = 20 + 40x^50 + 4x^100 + 3x^1200   (<20 0> <40 50> <4 100> <3 1200>)
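The one-pass merge can be sketched as follows (pairs are (coefficient, exponent), kept in increasing exponent order; the names are illustrative):

```cpp
#include <vector>
#include <utility>
#include <cstddef>

typedef std::vector< std::pair<int, int> > Poly;  // (coefficient, exponent)

// Add two sparse polynomials in one pass, exactly like merging two
// sorted lists: advance whichever side has the smaller exponent.
Poly add(const Poly& p, const Poly& q) {
    Poly r;
    std::size_t i = 0, j = 0;
    while (i < p.size() && j < q.size()) {
        if (p[i].second < q[j].second)      r.push_back(p[i++]);
        else if (q[j].second < p[i].second) r.push_back(q[j++]);
        else {                              // equal exponents: add coefficients
            int c = p[i].first + q[j].first;
            if (c != 0) r.push_back(std::make_pair(c, p[i].second));
            ++i; ++j;
        }
    }
    while (i < p.size()) r.push_back(p[i++]);  // copy whichever
    while (j < q.size()) r.push_back(q[j++]);  // tail remains
    return r;
}
```

Each input term is examined once, so the cost is linear in the number of nonzero terms of p and q combined.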

To ADT or NOT to ADT?
Issue: when to bypass / expand the List ADT?
Using general list operations:

    reverse(x) {
        y = new list;
        while (! x.empty()) {
            /* remove 1st element from x, insert at front of y */
            y.insert_after_kth( x.kth(1), 0 );
            x.delete_kth(1);
        }
        return y;
    }

Disadvantages?

Destructive LL Version
x = (a b c) becomes (c b a), reusing the same nodes:

    reverse(node * x) {
        last = NULL;
        while (x != NULL) {
            tmp = x->next;
            x->next = last;
            last = x;
            x = tmp;
        }
        return last;
    }

Faster in practice? Asymptotically faster?
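A runnable version of this pointer-reversal sketch, with explicit types (int elements assumed):

```cpp
#include <cstddef>

struct node { int element; node * next; };

// Reverse the list in place: no new nodes, one pass, O(n) time, O(1) space.
node * reverse(node * x) {
    node * last = NULL;
    while (x != NULL) {
        node * tmp = x->next;  // save the rest of the list
        x->next = last;        // point this node backwards
        last = x;              // advance 'last'
        x = tmp;               // advance 'x'
    }
    return last;               // the old tail is the new head
}
```

Unlike the ADT version, this destroys the original list by reusing its nodes. Both versions are O(n) in time, so the destructive one is faster in practice (no allocation) but not asymptotically faster.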

Analysis of Algorithms
Analysis of an algorithm gives insight into how long the program runs and how much memory it uses:
- time complexity
- space complexity
Why is this useful?
The input size is indicated by a number n (sometimes there are multiple inputs, e.g. m and n).
The running time is a function of n, such as n, n^2, n log n, or 18 + 3n(log n^2) + 5n^3.

Simplifying the Analysis
Eliminate low-order terms:
- 4n + 5 -> 4n
- 0.5 n log n - 2n + 7 -> 0.5 n log n
- 2^n + n^3 + 3n -> 2^n
Eliminate constant coefficients:
- 4n -> n
- 0.5 n log n -> n log n
- log n^2 = 2 log n -> log n
- log_3 n = (log_3 2) log_2 n -> log n
We didn't get very precise in our analysis of the UW ID info finder; why? We didn't know the machine we'd use. Is this always true? Do you buy that coefficients and low-order terms don't matter? When might they matter? (Linked list memory usage.)
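A quick numeric sanity check of the first simplification (the ratio helper is purely illustrative): the +5 in 4n + 5 becomes negligible as n grows.

```cpp
// Ratio of the full expression 4n + 5 to its leading term 4n.
// For small n the low-order term is visible; for large n it vanishes.
double ratio(long long n) {
    return double(4 * n + 5) / double(4 * n);
}
```

ratio(10) is 1.125, but ratio(1000000) is about 1.00000125: for large inputs the low-order term contributes essentially nothing, which is why asymptotic analysis drops it.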

Order Notation
BIG-O: T(n) = O(f(n)) is an upper bound: there exist constants c and n0 such that T(n) <= c f(n) for all n >= n0.
OMEGA: T(n) = Ω(f(n)) is a lower bound: T(n) >= c f(n) for all n >= n0.
THETA: T(n) = Θ(f(n)) is a tight bound: T(n) is both O(f(n)) and Ω(f(n)).
We'll use some specific terminology to describe asymptotic behavior. There are some analogies here that you might find useful.

Examples
n^2 + 100n = O(n^2) because (n^2 + 100n) <= 2n^2 for all n >= 100
n^2 + 100n = Ω(n^2) because (n^2 + 100n) >= 1 * n^2 for all n >= 0
Therefore: n^2 + 100n = Θ(n^2)
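The constants in this example can be checked mechanically; these illustrative helpers encode the two inequalities with the witnesses c = 2, n0 = 100 (upper bound) and c = 1, n0 = 0 (lower bound).

```cpp
// n^2 + 100n <= 2n^2 reduces to 100n <= n^2, i.e. it holds iff n >= 100.
bool upper_ok(long long n) { return n * n + 100 * n <= 2 * n * n; }

// n^2 + 100n >= 1 * n^2 holds for every n >= 0, since 100n >= 0.
bool lower_ok(long long n) { return n * n + 100 * n >= n * n; }
```

Note that the upper bound genuinely fails below n0 = 100 (e.g. at n = 99), which is exactly why the definition only demands the inequality for all sufficiently large n.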

Notation Gotcha
Order notation is not symmetric: write 2n^2 + 4n = O(n^2), but never O(n^2) = 2n^2 + 4n. The right-hand side is a "crudification" of the left. Likewise O(n^2) = O(n^3) and Ω(n^3) = Ω(n^2), but never the reverse.

Mini-Quiz
True or false?
5n log n = O(n^2)
5n log n = Ω(n^2)
5n log n = O(n)
5n log n = Θ(n log n)

Silicon Downs
Race I, Post #1: n^3 + 2n^2 vs. Post #2: 100n^2 + 1000
Race II, Post #1: n^0.1 vs. Post #2: log n
Welcome, everyone, to the Silicon Downs. I'm getting race results as we stand here. Let's start with the first race. I'll have the first row bet on race #1. Raise your hand if you bet on function #1 (the jockey is n^0.1). And so on. Show the race slides after each race.

Race I: n^3 + 2n^2 vs. 100n^2 + 1000

Race II: n^0.1 vs. log n
Well, log n looked good out of the starting gate and indeed kept on looking good until about n = 10^17, at which point n^0.1 passed it up forever. Moral of the story? n^epsilon beats log n for any epsilon > 0. BUT, which one of these is really better in practice?

Race III: n + 100n^0.1 vs. 2n + 10 log n
Notice that these just look like n and 2n once we get way out; that's because the larger terms dominate. So the left is less, but not asymptotically less. It's a TIE!

Race IV: 5n^5 vs. n!
n! is BIG!!!

Race V: n^-15 * 2^(n/100) vs. 1000n^15
No matter how you put it, any exponential beats any polynomial. It doesn't even take that long here (input size ~250).

Race VI: 8^(2 log n) vs. 3n^7 + 7n
We can rewrite the left-hand term as n^6 (since 8^(2 log n) = 2^(6 log n) = n^6), so both are polynomial and it's an open-and-shut case.

The Losers Win (slower growth = better algorithm!)

Race   Post #1              Post #2           Winner
I      n^3 + 2n^2           100n^2 + 1000     O(n^2)
II     n^0.1                log n             O(log n)
III    n + 100n^0.1         2n + 10 log n     TIE: O(n)
IV     5n^5                 n!                O(n^5)
V      n^-15 * 2^(n/100)    1000n^15          O(n^15)
VI     8^(2 log n) = n^6    3n^7 + 7n         O(n^6)

Common Names
constant: O(1)
logarithmic: O(log n)
linear: O(n)
log-linear: O(n log n)
superlinear: O(n^(1+c)) (c is a constant > 0)
quadratic: O(n^2)
polynomial: O(n^k) (k is a constant)
exponential: O(c^n) (c is a constant > 1)
Well, it turns out that the old Silicon Downs is fixed. They dope up the horses to make the first few laps interesting, but we can always find out who wins. Here's a chart comparing some of the functions. Notice that any exponential beats any polynomial. Any superlinear beats any poly-log-linear. Also keep in mind (though I won't show it) that sometimes the input has more than one parameter, like when you take in two strings. In that case you need to be very careful about what is constant and what can be ignored: O(log m + 2^n) is not necessarily O(2^n).

Analyzing Code
C++ operations: constant time
consecutive statements: sum of times
conditionals: sum of branches, plus the condition
loops: sum of the iterations
function calls: cost of the function body
recursive functions: solve a recurrence relation
All right, I want to do some code examples, but first, how will we examine code? These are just rules of thumb, not the way you always do it. Sometimes, for example, it's important to keep track of which branch of a conditional is followed when (as when analyzing recursive functions). Above all, use your head!

Conditionals
if C then S1 else S2
time <= time(C) + MAX( time(S1), time(S2) )
OK, so this isn't exactly an example, just reiterating the rule: time <= time of C plus the max of S1 and S2, which in turn is <= time of C plus S1 plus S2.
For loops: time <= sum of the times of the iterations, often # of iterations * time of S (or the worst-case time of S).

Nested Loops
    for i = 1 to n do
        for j = 1 to n do
            sum = sum + 1
I spent 5 minutes looking at my notes trying to figure out why this 1 was an i when it wasn't. So, I'm putting variables in blue italics. This example is pretty straightforward: each loop runs n times, with a constant amount of work inside. n * n * 1 = O(n^2).
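The pseudocode transcribes directly (a sketch; the counting function is illustrative):

```cpp
// Each of the n iterations of the outer loop runs the inner loop n times,
// so the body executes exactly n * n times: O(n^2).
long long count_steps(int n) {
    long long sum = 0;
    for (int i = 1; i <= n; ++i)
        for (int j = 1; j <= n; ++j)
            sum = sum + 1;
    return sum;
}
```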


Nested Dependent Loops
    for i = 1 to n do
        for j = i to n do
            sum = sum + 1
There's a little twist here: j goes from i to n, not 1 to n. So let's do the sums. The inside is constant. The inner loop is the sum from j = i to n of 1, which equals n - i + 1. The outer loop is then the sum from i = 1 to n of (n - i + 1), which is the same as the sum from 1 to n of i, i.e. n(n+1)/2, so O(n^2).
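The same transcription for the dependent version confirms the n(n+1)/2 count (illustrative helper):

```cpp
// The inner loop runs n - i + 1 times, so the body executes
// sum_{i=1}^{n} (n - i + 1) = n(n+1)/2 times: still O(n^2).
long long count_steps(int n) {
    long long sum = 0;
    for (int i = 1; i <= n; ++i)
        for (int j = i; j <= n; ++j)
            sum = sum + 1;
    return sum;
}
```

So the dependent bound halves the constant but not the growth rate, which is exactly the coefficient-dropping point from earlier.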



To Do Finish reading Ch 1 and 2 Start reading Ch 3 Get started on assignment #1!