Incremental Computation AJ Shankar CS265 Spring 2003 Expert Topic.


The idea

Sometimes a given computation is performed many times in succession on inputs that differ only slightly from each other. Let's optimize for this situation by using the previous output of the computation when computing the current one. Iteration and recursion are special cases of "repeated computation", so there are lots of gains to be had.

A simple example

Compute the successive sums of each m-element window in an n-element array (with m < n):

for (int i = 0 ; i <= n-m ; i++) {
  for (int j = i ; j < i+m ; j++) {
    sum[i] += ary[j];
  }
}

This is an O(n^2) algorithm.

A simple example, continued

Consider a four-element window. The naïve approach adds up each group of four numbers from scratch. However, all we need to do to compute a new window is subtract the number that is leaving the window and add the number that is entering it!

A simple example, continued

So we can incrementally compute each successive window as follows:

// compute the first window from scratch
for (int j = 0 ; j < m ; j++) {
  sum[0] += ary[j];
}

// incrementally compute each successive window
for (int i = 1 ; i <= n-m ; i++) {
  sum[i] = sum[i-1] - ary[i-1] + ary[i+m-1];
}

This new algorithm is O(n)!
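As a sanity check, the naïve and incremental versions can be compared directly. Here is a minimal Python sketch of both (the function names are my own, not from the slides):

```python
def window_sums_naive(ary, m):
    # Recompute each m-element window sum from scratch: O(n*m).
    n = len(ary)
    return [sum(ary[i:i+m]) for i in range(n - m + 1)]

def window_sums_incremental(ary, m):
    # Reuse the previous window's sum: subtract the element leaving
    # the window, add the element entering it. O(n) total.
    n = len(ary)
    out = [sum(ary[0:m])]          # first window from scratch
    for i in range(1, n - m + 1):
        out.append(out[-1] - ary[i-1] + ary[i+m-1])
    return out
```

Both produce the same list of window sums; only the amount of recomputation differs.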

Our goal

1. Let the user write natural, maintainable code.
2. Discover repeated computations that can be "incrementalized".
3. Generate an incremental version of these computations that is faster than the original.
4. Do this as automatically as possible.

What this problem is not

Incremental algorithms: algorithms explicitly designed to accommodate incremental changes to their inputs. We'd like to study the automated incrementalization of existing code.

Incremental model of computation: code is rendered incremental at run time via function caching, etc. This relies on a run-time mechanism and therefore never explicitly constructs incremental code that can be run by conventional means.

Some history

First introduced by Earley in 1974 as "iterator inversion". Explored further by Paige, Schwartz, and Koenig (1977, 1982): very high level (sets), called "formal differentiation"; for the first time, the possible automation of the technique was discussed. Related is strength reduction, a more specific notion of replacing costly operations with cheap ones (say, * by +).

Incremental computation

Described in a series of papers by Yanhong Annie Liu et al. through the 90s. The first systematic approach to incrementalizing programs written in a common functional language, with a proof of correctness (not covered here).

Formal definition

Let f be a program and let x be an input to f. Let y be a change in the value of x, and let ⊕ be a change operation that combines x and y to produce a new input value x ⊕ y. Let r = f(x), the result of executing f on x. Let f'(x,y,r) be a program that computes f(x ⊕ y) such that

1. f' computes faster than f for almost all x and y
2. f' makes non-trivial use of r

Then f' is an incremental version of f.

An illustrative case

Let sort(x) be selection sort; it takes O(n^2) time for x of length n.

Let sort'(x,y,r) – where r is sort(x) – compute sort(cons(y,x)) by running merge sort on cons(y,x); this takes O(n log n) time but makes no use of r. Not incremental!

Let sort''(x,y,r) compute sort(cons(y,x)) by inserting y into r in O(n) time. Non-trivial use of r; hence, incremental.
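A minimal Python sketch of sort'' (the function name is my own): since r is already sorted, inserting the new element is enough.

```python
import bisect

def sort_incremental(y, r):
    # sort''(x, y, r): given r = sort(x), compute sort(cons(y, x))
    # by inserting y into the sorted previous result r.
    # Finding the slot is O(log n); the list insertion makes the
    # whole step O(n), far cheaper than re-sorting from scratch.
    out = list(r)          # keep the previous result intact
    bisect.insort(out, y)
    return out
```

Note the non-trivial use of r: the entire previous output is reused, and only the new element's position is computed.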

A general systematic transformational approach

Given f and ⊕, derive an incremental program that computes f(x ⊕ y) using:

1. The value of f(x) (Liu, Teitelbaum 1993)
2. The intermediate results of f(x) (Liu, Teitelbaum 1995)
3. Auxiliary information of f(x) (Liu, Stoller, Teitelbaum 1996)

Each successive class of information allows for greater incrementality than the previous one.

P1. Using the previous result

1. Given f(x), introduce f'(x,y,r)
2. Unfold: expand f using the definition of the ⊕ operator
3. Simplify: use basic rewrite rules like car(cons(a,b)) = a
4. Replace using cached result: substitute r when we see f(x) in the expanded function
5. Eliminate dead code

An example

sum(x) = if null(x) then 0 else car(x) + sum(cdr(x))
x ⊕ y = cons(y, x)

1. Introduce f':  sum'(x,y,r) = sum(cons(y,x))
2. Unfold:        sum'(x,y,r) = if null(cons(y,x)) then 0
                                else car(cons(y,x)) + sum(cdr(cons(y,x)))
3. Simplify:      sum'(x,y,r) = if (false) then 0 else y + sum(x)
4. Replace:       sum'(x,y,r) = y + r
5. Eliminate:     sum'(y,r) = y + r

sum(cons(y,x)) takes O(n) time; sum'(y,r) takes O(1) time and one unit of space.
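The derived program can be written down directly. A small Python sketch of both versions (the names sum_list and sum_incremental are mine):

```python
def sum_list(x):
    # The original, non-incremental sum: O(n) per call.
    return 0 if not x else x[0] + sum_list(x[1:])

def sum_incremental(y, r):
    # sum'(y, r) = y + r: the incremental version derived above, O(1).
    return y + r
```

Prepending an element y now costs one addition instead of a full traversal.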

P2. Using intermediate results

In computing f(x), we might calculate some intermediate results that would be useful in computing f(x ⊕ y) but are not retrievable from r.

Recall the successive sums problem: f(2..6) = 6 + f(2..5) // intermediate result from f(1..5)

So let's keep track of all the intermediate results... but there might be a ton of them!

The cache-and-prune method

Stage 1: Construct f* that extends f to return r and all intermediate results.
Stage 2: Incrementalize f* to get f*' as per P1.
Stage 3: Figure out which results are necessary and prune out the rest from f*', yielding f^'.

Our old friend, Fibonacci

fib(x) = if x <= 1 then 1 else fib(x-1) + fib(x-2)

Fibonacci, continued

Note that the standard P1 method will not work – we still have fib(x-2). So, cache-and-prune:

Stage 1: Construct a function that, if run, would return an exponential-sized tree of intermediate results.
Stage 2: Incrementalize this function, noticing that both fib(x-1) and fib(x-2) can be retrieved from the cached tree.
Stage 3: Remove the other unnecessary results (the rest of the tree).

Fibonacci, continued

The final incrementalized version of fib(x) is

fib^'(x) = if x <= 1 then <1, 1>
           else if x = 2 then <2, 1>
           else let r = fib^'(x-1) in <1st(r) + 2nd(r), 1st(r)>

where each pair holds <fib(x), fib(x-1)>. The old fib(x) took O(2^n) time, whereas this version takes O(n) time.
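The pair-returning version is easy to transcribe into Python; a minimal sketch (function names mine):

```python
def fib_pair(x):
    # fib^'(x): returns the pair (fib(x), fib(x-1)), so each step
    # needs only the previous result. O(n) time instead of O(2^n).
    if x <= 1:
        return (1, 1)
    if x == 2:
        return (2, 1)
    r = fib_pair(x - 1)
    return (r[0] + r[1], r[0])

def fib(x):
    # The pruned result: just the first component of the pair.
    return fib_pair(x)[0]
```

This keeps exactly the one extra cached value (fib(x-1)) that Stage 3 determined was necessary.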

P3. Auxiliary information

Let's go even further and discover information that would be useful for computing f(x ⊕ y) but that is never computed in f(x).

Two-phase method:
1. Identify computations in f(x ⊕ y) done only on x that cannot be retrieved from any existing cached data.
2. Determine whether such information would aid in the efficient computation of f(x ⊕ y); if so, compute and store it.

Most of this can be done using techniques from P1 and P2, respectively.

A (complicated) example

cmp(x) = sum(odd(x)) <= prod(even(x))
x ⊕ y = cons(y, x)

When we add y, the odd and even sublists are swapped – we must now take the product of what we used to sum and vice versa. Therefore, the results (even intermediate ones) of the previous computation are useless! So we really want to compute and save the values of sum(even(x)) and prod(odd(x)) too, which can be maintained with a single addition or multiplication each. This is auxiliary information.

Unrolling cmp

cmp(x) = sum(odd(x)) <= prod(even(x))
odd(x) = if null(x) then nil else cons(car(x), even(cdr(x)))
even(x) = if null(x) then nil else odd(cdr(x))

Unroll:

cmp(cons(y,x)) = sum(odd(cons(y,x))) <= prod(even(cons(y,x)))
cmp(cons(y,x)) = sum( if null(cons(y,x)) then nil else cons(car(cons(y,x)), even(cdr(cons(y,x)))) )
                 <= prod( if null(cons(y,x)) then nil else odd(cdr(cons(y,x))) )

Optimizing cmp, continued

Simplify:

cmp(cons(y,x)) = sum(cons(y, even(x))) <= prod(odd(x))

Identify even(x) and odd(x) as computations that depend only on x and can be incrementalized.

cmp*(x) = let v1 = odd(x), u1 = sum(v1), v2 = even(x), u2 = prod(v2)
          in <u1 <= u2, u1, u2, sum(v2), prod(v1)>

The cached tuple holds <res, sum(odd), prod(even), sum(even), prod(odd)>, so

cmp'(y,r) = let <res, so, pe, se, po> = r
            in <(y + se) <= po, y + se, po, so, y * pe>
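A Python sketch of the scheme (all names mine): the cache carries the result plus the two ordinary intermediate values and the two auxiliary values, and prepending an element updates everything in constant time.

```python
def cmp_init(x):
    # cmp*(x): compute the result plus cached information.
    # Cache = (result, sum(odd), prod(even), sum(even), prod(odd)),
    # where odd(x)/even(x) are the odd/even-position sublists.
    odd, even = x[0::2], x[1::2]
    so, se = sum(odd), sum(even)
    po = pe = 1
    for v in odd:
        po *= v
    for v in even:
        pe *= v
    return (so <= pe, so, pe, se, po)

def cmp_incremental(y, r):
    # cmp'(y, r): prepending y swaps the odd/even sublists, so the
    # auxiliary values sum(even) and prod(odd) become the new
    # sum(odd) and prod(even) after one addition/multiplication.
    res, so, pe, se, po = r
    return (y + se <= po, y + se, po, so, y * pe)
```

Without the auxiliary values, prepending y would force a full recomputation of the sum and product from scratch.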

Further work: CACHET (1996)

An interactive programming environment that derives incremental programs from non-incremental ones. Transformations directly manipulate the program tree. Annotations preserve user-specified structure and give direction to the optimizer. Basically a proof of concept.

Further work: recursion to iteration

Using incrementalization to transform general recursion into iteration (1999):
1. Find the base and recursive cases of f.
2. For each recursive case, identify an input increment (e.g., f(x) = 2*f(x-1)) and derive an incremental version.
3. Form the iterative program using generic iteration constructs as appropriate.
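For the toy increment above, the idea is that once we have an incremental step, the recursion can be replaced by a loop that starts from the base case and applies that step. A minimal Python sketch (names mine):

```python
def f_recursive(x):
    # Original recursive definition: f(0) = 1; f(x) = 2 * f(x-1).
    return 1 if x == 0 else 2 * f_recursive(x - 1)

def f_iterative(x):
    # Derived iterative form: begin at the base case and repeatedly
    # apply the incremental step f'(r) = 2 * r, so no call stack
    # is needed.
    r = 1
    for _ in range(x):
        r = 2 * r
    return r
```

The loop visits the same sequence of results the recursion would, but bottom-up instead of top-down.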

Recursion to iteration, continued

The tail recursion optimization may in fact produce slower code than the original recursive function – multiplying small numbers is faster than multiplying large ones, etc.

So far we can generate an additional function that computes an incremental result given a previous result r. This work handles the inlining of the iterative computations, including the hairy bits with multiple base and recursive cases, using associativity, loop contraction, redundant test elimination, and pointer reversal.