
1 Self-Adjusting Computation. Umut Acar, Carnegie Mellon University. Joint work with Guy Blelloch, Robert Harper, Srinath Sridhar, Jorge Vittes, and Maverick Woo. (14 January 2004, Workshop on Dynamic Algorithms and Applications.)

2 Dynamic Algorithms. Dynamic algorithms maintain their input-output relationship as the input changes. Example: a dynamic MST algorithm maintains the MST of a graph as the user inserts/deletes edges. Useful in many applications involving interactive systems, motion, ...

3 Developing Dynamic Algorithms, Approach I: dynamic by design. Many papers: Agarwal, Atallah, Bash, Bentley, Chan, Cohen, Demaine, Eppstein, Even, Frederickson, Galil, Guibas, Henzinger, Hershberger, King, Italiano, Mehlhorn, Overmars, Powell, Ramalingam, Roditty, Reif, Reps, Sleator, Tamassia, Tarjan, Thorup, Vitter, ... Efficient algorithms, but they can be complex.

4 Approach II: re-execute the algorithm when the input changes. Very simple and general, but poor performance.

5 Smart Re-execution. Suppose we can identify the pieces of the execution affected by the input change, and re-execute by rebuilding only the affected pieces. [Figure: executions of algorithm A on inputs I and I+δ, with only the differing pieces rebuilt.]

6 Smart Re-execution. Re-execution time = O(distance between the two executions). [Figure: executions of A on I and I+δ; the update time is proportional to the distance between them.]

7 Incremental Computation or Dynamization. General techniques for transforming static algorithms into dynamic ones. Many papers: Alpern, Demers, Field, Hoover, Horwitz, Hudak, Liu, de Moor, Paige, Pugh, Reps, Ryder, Strom, Teitelbaum, Weiser, Yellin, ... The most effective techniques are static dependence graphs [Demers, Reps, Teitelbaum '81] and memoization [Pugh, Teitelbaum '89]. These techniques work well for certain problems.
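As a rough illustration of the memoization idea only (a hypothetical helper, not the cited systems), the sketch below caches results so that a repeated call is answered from the table rather than recomputed:

    (* Memoize an equality-typed function with a simple association list. *)
    fun memoize (f : ''a -> 'b) =
      let
        val table = ref ([] : (''a * 'b) list)
        fun lookup x = List.find (fn (k, _) => k = x) (!table)
      in
        fn x =>
          case lookup x of
            SOME (_, y) => y            (* same call seen before: reuse *)
          | NONE =>
              let val y = f x
              in table := (x, y) :: !table; y end
      end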

8 Bridging the Two Worlds. Dynamization simplifies the development of dynamic algorithms but generally yields inefficient algorithms; algorithmic techniques yield good performance. Can we have the best of both worlds?

9 Our Work. Dynamization techniques: dynamic dependence graphs [Acar, Blelloch, Harper '02] and adaptive memoization [Acar, Blelloch, Harper '04]. Stability: a technique for analyzing performance [ABHVW '04]. It provides a reduction from dynamic to static problems: solving a dynamic problem reduces to finding a stable solution to the corresponding static problem. Example: dynamizing the parallel tree contraction algorithm [Miller, Reif '85] yields an efficient solution to the dynamic trees problem [Sleator, Tarjan '83] [ABHVW SODA '04].

10 Outline. Dynamic dependence graphs; adaptive memoization; applications to sorting; kinetic data structures, with experimental results; retroactive data structures.

11 Dynamic Dependence Graphs. Control dependences arise from function calls.

12 Dynamic Dependence Graphs. Control dependences arise from function calls; data dependences arise from reading and writing memory. [Figure: dependence graph over modifiables a, b, c.]
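The slides do not show the programming interface that produces these edges. As a minimal sketch, the modifiable references used in the quicksort slides below (written modref here, since mod is already taken in Standard ML) can record the reads that depend on each modifiable; those recorded reads are exactly the data-dependence edges that change propagation re-executes:

    (* Minimal sketch of modifiable references. Each modifiable records
       the reads that depend on it; these are the data-dependence edges.
       Control dependences and time stamps are omitted. *)
    datatype 'a contents = Empty | Value of 'a

    type 'a modifiable =
      { value   : 'a contents ref,
        readers : ('a -> unit) list ref }   (* closures to re-run on change *)

    fun modref f =            (* allocate a modifiable, run its writer f on it *)
      let val m = { value = ref Empty, readers = ref [] }
      in f m; m end

    fun write ({ value, ... } : 'a modifiable) v = value := Value v

    fun read ({ value, readers } : 'a modifiable) k =   (* record the edge *)
      (readers := k :: !readers;
       case !value of Value v => k v | Empty => ())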

13-14 Change Propagation. [Figures: change propagation on the dynamic dependence graph (modifiables a, b, c).]

15-17 Change Propagation with Memoization. [Figures: change propagation with memoization on the same graph.]

18 Change Propagation with Adaptive Memoization. [Figure: change propagation with adaptive memoization.]

19 The Internals.
1. Order-maintenance data structure [Dietz, Sleator '87]: time-stamp the vertices of the DDG in sequential execution order.
2. Priority queue for change propagation, with priority = time stamp: re-execute functions in sequential execution order, which ensures that a value is updated before it is read.
3. Hash tables for memoization: remember results from the previous execution only.
4. Constant-time equality tests.
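Taken together, these ingredients support a change-propagation loop roughly like the following sketch (hypothetical names; plain integer time stamps stand in for the Dietz-Sleator order-maintenance structure, and the insertion/deletion of time stamps during re-execution is omitted):

    (* Sketch of the change-propagation loop: affected reads, keyed by
       time stamp, are re-executed in sequential execution order. *)
    type reader = { time : int, run : unit -> unit }

    val queue : reader list ref = ref []      (* kept sorted by time stamp *)

    fun enqueue (r : reader) =
      let
        fun ins [] = [r]
          | ins (x :: xs) =
              if #time r <= #time x then r :: x :: xs else x :: ins xs
      in
        queue := ins (!queue)
      end

    fun propagate () =
      case !queue of
        [] => ()
      | r :: rest => (queue := rest; #run r (); propagate ())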

20 Standard Quicksort

    (* split partitions t into the elements smaller/bigger than pivot h *)
    fun qsort l =
      let
        fun qs (l, rest) =
          case l of
            NIL => rest
          | CONS (h, t) =>
              let
                val (smaller, bigger) = split (h, t)
                val sbigger = qs (bigger, rest)
              in
                qs (smaller, CONS (h, sbigger))
              end
      in
        qs (l, NIL)
      end

21 Dynamic Quicksort

    (* mod allocates a modifiable and runs the body with it as destination;
       read re-runs its body when l changes; write stores the result in d *)
    fun qsort l =
      let
        fun qs (l, rest, d) =
          read (l, fn l' =>
            case l' of
              NIL => write (d, rest)
            | CONS (h, t) =>
                let
                  val (less, bigger) = split (h, t)
                  val sbigger = mod (fn d => qs (bigger, rest, d))
                in
                  qs (less, CONS (h, sbigger), d)
                end)
      in
        mod (fn d => qs (l, NIL, d))
      end

22 Performance of Quicksort. Dynamized quicksort updates its output in:
expected O(log n) time for insertions/deletions at the end of the input,
O(n) time for insertions/deletions at the beginning of the input,
expected O(log n) time for insertions/deletions at a random location.
Other results, for insertions/deletions anywhere in the input: dynamized mergesort, expected O(log n); dynamized insertion sort, expected O(n); dynamized minimum/maximum/sum/..., expected O(log n).

23-25 Function Call Tree for Quicksort. [Figures: the quicksort call tree for the input.]

26 Insertion at the end of the input. [Figure.]

27-28 Insertion in the middle. [Figures.]

29 Insertion at the start, in linear time. [Figure: quicksort call tree for the input 15, 30, 26, 1, 5, 16, 27, 9, 3, 35, 46.]

30 Insertion at the start, in linear time. [Figure: the call trees before and after inserting 20 at the start, giving input 20, 15, 30, 26, 1, 5, 16, 27, 9, 3, 35, 46; the new element becomes the root pivot, so the tree is rebuilt in linear time.]

31 Kinetic Data Structures [Basch, Guibas, Hershberger '99]. Goal: maintain properties of continuously moving objects. Example: a kinetic convex-hull data structure maintains the convex hull of a set of continuously moving objects.

32 Kinetic Data Structures. Run a static algorithm to obtain a proof of the property. Certificate = comparison + failure time. Insert the certificates into a priority queue with priority = failure time. A framework for handling motion [Guibas, Karavelas, Russel, ALENEX 04]:

    while queue ≠ empty do {
      certificate = remove (queue)
      flip (certificate)
      update the certificate set (proof)
    }

33 Kinetic Data Structures via Self-Adjusting Computation. Update the proof automatically with change propagation. A library for kinetic data structures [Acar, Blelloch, Vittes]: quicksort and mergesort, expected O(1); quick hull, Chan's algorithm, and merge hull, expected O(log n).

    while queue ≠ empty do {
      certificate = remove (queue)
      flip (certificate)
      propagate ()
    }
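As a rough sketch of the event queue that both loops assume (hypothetical types; creating and deleting certificates as the proof changes is omitted), certificates can be ordered by failure time and processed earliest first:

    (* Sketch of a kinetic event queue: certificates ordered by failure
       time. flip updates the affected comparison; with self-adjusting
       computation, propagate () would then repair the proof and the
       certificate set. *)
    type certificate = { failure_time : real, flip : unit -> unit }

    fun insert (c : certificate, []) = [c]                (* sorted insert *)
      | insert (c, x :: xs) =
          if #failure_time c <= #failure_time x then c :: x :: xs
          else x :: insert (c, xs)

    fun simulate ([] : certificate list) = ()
      | simulate (c :: rest) =
          (#flip c ();
           (* here: update the certificate set, or call propagate () *)
           simulate rest)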

34 Quick Hull: find min and max. [Figure: points A-P; current list: [A B C D E F G H I J K L M N O P].]

35 Quick Hull: furthest point. [Figure: current list: [A B D F G H J K M O P].]

36 Quick Hull: filter. [Figure: current lists: [[A B F J] [J O P]].]

37 Quick Hull: find left hull. [Figure: current lists: [[A B] [B J] [J O] [O P]].]

38 Quick Hull: done. [Figure: current lists: [[A B] [B J] [J O] [O P]].]

39 Static Quick Hull

    fun findHull (line as (p1, p2), l, hull) =
      let
        val pts = filter l (fn p => Geo.lineside (p, line))
      in
        case pts of
          EMPTY => CONS (p1, hull)
        | _ =>
            let
              val pm   = max (Geo.dist line) l
              val left = findHull ((pm, p2), l, hull)
              val full = findHull ((p1, pm), l, left)
            in
              full
            end
      end

    fun quickHull l =
      let
        val (mx, xx) = minmax (Geo.minX, Geo.maxX) l
      in
        findHull ((mx, xx), l, CONS (xx, NIL))
      end

40 Kinetic Quick Hull

    fun findHull (line as (p1, p2), l, hull, dest) =
      let
        val pts = filter l (fn p => Kin.lineside (p, line))
      in
        read pts (fn pts =>
          case pts of
            NIL => write (dest, CONS (p1, hull))
          | _ =>
              read (max (Kin.dist line) l) (fn pm =>
                let
                  val gr = modr (fn d => findHull ((pm, p2), l, hull, d))
                in
                  findHull ((p1, pm), l, gr, dest)
                end))
      end

    fun quickHull l =
      let
        val (mx, xx) = minmax (Kin.minX, Kin.maxX) l
      in
        modr (fn d =>
          read (mx, xx) (fn (mx, xx) =>
            split ((mx, xx), l, CONS (xx, NIL), d)))
      end

41 Kinetic Quick Hull. [Plot: certificates processed per event vs. input size.]

42 Dynamic and Kinetic Changes. We are often interested in dynamic as well as kinetic changes: inserting and deleting objects, or changing the motion plan (e.g., direction, velocity). These are easily programmed via self-adjusting computation; for example, the kinetic quick hull code is both dynamic and kinetic. Batch changes are supported, and for real-time changes one can maintain partially correct data structures by stopping propagation when time expires (see the sketch below).
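For the real-time case, a sketch of deadline-bounded propagation, reusing the hypothetical queue and reader from the propagation-loop sketch after slide 19 (Time.now and Time.> are from the SML Basis Library):

    (* Stop propagating when the deadline passes, leaving a partially
       updated (partially correct) structure; remaining work stays queued. *)
    fun propagateUntil (deadline : Time.time) =
      case !queue of
        [] => ()
      | r :: rest =>
          if Time.> (Time.now (), deadline) then ()   (* deadline passed *)
          else (queue := rest; #run r (); propagateUntil deadline)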

43 Retroactive Data Structures [Demaine, Iacono, Langerman '04]. The sequence of operations performed on the data structure can be changed after the fact. Example: a retroactive queue allows the user to go back in time and insert or remove an item.

44 Retroactive Data Structures via Self-Adjusting Computation. Dynamize the static algorithm that takes as input the list of operations performed. Example: retroactive queues. Input: a list of insert/remove operations; output: the list of items removed. A retroactive change edits the input list and propagates; a static sketch follows below.
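The static queue simulation the slide refers to might look like the following sketch (hypothetical names; the slide's proposal is to dynamize this function by tracking the operation list with modifiables):

    (* Static simulation of a FIFO queue over its operation history:
       the input is the list of operations, the output is the list of
       items returned by the removes. *)
    datatype 'a oper = Insert of 'a | Remove

    fun simulateQueue ops =
      let
        fun step (Insert x, (pending, removed)) = (pending @ [x], removed)
          | step (Remove, (x :: rest, removed)) = (rest, x :: removed)
          | step (Remove, ([], removed))        = ([], removed)  (* remove on empty *)
        val (_, removed) = foldl step ([], []) ops
      in
        rev removed
      end

    (* Example: simulateQueue [Insert 1, Insert 2, Remove, Insert 3, Remove]
       evaluates to [1, 2]. A retroactive change edits the operation list
       and propagates instead of re-running simulateQueue from scratch. *)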

45 Rake and Compress Trees [Acar, Blelloch, Vittes]. Obtained by dynamizing tree contraction [ABHVW '04]. Experimental analysis: implemented and applied to a broad set of applications (path queries, subtree queries, non-local queries, etc.). For path queries, compared against link-cut trees [Werneck]: structural changes are relatively slow; data changes are faster.

46 Conclusions. Automatic dynamization techniques can yield efficient dynamic and kinetic algorithms and data structures. We gave general-purpose techniques for transforming static algorithms into dynamic and kinetic ones and for analyzing their performance, with applications to kinetic and retroactive data structures; they reduce dynamic problems to static problems. Future work: lots of interesting problems in dynamic, kinetic, and retroactive data structures.

47 Thank you!

