
Presentation on theme: "§3 Worst-Case vs. Amortized"— Presentation transcript:

1 §3 Worst-Case vs. Amortized
Motivating example: repeated binary increment. How many bit flips are needed when counting to n? (And what about supporting increment and decrement together?) Counter sequence: 1, 10, 11, 100, 101, …, 111…111. One increment of a value < n costs O(log n), so n increments cost at most Σ_{j<n} log j = Θ(n·log n). But bit #0 flips n times, bit #1 flips n/2 times, bit #2 flips n/4 times, …, so the total is ≤ 2n = Θ(n): a constant (2) per increment. Potential method of analysis: let c_j denote the cost of the j-th operation and Φ_j := number of 1s in the counter after the j-th op (= before the (j+1)-st op). Then
(1/n)·Σ_{1≤j≤n} c_j ≤ max_j (c_j + Φ_j − Φ_{j−1}) + (Φ_0 − Φ_n)/n.
Here Φ_0 = 0 and Φ_j ≥ 0, and the right-hand side equals 2 − Φ_n/n ≤ 2.
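
The Θ(n) claim can be checked directly; a minimal sketch (my own illustrative snippet, not part of the slides) that counts the flips per increment:

```python
# Count the total number of bit flips when incrementing a binary counter n times.
# Each increment flips the trailing 1s to 0 and one 0 to 1 (cost = #trailing 1s + 1).

def total_bit_flips(n: int) -> int:
    flips = 0
    counter = 0
    for _ in range(n):
        trailing_ones = 0
        while (counter >> trailing_ones) & 1:   # bits that flip 1 -> 0
            trailing_ones += 1
        flips += trailing_ones + 1              # plus the single 0 -> 1 flip
        counter += 1
    return flips

if __name__ == "__main__":
    for n in (10, 100, 1000, 10**5):
        print(n, total_bit_flips(n), total_bit_flips(n) / n)  # ratio stays below 2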

2 §3 Amortized Analysis
Motivating example (continued): repeated binary increment; how many bit flips when counting to n? (Increment and decrement?) Definition: fix an implementation A of an abstract data type D with methods M1, M2, …, Mk. Let T(n) denote the worst-case cost of any n-element sequence C of calls C1, C2, …, Cn ∈ {M1, …, Mk}. Then the amortized cost of A is defined as T(n)/n. Do not confuse amortized cost with expected cost or with average-case cost. Potential method of analysis: let c_j denote the cost of the j-th operation and conceive Φ_j such that the right-hand side of
(1/n)·Σ_{1≤j≤n} c_j ≤ max_j (c_j + Φ_j − Φ_{j−1}) + (Φ_0 − Φ_n)/n
is "small".
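
The inequality on this slide follows from a telescoping sum; a short derivation, writing ĉ_j := c_j + Φ_j − Φ_{j−1} for the amortized cost of the j-th operation (the notation ĉ_j is mine):

```latex
\begin{aligned}
\sum_{j=1}^{n} c_j
  &= \sum_{j=1}^{n} \bigl(\hat{c}_j - \Phi_j + \Phi_{j-1}\bigr)
   = \sum_{j=1}^{n} \hat{c}_j + \Phi_0 - \Phi_n
   && \text{(telescoping)}\\
  &\le n \cdot \max_{j}\, \hat{c}_j + \Phi_0 - \Phi_n .
\end{aligned}
```

Dividing by n gives the bound; any Φ with Φ_0 = 0 and Φ_j ≥ 0 makes the last term (Φ_0 − Φ_n)/n ≤ 0.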

3 Worst, Average-Case, Expected, Amortized
Algorithm A, fixed input size n: (chart) the runtime for each input A, B, C, …, K is plotted, with the worst case, the average case (e.g. under the uniform distribution), and the best case marked; all considered asymptotically as n → ∞.

4 Worst, Average-Case, Expected, Amortized
Randomized algorithm A, fixed input X: (chart) the runtime of each trial is plotted, with the worst case, the expected case, and the best case marked; one then takes the worst case over all inputs X of length n, asymptotically as n → ∞.

5 Worst, Average-Case, Expected, Amortized
Data type D, methods M1, …, Mk: (chart) the runtime of each call 1, …, n is plotted, with the worst case, the amortized cost, and the best case marked; the amortized cost is taken over the worst case of all call sequences of length n, per the definition on slide 2.

6 §3 Relaxed Binomial Trees
A relaxed binomial tree of order k ≥ 1 consists of a root with k children, the j-th (j = 1…k) being a relaxed binomial tree of order j−1; a "mark" indicates that child #j may have order j−2 instead. (Figure: B_k built from subtrees B_0, B_1, …, B_{k−1}; operations Merge and Prune.) Lemma: a relaxed binomial tree of order k has ≥ F_{k+2} = Ω(1.6^k) nodes. Compare: a binomial tree of degree k has exactly 2^k nodes; an AVL tree of height k has at least F_{k+3} − 1 nodes. Proof by induction (+ homework): 1 + F_1 + F_2 + … + F_k = F_{k+2} ≥ φ^k, where φ := (1+√5)/2 > 1.6 and F_k are the Fibonacci numbers 1, 1, 2, 3, 5, …
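
A sketch of the induction behind the lemma, filling in the "homework" step. Here N(k) denotes the minimum number of nodes of a relaxed binomial tree of order k (a name not used on the slide); child #j has order at least j−2 (at least 0 for j = 1), and the base cases are N(0) = 1 = F_2 and N(1) = 2 = F_3:

```latex
\begin{aligned}
N(k) &\ge 1 + N(0) + \sum_{j=2}^{k} N(j-2)
      && \text{(root, child \#1, children \#2..\#k)}\\
     &\ge 2 + \sum_{j=2}^{k} F_{j}
      && \text{(induction hypothesis } N(j-2) \ge F_{j})\\
     &= 1 + \sum_{j=1}^{k} F_{j}
      \;=\; F_{k+2}
      \;\ge\; \varphi^{k},
      \qquad \varphi := \tfrac{1+\sqrt{5}}{2} > 1.6 .
\end{aligned}
```

The last two steps use F_1 = 1, the identity stated on the slide, and F_{k+2} = F_{k+1} + F_k ≥ φ^{k−1} + φ^{k−2} = φ^k.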

7 §3 Fibonacci Heaps
(Recap: a relaxed binomial tree of order k ≥ 1 consists of a root with k children, the j-th (j = 1…k) being a relaxed binomial tree of order j−1, where a marked child #j may have order j−2.) A Fibonacci Heap H is a list of t heap-ordered relaxed binomial trees together with a pointer to the minimum. Lemma: a relaxed binomial tree of order k has ≥ F_{k+2} = Ω(1.6^k) nodes. Lemma: a relaxed binomial tree with n nodes has order k ≤ O(log n). Amortized costs: extract min. key O(log n), decrease key O(1), merge two Fibonacci heaps O(1), insert element O(1), create 1-element Fibonacci heap O(1).
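
A minimal Python sketch of this structure (my own naming: Node, FibHeap, insert, meld; not the lecture's reference code), showing why the O(1) operations only touch the root list and the min pointer:

```python
# Sketch of the Fibonacci-heap skeleton from this slide: a list of heap-ordered
# trees plus a pointer to the root holding the minimum key.

class Node:
    def __init__(self, key):
        self.key = key
        self.parent = None
        self.children = []     # roots of the relaxed binomial subtrees
        self.marked = False    # set when this node has lost one child

class FibHeap:
    def __init__(self):
        self.roots = []        # the list of t heap-ordered trees
        self.min = None        # pointer to the root with minimum key
        self.n = 0

    def insert(self, key):
        """O(1): create a 1-element tree and append it to the root list."""
        node = Node(key)
        self.roots.append(node)
        if self.min is None or key < self.min.key:
            self.min = node
        self.n += 1
        return node

    def meld(self, other):
        """Merge two Fibonacci heaps by concatenating their root lists."""
        self.roots.extend(other.roots)
        self.n += other.n
        if other.min is not None and (self.min is None or other.min.key < self.min.key):
            self.min = other.min
        return self

    def find_min(self):
        return None if self.min is None else self.min.key
```

A production implementation keeps the roots in a circular doubly-linked list so that meld is O(1) even in the worst case; the Python list used here is only for brevity. All rebalancing work is deferred to extract-min (next slide).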

8 §3 Extract Minimum
(Recap: a Fibonacci Heap H is a list of t heap-ordered relaxed binomial trees with a pointer to the minimum; a relaxed binomial tree with n nodes has order k ≤ O(log n).) Extract min. key, O(log n) amortized cost: delete the target of the min. pointer (its children become roots of their own trees), merge the two Fibonacci heaps, and consolidate so that each tree order k occurs only once (bucket-sort the roots by order and link trees of equal order, updating the min. pointer). With the potential Φ := Θ(t), Φ_0 = 0, Φ_n ≥ 0, consolidation leaves only O(log n) trees, hence c_j + Φ_j − Φ_{j−1} ≤ O(log n), and by
(1/n)·Σ_{j≤n} c_j ≤ max_j (c_j + Φ_j − Φ_{j−1}) + (Φ_0 − Φ_n)/n
the amortized cost is O(log n).
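
Continuing the FibHeap sketch from the previous slide (same hypothetical names), a rough extract-min with the bucket-by-order consolidation:

```python
def extract_min(heap: "FibHeap"):
    """O(log n) amortized: delete the min root, promote its children, consolidate."""
    target = heap.min
    if target is None:
        return None
    heap.roots.remove(target)
    for child in target.children:         # children become roots ("merge two heaps")
        child.parent = None
        child.marked = False
        heap.roots.append(child)
    heap.n -= 1

    # Consolidate: bucket roots by order (= number of children); link equal-order
    # trees until every order occurs at most once.
    buckets = {}                           # order -> unique root of that order
    for root in heap.roots:
        while len(root.children) in buckets:
            other = buckets.pop(len(root.children))
            if other.key < root.key:       # heap order: smaller key becomes the parent
                root, other = other, root
            other.parent = root
            root.children.append(other)
        buckets[len(root.children)] = root

    heap.roots = list(buckets.values())
    heap.min = min(heap.roots, key=lambda r: r.key, default=None)
    return target.key
```

The number of distinct orders, and hence of surviving roots, is bounded by the maximum order, which is O(log n) by the lemma; this is exactly the quantity the potential Φ = Θ(t) charges for.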

9 §3 Decrease Key
(Recap: a Fibonacci Heap H is a list of t heap-ordered relaxed binomial trees with a pointer to the minimum.) Decrease key, O(1) amortized cost: decrease the key of the given node; if heap order is violated, cut its subtree (it becomes a new root) and mark its parent; if the parent was already marked, reset the mark, cut it as well, and cascade upward. The potential Φ := C·t is no longer sufficient; instead take Φ := Θ(t + 2m), where m is the number of marked nodes. Then Φ_0 = 0, Φ_n ≥ 0, and c_j + Φ_j − Φ_{j−1} ≤ O(1).
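
Continuing the same sketch (hypothetical helper names _cut and _cascading_cut), decrease-key with cascading cuts:

```python
def decrease_key(heap: "FibHeap", node: "Node", new_key):
    """O(1) amortized: lower node.key; cut if heap order is violated, cascading up."""
    assert new_key <= node.key
    node.key = new_key
    parent = node.parent
    if parent is not None and node.key < parent.key:   # heap order violated
        _cut(heap, node, parent)
        _cascading_cut(heap, parent)
    if node.key < heap.min.key:
        heap.min = node

def _cut(heap, node, parent):
    """Detach node from its parent and make it a new root; clear its mark."""
    parent.children.remove(node)
    node.parent = None
    node.marked = False
    heap.roots.append(node)

def _cascading_cut(heap, node):
    """A parent that loses a second child is cut as well; cascade toward the root."""
    parent = node.parent
    if parent is None:                 # roots are never marked
        return
    if not node.marked:
        node.marked = True             # first loss: just mark
    else:
        _cut(heap, node, parent)       # second loss: cut (mark reset in _cut) ...
        _cascading_cut(heap, parent)   # ... and continue cascading upward
```

Each cascading cut removes a mark, so the 2m term in the potential pays for the whole chain of cuts; only the initial cut and at most one new mark are charged to the operation itself.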

10 §3 Cuts, Marks, and Cascading
Decrease key, O(1) amortized cost: cut the subtree and mark the parent; if the parent is already marked, reset its mark, cut it too, and cascade upward. With the potential Φ := C·(t + 2m), Φ_0 = 0, Φ_n ≥ 0, we get c_j + Φ_j − Φ_{j−1} ≤ O(1).
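
A sketch of the accounting behind the O(1) bound (my own constants: a bounds the cost per cut, c is the number of cuts performed by one decrease-key, so c − 1 cascading cuts each unmark a node and at most one node is newly marked):

```latex
\begin{aligned}
&c_j \le a\,(c+1), \qquad \Delta t = c, \qquad \Delta m \le -(c-1) + 1 = 2 - c,\\
&c_j + \Phi_j - \Phi_{j-1}
   \;\le\; a\,(c+1) + C\bigl(\Delta t + 2\,\Delta m\bigr)
   \;\le\; a\,(c+1) + C\,(4 - c)
   \;\le\; a + 4C \;=\; O(1) \quad\text{for } C \ge a .
\end{aligned}
```

The −Cc term from the shrinking mark count pays for the a·c cost of the cascading cuts.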

