Entropy as Computational Complexity: Computer Science Replugged
Tadao Takaoka, Department of Computer Science, University of Canterbury, Christchurch, New Zealand

General framework and motivation
If the given problem is partially solved, how much more time is needed to solve the problem completely?
[Diagram: the algorithm reads the input data; partially solved data are saved and later recovered to produce the output. Time = time already spent + time still to be spent.]
A possible scenario: suppose the computer is stopped by a power cut and the partially solved data are saved by battery. After a while the power is back on, the data are recovered, and the computation resumes. How much more time is needed? Estimate it from the partially solved data.

Entropy
Thermodynamics: k log_e(Q), where Q is the number of states in the closed system and k is the Boltzmann constant.
Shannon's information theory: -Σ_{i=1..n} p_i log(p_i), where n = 26 for English.
Our entropy: an algorithmic entropy that describes computational complexity.

Definition of entropy
Let X be the data set, decomposed as S(X) = (X_1, …, X_k), where each X_i is solved. S(X) is a state of the data, abbreviated as S. Let p_i = |X_i|/|X| and |X| = n. The entropy H(S) is defined by
H(S) = Σ_{i=1..k} |X_i| log(|X|/|X_i|) = -n Σ_{i=1..k} p_i log(p_i).
Since Σ p_i = 1, we have 0 ≤ H(S) ≤ n log(k), with the maximum attained when all p_i are equal to 1/k.
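As a concrete illustration (my own code, not from the slides; the function name, the example data, and the choice of base-2 logarithms are assumptions), a minimal Python sketch of this definition:

import math

def entropy(decomposition):
    # H(S) = sum_i |X_i| * log(|X|/|X_i|) for the state S(X) = (X_1, ..., X_k)
    n = sum(len(x) for x in decomposition)
    return sum(len(x) * math.log2(n / len(x)) for x in decomposition)

# Two states of the same 8 elements: the finer decomposition has higher entropy.
print(entropy([[1, 2], [3, 4], [5, 6], [7, 8]]))   # 8 * log2(4) = 16.0
print(entropy([[1, 2, 3, 4, 5, 6], [7, 8]]))       # about 6.49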

Amortized analysis
The accounting equation at the i-th operation is a_i = t_i - ΔH(S_i), i.e., actual time minus the decrease of entropy, where ΔH(S_i) = H(S_{i-1}) - H(S_i). Let T and A be the actual total time and the amortized total time. Summing a_i for i = 1, …, N, we have A = T + H(S_N) - H(S_0), or T = A + H(S_0) - H(S_N). In many applications A = 0 and H(S_N) = 0, meaning T = H(S_0). For some applications a_i = t_i - cΔH(S_i) for some constant c.
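Spelling out the summation (my notation, following the definitions above), it telescopes:

\[ A = \sum_{i=1}^{N} a_i = \sum_{i=1}^{N} t_i - \sum_{i=1}^{N} \bigl( H(S_{i-1}) - H(S_i) \bigr) = T - \bigl( H(S_0) - H(S_N) \bigr) = T + H(S_N) - H(S_0). \]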

Problems analyzed by entropy
Minimal mergesort: merge the shortest ascending runs first.
Shortest path problem: solid parts solved; make a single shortest-path spanning tree from the source.
Minimum spanning tree: solid parts solved; make a single tree.

Three examples with k = 3
(1) Minimal mergesort, Time = O(H(S)). The X_i are ascending runs: S(X) = (X_1, X_2, X_3) with X_1 = (2 5 6), X_2 = (1 4 7), X_3 = (3 8 9).
(2) Shortest paths for nearly acyclic graphs, Time = O(m + H(S)). G = (V, E) is a graph and S(V) = (V_1, V_2, V_3), where each V_i is an acyclic component dominated by a trigger vertex v_i.
(3) Minimum spanning tree, Time = O(m + H(S)). S(V) = (V_1, V_2, V_3), where the subgraph G_i = (V_i, E_i) is the graph induced by V_i. We assume the minimum spanning tree T_i for each G_i has already been obtained.
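For example (1), taking base-2 logarithms (my choice; the slides leave the base unspecified), the entropy of this state works out to

\[ H(S) = \sum_{i=1}^{3} |X_i| \log \frac{|X|}{|X_i|} = 3\log\frac{9}{3} + 3\log\frac{9}{3} + 3\log\frac{9}{3} = 9 \log 3 \approx 14.3, \]

which equals the maximum n log(k), since all three runs have the same length.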

Time complexities of the three problems
Minimal mergesort: O(H(S)); worst case O(n log(n)).
Single-source shortest paths: O(m + H(S)); worst-case time O(m + n log(n)); data structures: Fibonacci heap or 2-3 heap.
Minimum-cost spanning trees: O(m + H(S)), with the presort of edges, O(m log(n)), excluded; worst-case time O(m + n log(n)).

Which is more sorted?
[Two states are pictured: one with entropy H(S) = n log(k), k = 4, and one with entropy H(S) = O(n); the second, lower-entropy state is more sorted.]

Minimal mergesort picture
[Diagram: S(X) is meta-sorted into S'(X), which becomes the list L; lists are moved from L into the working list M, and the two front lists W_1 and W_2 of M are merged into W.]

Minimal Mergesort
(M ⇐ L : the first list of L is moved to the end of M)
Meta-sort S(X) into S'(X) by the lengths of the X_i;
Let L = S'(X); /* L : list of lists */
M = φ; M ⇐ L; /* M : list of lists */
if L is not empty then M ⇐ L;
for i = 1 to k-1 do begin
  W_1 ⇐ M; W_2 ⇐ M;
  W = merge(W_1, W_2);
  while L ≠ φ and |W| > |first(L)| do M ⇐ L;
  M ⇐ W /* the merged list W is appended to the end of M */
end
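A runnable Python sketch of the same shortest-runs-first idea (my own formulation using the standard two-queue technique, not a literal transcription of the pseudocode above; it also uses a plain linear merge rather than the Brown-Tarjan merge, so it illustrates the control flow, not the O(H(S)) bound):

from collections import deque

def merge(a, b):
    # plain linear merge of two ascending lists (not the Brown-Tarjan merge)
    out, i, j = [], 0, 0
    while i < len(a) and j < len(b):
        if a[i] <= b[j]:
            out.append(a[i]); i += 1
        else:
            out.append(b[j]); j += 1
    return out + a[i:] + b[j:]

def minimal_mergesort(runs):
    # L holds the runs meta-sorted by length; M holds merged results,
    # which are produced in nondecreasing length, so both queues stay sorted.
    L, M = deque(sorted(runs, key=len)), deque()
    def pop_shortest():
        if not M or (L and len(L[0]) <= len(M[0])):
            return L.popleft()
        return M.popleft()
    while len(L) + len(M) > 1:
        M.append(merge(pop_shortest(), pop_shortest()))
    return (M or L)[0]

print(minimal_mergesort([[2, 5, 6], [1, 4, 7], [3, 8, 9]]))
# -> [1, 2, 3, 4, 5, 6, 7, 8, 9]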

Merge algorithm for lists of lengths m and n (m ≤ n) in time O(m log(1 + n/m)), by Brown and Tarjan.
Lemma. The amortized time for the i-th merge is a_i ≤ 0.
Proof. Let |W_1| = n_1 and |W_2| = n_2. Then
ΔH = n_1 log(n/n_1) + n_2 log(n/n_2) - (n_1 + n_2) log(n/(n_1 + n_2))
   = n_1 log(1 + n_2/n_1) + n_2 log(1 + n_1/n_2)
t_i ≤ O(n_1 log(1 + n_2/n_1) + n_2 log(1 + n_1/n_2))
a_i = t_i - cΔH ≤ 0.

Main results in minimal mergesort
Theorem. Minimal mergesort sorts the sequence S(X) in O(H(S)) time; if pre-scanning is included, O(n + H(S)).
Theorem. Any sorting algorithm takes Ω(H(S)) time if |X_i| ≥ 2 for all i. If |X_i| = 1 for all i, then S(X) is reverse-sorted, and it can be sorted in ascending order in O(n) time.

Minimum spanning trees
[Diagram: partial minimum spanning trees T_1, T_2, T_3, with the remaining sorted edge list L and the name array shown before and after a connecting step; blue font marks vertices.]

Kruskal's completion algorithm
Let the sorted edge list L be partially scanned;
Minimum spanning trees for G_1, …, G_k have been obtained;
for i = 1 to k do for v in V_i do name[v] := i;
while k > 1 do begin
  Remove the first edge (u, v) from L;
  if u and v belong to different sub-trees T_1 and T_2 then begin
    Connect T_1 and T_2 by (u, v);
    Change the names of the nodes in the smaller tree to that of the larger tree;
    k := k - 1
  end
end.
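A Python sketch of this completion step (my own code and example data; it keeps the member list of each named component explicitly so that renaming the smaller tree is a simple loop):

def kruskal_completion(sorted_edges, components):
    # components: vertex sets V_1, ..., V_k whose minimum spanning trees are known
    # sorted_edges: the remaining, already sorted edge list L as (weight, u, v)
    name, members = {}, {}
    for i, comp in enumerate(components):
        members[i] = list(comp)
        for v in comp:
            name[v] = i
    tree_edges, k, edges = [], len(components), iter(sorted_edges)
    while k > 1:
        w, u, v = next(edges)               # remove the first edge (u, v) from L
        a, b = name[u], name[v]
        if a != b:                          # u and v lie in different sub-trees
            tree_edges.append((u, v))       # connect the two trees by (u, v)
            if len(members[a]) < len(members[b]):
                a, b = b, a                 # a is now the larger tree
            for x in members[b]:            # rename the nodes of the smaller tree
                name[x] = a
            members[a].extend(members.pop(b))
            k -= 1
    return tree_edges

# Hypothetical example: three solved components and the leftover sorted edges.
print(kruskal_completion([(1, 1, 2), (2, 3, 4), (3, 0, 5)],
                         [{0, 1}, {2, 3}, {4, 5}]))   # -> [(1, 2), (3, 4)]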

Entropy analysis for name changes
The decrease of entropy is
ΔH = n_1 log(n/n_1) + n_2 log(n/n_2) - (n_1 + n_2) log(n/(n_1 + n_2))
   = n_1 log(1 + n_2/n_1) + n_2 log(1 + n_1/n_2) ≥ min{n_1, n_2}.
Noting that t_i ≤ min{n_1, n_2}, the amortized time becomes a_i = t_i - ΔH(S_i) ≤ 0.
Scanning L takes O(m) time. Thus T = O(m + H(S_0)), where H(S_0) is the initial entropy.
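A one-line check of the last inequality (assuming base-2 logarithms and, without loss of generality, n_1 ≤ n_2):

\[ \Delta H \ge n_1 \log\Bigl(1 + \frac{n_2}{n_1}\Bigr) \ge n_1 \log 2 = n_1 = \min\{n_1, n_2\}. \]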

Single source shortest paths: expansion of the solution set
S : solution set of vertices to which shortest distances are known
F : frontier set of vertices connected from the solution set by single edges
[Diagram: source s, solution set S, frontier F, and edges (v, w) leading from S into F.]
Time = O(m + n log n)

Priority queue
F is maintained in a Fibonacci heap or a 2-3 heap:
delete-min O(log n), decrease-key O(1), insert O(1).

Dijkstra's algorithm for shortest paths with a priority queue (heap), time = O(m + n log(n))
d[s] = 0; S = {s}; F = {w | (s,w) in out(s)}; d[v] = c[s,v] for all v in F;
while |S| < n do
  delete v from F such that d[v] is minimum; // delete-min, O(log n)
  add v to S;
  for w in out(v) do
    if w is not in S then
      if w is in F then d[w] = min{d[w], d[v] + c[v,w]} // decrease-key, O(1)
      else { d[w] = d[v] + c[v,w]; add w to F } // insert, O(1)
end do
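A runnable Python sketch of this algorithm (my own code and example data). Python's heapq has no decrease-key, so the sketch uses the common lazy-deletion workaround and therefore runs in O(m log n) rather than the O(m + n log(n)) obtainable with a Fibonacci or 2-3 heap:

import heapq

def dijkstra(n, out, c, s):
    # out[v]: out-neighbours of v; c[(v, w)]: cost of edge (v, w)
    INF = float('inf')
    d = [INF] * n
    d[s] = 0
    in_S = [False] * n
    F = [(0, s)]                       # priority queue over the frontier
    while F:
        dv, v = heapq.heappop(F)       # stand-in for delete-min
        if in_S[v]:
            continue                   # stale entry left by lazy deletion
        in_S[v] = True                 # add v to S
        for w in out[v]:
            if not in_S[w] and dv + c[(v, w)] < d[w]:
                d[w] = dv + c[(v, w)]  # stand-in for decrease-key / insert
                heapq.heappush(F, (d[w], w))
    return d

# Hypothetical 4-vertex example.
out = {0: [1, 2], 1: [3], 2: [1, 3], 3: []}
c = {(0, 1): 4, (0, 2): 1, (2, 1): 2, (2, 3): 6, (1, 3): 1}
print(dijkstra(4, out, c, 0))          # -> [0, 3, 1, 4]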

Sweeping algorithm
Let v_1, …, v_n be topologically sorted.
d[v_1] = 0; for i = 2 to n do d[v_i] = ∞;
for i = 1 to n do
  for w in out(v_i) do d[w] = min{d[w], d[v_i] + c[v_i, w]}
Time = O(m)
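A Python sketch of the sweep (my own code and example data; the vertices are assumed to be given already in topological order, with the source first):

def sweep(order, out, c):
    # order: vertices in topological order, with the source order[0] first
    INF = float('inf')
    d = {v: INF for v in order}
    d[order[0]] = 0
    for v in order:                    # single pass in topological order
        if d[v] == INF:
            continue                   # not reachable from the source
        for w in out[v]:
            d[w] = min(d[w], d[v] + c[(v, w)])
    return d

out = {'s': ['a', 'b'], 'a': ['b', 't'], 'b': ['t'], 't': []}
c = {('s', 'a'): 2, ('s', 'b'): 5, ('a', 'b'): 1, ('a', 't'): 7, ('b', 't'): 1}
print(sweep(['s', 'a', 'b', 't'], out, c))   # -> {'s': 0, 'a': 2, 'b': 3, 't': 4}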

Efficient shortest path algorithm for nearly acyclic graphs
d[s] = 0; S = {s}; F = {w | (s,w) in out(s)}; // F is organized as a priority queue
while |S| < n do
  if there is a vertex in F with no incoming edge from V - S
    then choose such a vertex v // easy vertex, O(1)
    else choose v from F such that d[v] is minimum; // difficult vertex, find-min, O(log n)
  add v to S;
  delete v from F; // delete
  for w in out(v) do
    if w is not in S then
      if w is in F then d[w] = min{d[w], d[v] + c[v,w]} // decrease-key
      else { d[w] = d[v] + c[v,w]; add w to F } // insert, O(1)
end do
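A runnable Python sketch of this idea (my own code and example data, with two simplifications: the priority queue is a binary heap with lazy deletion rather than a Fibonacci or 2-3 heap, and the number of incoming edges from V - S is maintained explicitly so that easy vertices are detected in O(1)):

import heapq

def nearly_acyclic_sssp(n, out, inc, c, s):
    # out[v]: out-neighbours, inc[v]: in-neighbours, c[(v, w)]: cost of edge (v, w)
    INF = float('inf')
    d = [INF] * n
    d[s] = 0
    in_S = [False] * n
    in_F = [False] * n
    remaining = [len(inc[v]) for v in range(n)]   # incoming edges from V - S
    heap = []                                     # difficult candidates (lazy deletion)
    easy = []                                     # frontier vertices with remaining == 0

    def settle(v):                                # add v to S and relax out(v)
        in_S[v] = True
        in_F[v] = False
        for w in out[v]:
            remaining[w] -= 1
            if not in_S[w]:
                in_F[w] = True
                if d[v] + c[(v, w)] < d[w]:
                    d[w] = d[v] + c[(v, w)]
                    heapq.heappush(heap, (d[w], w))   # decrease-key / insert stand-in
                if remaining[w] == 0:
                    easy.append(w)                # w has become an easy vertex

    settle(s)
    while easy or heap:
        if easy:
            v = easy.pop()                        # easy vertex, O(1)
        else:
            dv, v = heapq.heappop(heap)           # difficult vertex, find-min
            if in_S[v] or dv > d[v]:
                continue                          # stale heap entry
        settle(v)
    return d

# Hypothetical example: the cycle 1 -> 2 -> 3 -> 1 makes vertex 1 a difficult vertex.
out = {0: [1, 2], 1: [2], 2: [3], 3: [1]}
inc = {0: [], 1: [0, 3], 2: [0, 1], 3: [2]}
c = {(0, 1): 1, (0, 2): 4, (1, 2): 1, (2, 3): 1, (3, 1): 10}
print(nearly_acyclic_sssp(4, out, inc, c, 0))     # -> [0, 1, 2, 3]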

Easy vertices and difficult vertices
[Diagram: solution set S, frontier F, and source s; a frontier vertex whose incoming edges all come from S is easy, while one with an incoming edge from outside S is difficult.]

Nearly acyclic graph
Acyclic components are regarded as solved. There are three acyclic components in this graph. Vertices in the component V_i can be deleted from the queue once the distance to its trigger v_i is finalized.
[Diagram: components V_i with trigger vertices, each an acyclic graph that is topologically sorted.]

Entropy analysis of the shortest path algorithm
Generalization: delete-min = (find-min, delete) becomes (find-min, delete, …, delete).
There are t difficult vertices u_1, …, u_t. Each u_i and the easy vertices that follow it form an acyclic sub-graph. Let {v_1, …, v_k}, where v_1 = u_i for some i, be the vertices deleted for one of these acyclic sub-graphs, and let the number of descendants of v_j in the heap be n_j. Deleting v_1, …, v_k takes time log(n_1) + … + log(n_k) ≤ O(k log(n/k)). Indexing k by i for u_i, the total time for the deletes is O(k_1 log(n/k_1) + … + k_t log(n/k_t)) = O(H(S)), where S(V) = (V_1, …, V_t). The O(t log n) time for the t find-mins is absorbed in O(H(S)). The rest of the time is O(m). Thus the total time is O(m + H(S)).
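One way to see the bound on a single batch of deletes, assuming the deleted subtrees are disjoint so that n_1 + … + n_k ≤ n, is the concavity of the logarithm:

\[ \log n_1 + \cdots + \log n_k \;\le\; k \log \frac{n_1 + \cdots + n_k}{k} \;\le\; k \log \frac{n}{k}. \]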