
CS216: Program and Data Representation University of Virginia Computer Science Spring 2006 David Evans Lecture 26 http://www.cs.virginia.edu/cs216

Bad News/Good News
Bad News: 54 people listed review as their most-preferred class, but only 1 person sent in any review questions!
Good News: This means we have plenty of time to finish the randomized graph algorithm!

Minimum Cut Problem
Input: an undirected, connected multigraph G = (V, E)
Output: a cut (V1, V2), where V1 ∪ V2 = V and V1 ∩ V2 = ∅, such that the number of edges between V1 and V2 is the fewest possible.
Why might this be useful?
Equivalent: the fewest edges that can be removed to disconnect G.

Minimum Cut
[Example graph with nodes A, B, C, D]
The size of the min cut must be no larger than the smallest node degree in the graph, since cutting all edges incident to a single node already disconnects it.

Internet Minimum Cut
[Image: June 1999 Internet graph, Bill Cheswick]
http://research.lumeta.com/ches/map/gallery/index.html

Randomized Algorithm
While |V| > 2:
    Pick a random edge (x, y) from E
    Contract the edge: keep multi-edges, remove self-loops, combine the nodes
The two remaining nodes represent reasonable choices for the minimum cut sets.
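This contraction procedure is the randomized min-cut algorithm usually attributed to Karger. Below is a minimal Python sketch of one contraction run; the function name contract and the edge-list representation are illustrative assumptions, not part of the course code.

import random

def contract (edges):
    # One run of random edge contraction on a connected multigraph given
    # as a list of (u, v) pairs; returns the edges crossing the final cut.
    edges = [(u, v) for (u, v) in edges if u != v]
    label = {}                       # maps each node to its current super-node
    for u, v in edges:
        label[u] = u
        label[v] = v
    supernodes = len (set (label.values ()))
    while supernodes > 2:
        u, v = random.choice (edges)           # pick a random remaining edge
        keep, absorb = label[u], label[v]
        for x in label:                        # combine the two super-nodes
            if label[x] == absorb:
                label[x] = keep
        # keep multi-edges, but drop edges that have become self-loops
        edges = [(a, b) for (a, b) in edges if label[a] != label[b]]
        supernodes -= 1
    return edges

The two groups of node labels left at the end give the two sides of the cut; the returned edges are the ones that would have to be removed.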

Analysis
Suppose C is a minimum cut (a set of edges whose removal disconnects G).
When we contract edge e, it is unlikely that e ∈ C, so C is likely to be preserved.
What is the probability a randomly chosen edge is in C?

Analysis Is the final result a cut of G? What is the probability we find a minimum cut?

Random Edge in C?
|C| must be ≤ the degree of every node in G.
How many edges in G? |E| = (sum of all node degrees) / 2 ≥ n|C| / 2
Probability a random edge is in C ≤ 2/n

Iteration
How many iterations? n - 2
Probability for first iteration: Prob(e1 ∉ C) ≥ 1 - 2/n
Probability for second iteration: Prob(e2 ∉ C | e1 ∉ C) ≥ 1 - 2/(n-1)
...
Probability for last iteration: Prob(e(n-2) ∉ C) ≥ 1 - 2/(n-(n-2-1)) = 1 - 2/3

Probability of finding C?
≥ (1 - 2/n) * (1 - 2/(n-1)) * (1 - 2/(n-2)) * ... * (1 - 2/3)
= ((n-2)/n) * ((n-3)/(n-1)) * ((n-4)/(n-2)) * ... * (2/4) * (1/3)
= 2 / (n * (n-1))
Probability of not finding C ≤ 1 - 2/(n * (n-1))
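As a sanity check (mine, not from the slides), the telescoping product above can be verified exactly for small n using exact rational arithmetic:

from fractions import Fraction

def prob_find_C (n):
    # product of (1 - 2/j) for j = n, n-1, ..., 3, computed exactly
    p = Fraction (1)
    for j in range (3, n + 1):
        p *= 1 - Fraction (2, j)
    return p

for n in range (3, 10):
    assert prob_find_C (n) == Fraction (2, n * (n - 1))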

Is this good enough?
Probability of not finding C on one trial: ≤ 1 - 2/(n(n-1)) ≤ 1 - 2/n^2
Probability of not finding C on k trials: ≤ [1 - 2/n^2]^k
If k = cn^2, Prob(failure) ≤ (1/e)^c
Recall: lim (x → ∞) (1 - 1/x)^x = 1/e
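Putting the pieces together (again a sketch of my own, reusing the contract helper from the earlier slide): run k = c * n^2 independent trials and keep the smallest cut seen, so the failure probability drops to roughly (1/e)^c.

def min_cut (edges, c=2):
    # Repeat the contraction sketch c * n**2 times; return the best cut found.
    nodes = set ()
    for u, v in edges:
        nodes.add (u)
        nodes.add (v)
    n = len (nodes)
    best = None
    for trial in range (c * n * n):
        cut = contract (edges)
        if best is None or len (cut) < len (best):
            best = cut
    return best

# Two triangles joined by a single bridge edge: the true min cut has size 1.
edges = [(1, 2), (2, 3), (1, 3), (3, 4), (4, 5), (5, 6), (4, 6)]
print len (min_cut (edges))     # almost certainly prints 1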

Review Questions
Explain problem 7 from Problem Set 2.
Explain problem 7 from Problem Set 4, specifically why, since "depth < n", the depth is not considered in the running time.

PS2, Problem 7
Consider this code excerpt that prints out all possible partitions of the list s:

for p1, p2 in allPossiblePartitions (s):
    print p1, p2

Use n to represent the number of elements in s. You may assume print is O(1). What is its asymptotic running time? What is its memory usage?

def allPossiblePartitions (items):
    if len(items) == 1:
        yield [items[0]], []
        yield [], [items[0]]
    else:
        for left, right in allPossiblePartitions (items[1:]):
            lplus = left[:]
            lplus.insert (0, items[0])
            yield lplus, right
            rplus = right[:]
            rplus.insert (0, items[0])
            yield left, rplus
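To see what the generator produces, here is a small example of my own (not from the problem set): on a two-element list it yields all 2^n = 4 ways of splitting the elements between the two sides.

for p1, p2 in allPossiblePartitions (['a', 'b']):
    print p1, p2

# prints:
#   ['a', 'b'] []
#   ['b'] ['a']
#   ['a'] ['b']
#   [] ['a', 'b']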

PS 4, Question 7 What is the asymptotic running time of our htree_encodeChars procedure? You may assume the input string is long enough that the time taken to produce the Huffman encoding tree does not matter (so you do not have to consider the running time of htree_buildTree and htree_unparse in your answer).

htree_encodeChars
void htree_encodeChars (char *s, FILE *outfile) {
    /* first, output the htree encoding */
    htree h;
    h = htree_buildTree (s);
    fprintf (outfile, "%s\n", htree_unparse (h));
    while (*s != '\0') {
        char c = *s++;
        char *bits = htree_encodeChar (h, c);
        fprintf (outfile, "%s", bits);
    }
}

htree_encodeChar
How many calls to htree_encodeChar?

char *htree_encodeChar (htree h, char c) {
    if (h == NULL) {
        return NULL;
    } else if (htree_isLeaf (h)) {
        if (h->letter == c) {
            ... /* elided for now */
        }
    } else {
        char *res = htree_encodeChar (h->left, c);
        if (res == NULL) {
            res = htree_encodeChar (h->right, c);
        }
        return res;
    }
}

htree_encodeChar
    } else if (htree_isLeaf (h)) {
        if (h->letter == c) {
            int depth = 0;
            char *res;
            htree ht = h;
            /* first pass: count how deep this leaf is in the tree */
            while (ht->parent != NULL) {
                depth++;
                ht = ht->parent;
            }
            res = (char *) malloc (sizeof (*res) * (depth + 1));
            if (res == NULL) { ...; exit (EXIT_FAILURE); }
            res[depth] = '\0';
            /* second pass: walk back up, recording '0' for a left child
               and '1' for a right child */
            ht = h->parent;
            while (depth > 0) {
                if (ht->left == h) {
                    res[depth - 1] = '0';
                } else if (ht->right == h) {
                    res[depth - 1] = '1';
                } else {
                    fprintf (stderr, "Error! Bad tree!");
                    ...;
                }
                h = ht;
                ht = ht->parent;
                depth--;
            }
            return res;
        } else {
            return NULL;
        }

Charge
Final Exam out now
Return it before 4:59pm on Monday, May 8
Turn it in to me at my office or in the folder in Brenda Perkins’ office (front of Olsson Hall)