Lower bound: Decision tree and adversary argument


Lower bound: decision tree and adversary argument
Lower bound: Ω(f(n)) in the worst case.
Decision tree model. Example: comparison sorting.
Adversary argument. Example: finding the maximum.

Comparison sort
Insertion sort: O(n^2), upper bound in the worst case.
Merge sort: O(n lg n), upper bound in the worst case.
Heapsort: O(n lg n), upper bound in the worst case.
Quicksort: O(n lg n) in the average case.
Question: what is the lower bound for any comparison sort, i.e., at least how many comparisons are needed in the worst case?
It turns out the worst-case lower bound is Ω(n lg n). How do we prove it?
Consequently, merge sort and heapsort are asymptotically optimal comparison sorts.

Decision tree model
Assumptions:
All numbers are distinct (so comparisons of the form a_i = a_j never arise).
All comparisons have the form a_i ≤ a_j (since a_i ≤ a_j, a_i ≥ a_j, a_i < a_j, and a_i > a_j all yield equivalent information).
The model:
A full binary tree.
Each internal node represents a comparison; control flow, data movement, and all other operations are ignored — only comparisons count.
Each leaf represents one possible result (a permutation of the input).
The height (the length of the longest root-to-leaf path) gives the lower bound.

Decision tree model
[Figure: decision tree for sorting three elements <a1, a2, a3>. The root compares 1:2; the next level compares 2:3 and 1:3; the six leaves are the permutations <1,2,3>, <1,3,2>, <2,1,3>, <2,3,1>, <3,1,2>, <3,2,1>.]
An internal node i:j indicates a comparison between a_i and a_j. A leaf <π(1), π(2), π(3)> indicates the ordering a_π(1) ≤ a_π(2) ≤ a_π(3). For the instance <6,8,5>, a path of bold edges traces its sorting. There are 3! = 6 possible permutations (paths) in total.

Lower bound for comparison sort
The longest path is the worst-case number of comparisons, and its length is the height of the decision tree.
Theorem 8.1: Any comparison sort algorithm requires Ω(n lg n) comparisons in the worst case.
Proof: Suppose the height of the decision tree is h, and the number of leaves (i.e., permutations) is n!. Since a binary tree of height h has at most 2^h leaves, n! ≤ 2^h, so h ≥ lg(n!) = Ω(n lg n) (by equation 3.18). That is, any comparison sort needs at least on the order of n lg n comparisons in the worst case.
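The bound h ≥ ⌈lg(n!)⌉ can be tabulated directly. A minimal sketch in Python (an illustration, not part of the slides):

```python
import math

def comparison_sort_lower_bound(n):
    """Minimum worst-case comparisons to sort n distinct keys:
    the decision tree has n! leaves, so its height is >= ceil(lg n!)."""
    return math.ceil(math.log2(math.factorial(n))) if n > 1 else 0

# For n = 3 the bound is 3: a tree of height 2 has only 4 leaves,
# fewer than 3! = 6, so height 2 cannot distinguish all orderings.
for n in [3, 10, 100]:
    print(n, comparison_sort_lower_bound(n))
```

Note that the bound grows like n lg n, matching the Θ(n lg n) comparison counts of merge sort and heapsort.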

Maximum: O(n)
MAXIMUM(A)
  max ← A[1]
  for i ← 2 to length[A]
      do if max < A[i]
             then max ← A[i]
  return max
Running time: O(n); n-1 comparisons are sufficient. Finding the minimum is similar.
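The pseudocode translates directly; a Python sketch that also counts the key comparisons (the counter is added for illustration):

```python
def maximum(a):
    """Linear scan for the maximum, counting key comparisons."""
    comparisons = 0
    best = a[0]
    for x in a[1:]:
        comparisons += 1          # one comparison per remaining element
        if best < x:
            best = x
    return best, comparisons

m, c = maximum([6, 8, 5, 2, 9, 1])
# n - 1 = 5 comparisons suffice for these n = 6 elements.
```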

Lower bound for maximum
n-1 is an upper bound for finding the maximum. What about the lower bound (in the worst case)?
Suppose the n elements are distinct. Then n-1 of them are not the maximum. Every comparison produces only one loser, so at least n-1 comparisons are needed; the lower bound is n-1. This is called the tournament method.
Can the decision tree be used to determine the lower bound for selection? Any element could be the maximum, so there seem to be n leaves (outputs), and the height would be lg n. But taking lg n as the lower bound would be wrong, so the decision-tree argument does not apply here. Why? Because there are really more than n leaves — the same output appears at many duplicate leaves.

Adversary argument
Consider a guessing game between you and a friend. You pick a date, and the friend tries to guess it by asking YES/NO questions. Your goal is to force the friend to ask as many questions as possible. To the question "is it in winter?", your answer should be NO. To the question "is the first letter of the month's name in the first half of the alphabet?", your answer should be YES.
The idea: you never actually pick a date in advance; you construct one according to the questions. The only requirement is that the date you finally produce must be consistent with all the answers you gave. This looks like cheating, but it is a sound way to establish a lower bound.
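The game can be simulated directly. A minimal sketch (my own illustration, not from the slides): the adversary keeps the set of dates consistent with its answers so far and always answers so as to keep the larger half, forcing any questioner to spend at least ⌈lg 365⌉ = 9 questions.

```python
import math

def adversary_play(questions, universe):
    """Answer YES/NO questions, always keeping the larger consistent set."""
    candidates = set(universe)
    count = 0
    for q in questions:
        count += 1
        yes = {d for d in candidates if q(d)}
        no = candidates - yes
        candidates = yes if len(yes) > len(no) else no
        if len(candidates) == 1:  # the adversary is finally pinned down
            break
    return count, candidates

# An optimal questioner: binary search on the date's index, one bit at a time.
dates = range(365)
bit_questions = [lambda d, k=k: (d >> k) & 1 == 1 for k in range(9)]
count, left = adversary_play(reversed(bit_questions), dates)
# Even this questioner is forced to ask ceil(lg 365) = 9 questions.
```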

Adversary argument
Suppose we have an algorithm we think is efficient. Imagine an adversary that tries to prove otherwise. At each point in the algorithm where a decision (i.e., a key comparison) is made, the adversary tells us the result of that decision. The adversary chooses the answer that forces the algorithm to work hard (i.e., make many decisions) — that is, the answer that releases as little new information as possible.
You can think of the adversary as constructing a "bad" input while answering the questions. The only requirement on the answers is that they be internally consistent.
If the adversary can force any algorithm to perform f(n) steps, then f(n) is a lower bound, i.e., at least that many steps are needed in the worst case.

Adversary argument for maximum
The adversary answers queries as if a[i] = i for each i: to the query "is a[i] < a[j]?", it answers YES if and only if i < j.
Suppose an algorithm halts after fewer than n-1 comparisons and claims the maximum is at index k. Then there is at least one more non-loser (an element that never lost a comparison) at some index j ≠ k. The adversary can then demonstrate the algorithm to be incorrect by declaring that the array holds a[i] = i for all i ≠ j and a[j] = n+1, making a[k] < a[j]. A contradiction! So any algorithm must make at least n-1 comparisons to determine the maximum, and the linear-scan algorithm above is optimal.
Other designs do no better: for example, if you first split the elements into n/2 pairs and compare within pairs, then pair up the winners and compare again, and so on, the total number of comparisons is still n-1.
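The a[i] = i adversary is easy to simulate. The sketch below (names are mine, not from the slides) pits it against the pairwise-tournament scheme and confirms that n - 1 comparisons are spent either way, since every comparison eliminates exactly one element:

```python
def tournament_max(n, ask):
    """Find the maximum index by pairwise rounds; ask(i, j) answers
    the query 'is a[i] < a[j]?'."""
    comparisons = 0
    alive = list(range(n))
    while len(alive) > 1:
        nxt = []
        for i in range(0, len(alive) - 1, 2):
            comparisons += 1
            x, y = alive[i], alive[i + 1]
            nxt.append(y if ask(x, y) else x)
        if len(alive) % 2 == 1:   # the odd one out advances for free
            nxt.append(alive[-1])
        alive = nxt
    return alive[0], comparisons

# Adversary: behave as if a[i] = i, so "a[i] < a[j]" iff i < j.
winner, c = tournament_max(16, lambda i, j: i < j)
# Each comparison eliminates one element, so c == 15 == n - 1.
```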

Adversary argument for finding both the maximum and the minimum
Assume n distinct elements, and count each win and each loss as one unit of information. There is one maximum and one minimum, so n-1 elements must lose (for the maximum) and n-1 must win (for the minimum): at least 2n-2 units of information are needed in total.
The adversary answers in a way that gives away as few units of new information as possible with each comparison.

Adversary argument — four statuses
Denote the status of a key at any moment as:
W: has won at least one comparison and never lost
L: has lost at least one comparison and never won
B: has both won and lost at least one comparison
N: has not yet participated in a comparison

Adversary argument — strategy

Status of x, y   Adversary response            New status     Units of new information
N, N             x > y                         W, L           2
W, N             x > y                         W, L           1
B, N             x > y                         B, L           1
L, N             x < y                         L, W           1
W, W             x > y                         W, B           1
W, B             x > y                         W, B           0
B, W             x < y                         B, W           0
B, B             consistent with values        B, B           0
L, B             x < y                         L, B           0
L, L             x > y                         B, L           1

Each W or L is one unit of information; B carries 2 units. Except in the B,B case, the key chosen as winner has never lost a comparison (or the key chosen as loser has never won). Suppose x and y are compared and the adversary chooses x as the winner, where x has never lost: even if the value previously assigned to x is less than y's, the adversary can increase x to beat y without contradicting any of its previous answers.

Adversary argument — proof
Except for the N,N case, every comparison releases at most 1 unit of information. So the best an algorithm can do is to compare two keys not yet involved in any comparison as often as possible.
Suppose n is even: divide the keys into n/2 pairs and compare within pairs, obtaining n units of information from n/2 comparisons. The remaining 2n-2-n = n-2 units require at least n-2 further comparisons, each yielding at most 1 unit. Total: n/2 + n - 2 = 3n/2 - 2.
If n is odd: (n-1)/2 + n - 1 = (3n-3)/2. In one formula: ⌈3n/2⌉ - 2.

Minimum and Maximum: 3 n/2 comparisons MIN-MAX(A) //assume n is odd min maxA[1] for i 2 to length[A] step 2 do if A[i]<A[i+1] then if min>A[i] then min  A[i] if max<A[i+1] then max  A[i+1] else if min> A[i+1] then min  A[i+1] if max<A[i] then max  A[i] return min, max #pairs: n/2 #comparisons/per pair:3 # total comparisons: 3 n/2, i.e., 3n/2 -2. Similarly, write the code for n being even. The # total comparisons is 3n/2-2, i.e., 3n/2 -2. This algorithm achieves optimal.