Lecture 13: Adversary Arguments (continued). In this lecture, we continue with the adversary argument by giving more examples.


Example 4. Largest and Second Largest

Problem: given a list of n elements, find the 2nd largest. [Reference: S. Baase, Computer Algorithms: Introduction to Design & Analysis, Second Edition, Addison-Wesley, 1991.]

Find the 2nd largest element

Naive algorithm:
- find the maximum element (n-1 comparisons)
- remove it from the list
- find the maximum among those remaining (n-2 comparisons)
Total: 2n - 3 comparisons.
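The naive method can be sketched in Python as follows (the function name is mine, not from the slides):

```python
def second_largest_naive(a):
    """Naive method: n-1 comparisons to find the max,
    then n-2 comparisons among the rest; 2n-3 in total."""
    assert len(a) >= 2
    max_i = 0
    for i in range(1, len(a)):      # n-1 comparisons
        if a[i] > a[max_i]:
            max_i = i
    rest = a[:max_i] + a[max_i + 1:]
    second = rest[0]
    for x in rest[1:]:              # n-2 comparisons
        if x > second:
            second = x
    return second
```

Both passes use exactly the comparisons counted above, so this matches the 2n - 3 total.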

Tournament algorithm

Have the players play in pairs, then the winners play each other, and so on, until a winner is determined. Then the runner-up (2nd largest) is among those defeated by the winner. Why? Because any other player has at least TWO players superior to him/her. (The original slide showed an 8-player bracket here; with n = 8 the winner plays lg 8 = 3 matches.) The winner defeats ceil(lg n) players, and finding the maximum of those takes ceil(lg n) - 1 further comparisons, so the total is n + ceil(lg n) - 2.
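A minimal Python sketch of the tournament method, under the assumption that each surviving player carries the list of players it has beaten (names are mine, not from the slides):

```python
def second_largest_tournament(a):
    """Tournament method: pair up the players; winners advance.
    The runner-up must be among the players the overall winner
    beat, giving n + ceil(lg n) - 2 comparisons in total."""
    # Each entry: (value, list of values this player has beaten)
    players = [(x, []) for x in a]
    while len(players) > 1:
        nxt = []
        for i in range(0, len(players) - 1, 2):
            (u, ub), (v, vb) = players[i], players[i + 1]
            if u > v:
                nxt.append((u, ub + [v]))
            else:
                nxt.append((v, vb + [u]))
        if len(players) % 2:        # odd player out gets a bye
            nxt.append(players[-1])
        players = nxt
    winner, beaten = players[0]
    return max(beaten)              # runner-up lost only to the winner
```

Note that `max(beaten)` scans only the ceil(lg n) players the winner faced, not the whole list.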

A matching lower bound for 2nd largest

We can prove that the algorithm just given is optimal, by an adversary argument similar to the one we saw before. The result is due to S. S. Kislitsyn.

The idea. First, note that any algorithm to find the 2nd largest element must "know" the largest element. (For otherwise the 2nd largest might actually be the largest!) Determining the largest element requires n-1 comparisons. The adversary argument we will give shows that we can always force ceil(lg n) elements to be compared with the maximum. This proves what we want, since the runner-up is the maximum of these ceil(lg n) elements, requiring ceil(lg n) - 1 comparisons to determine.

Adversary strategy:
- All keys are assigned weights w[i].
- Weights are all initialized to 1.
- Adversary replies are based on the weights.

When x is compared to y:

  Weights            Adversary reply   Weight changes
  w[x] > w[y]        x > y             w[x] := w[x] + w[y]; w[y] := 0
  w[x] = w[y] > 0    x > y             w[x] := w[x] + w[y]; w[y] := 0
  w[y] > w[x]        y > x             w[y] := w[y] + w[x]; w[x] := 0
  w[x] = w[y] = 0    consistent        none
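The table above can be sketched as a small Python helper (a hypothetical function of mine, not from the slides; in the all-zero case a full adversary would answer from the history of its earlier replies, which this sketch only signals):

```python
def make_adversary(n):
    """Weight-based adversary for the 2nd-largest lower bound.
    reply(x, y) answers 'which of keys x, y is larger?' so that
    a key's weight can at most double per comparison."""
    w = [1] * n                     # every key starts with weight 1
    def reply(x, y):
        if w[x] > w[y] or (w[x] == w[y] and w[x] > 0):
            w[x] += w[y]; w[y] = 0
            return '>'              # adversary declares x > y
        if w[y] > w[x]:
            w[y] += w[x]; w[x] = 0
            return '<'              # adversary declares y > x
        return 'consistent'         # both weights 0: answer from history
    return reply, w
```

When the algorithm is done, one key holds all n units of weight and every other key holds zero, exactly the terminal state described on the next slide.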

Accumulation of weights

Solving the problem requires all the weight to be accumulated onto one key; all other keys must end with weight zero. Since weight accumulates onto the key of higher weight, a key's weight can at most double with each comparison, so lg n comparisons are required to accumulate all the weight onto the highest-weight key. And in each of these lg n comparisons the loser has non-zero weight, so it has never lost before!

More formal argument

We now show that this adversary strategy forces ceil(lg n) comparisons of max with elements that have never previously lost a comparison. Let w(k) be the weight of max after its k-th comparison. By the adversary strategy, w(k) ≤ 2 w(k-1). It follows that w(k) ≤ 2^k w(0). But w(0) = 1, hence w(k) ≤ 2^k. We end with w(k) = n, hence n ≤ 2^k, and so k ≥ ceil(lg n). But k is the number of comparisons of max against elements that have never previously lost (because they had non-zero weights), so there must be at least ceil(lg n) such comparisons. There must be at least n-1 comparisons to find the maximum, and among the ceil(lg n) elements that lost comparisons to max, we need at least ceil(lg n) - 1 comparisons to find the 2nd largest. Thus the worst-case lower bound is n + ceil(lg n) - 2.

Example 5. Lower bound for Median

The best upper bound is 3n + o(n). The best lower bound is 2n + o(n), by Dor and Zwick (FOCS '96). We prove a 1.5n + O(1) worst-case lower bound here.

If x is the median, we must have the following situation:

      *  *  *    elements > median
       \ | /
         x       median
       / | \
      *  *  *    elements < median

Plan: First, we call a comparison involving x crucial if it is the first comparison where (a) x < y and y ≤ median, OR (b) x > y and y ≥ median. So the algorithm must make n-1 crucial comparisons. We need to show the algorithm also makes (n-1)/2 non-crucial comparisons. The idea is as follows: first, the adversary picks some value (not some element) to be the median. Then it assigns a label in {N, L, S} to each element, and also values, as appropriate. Initially each element is labeled "N" (= Never participated in any comparison). "L" means Larger than the median and "S" means Smaller than the median.

Adversary strategy

The adversary follows this strategy:

  Algorithm compares      Adversary responds
  (N,N)                   assign one element to be larger than the median and
                          one smaller; result is (L,S)
  (S,N) or (N,S)          assign the N-element to be larger than the median;
                          result is (S,L) or (L,S)
  (L,N) or (N,L)          assign the N-element to be smaller than the median;
                          result is (L,S) or (S,L)
  (S,L), (L,S), (S,S),    answer consistently with previously assigned values
  or (L,L)
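The labelling rules in the table can be sketched in Python (a hypothetical helper of mine, not from the slides; for the (L,L) and (S,S) cases a full adversary would consult the numeric values it has assigned, while this sketch just returns a fixed answer):

```python
def median_adversary(n):
    """Sketch of the median adversary's labelling rules.
    Labels: 'N' never compared, 'L' larger than the chosen
    median value, 'S' smaller.  respond(i, j) applies one row
    of the table and answers 'is a[i] > a[j]?'."""
    label = ['N'] * n
    def respond(i, j):
        li, lj = label[i], label[j]
        if li == 'N' and lj == 'N':
            label[i], label[j] = 'L', 'S'   # one above, one below
        elif li == 'N':
            # push the fresh element to the opposite side of its partner
            label[i] = 'L' if lj == 'S' else 'S'
        elif lj == 'N':
            label[j] = 'L' if li == 'S' else 'S'
        # (L,L)/(S,S): a real adversary answers from assigned values
        return '>' if (label[i], label[j]) == ('L', 'S') else '<'
    return respond, label
```

Every call that touches an 'N' label spends one non-crucial comparison, which is exactly what the counting in the next slide exploits.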

Proof, continued

This strategy continues until (n-1)/2 S's (or (n-1)/2 L's) have been assigned. If at some point (n-1)/2 S's have been assigned, then the adversary assigns the remaining elements to be greater than the median, except for one, which IS the median. A similar thing is done if (n-1)/2 L's have been assigned. The last element assigned is always the median. This strategy always forces the algorithm to perform (n-1)/2 non-crucial comparisons, since any time an N-element is compared, a non-crucial comparison is done (except at the very end, when a crucial comparison may be done with the median itself). The least number of comparisons involving N-elements is (n-1)/2. The TOTAL number of comparisons is therefore n - 1 + (n-1)/2 = (3/2)(n-1).

Summary

We have been extremely lucky with the problems we have discussed. For most other problems you will see in the future, lower bounds, even very weak ones, are extremely hard to obtain, if they can be obtained at all. We have learned two powerful methods:
- the adversary argument
- the incompressibility method