Presentation on theme: "Loomis’ Theorem" — Presentation transcript:

1 Loomis’ Theorem
For any 2-person zero-sum game with payoff matrix $M$, we have
$$\max_{p}\min_{j}\; p^{\top} M e_{j} \;=\; \min_{q}\max_{i}\; e_{i}^{\top} M q,$$
where $p$ and $q$ range over the mixed strategies of the row player and the column player, respectively; the common value is the value of the game. 2018/11/29

2 Loomis’ Theorem (same statement, repeated)

3 Yao’s interpretation The row player = the maximizer = the adversary responsible for designing malicious inputs. The column player = the minimizer = the algorithm designer responsible for designing efficient algorithms.

4 Pure strategies For the column player (minimizer): each pure strategy corresponds to a deterministic algorithm. For the row player (maximizer): each pure strategy corresponds to a particular input instance.

5 Mixed strategies For the column player (minimizer): each mixed strategy corresponds to a randomized algorithm. For the row player (maximizer): each mixed strategy corresponds to a probability distribution over all the input instances.

6 Yao’s interpretation for Loomis’ Theorem
Let $T(I, A)$ denote the running time of deterministic algorithm $A$ on input instance $I$. By Loomis’ Theorem, we have
$$\max_{p}\,\min_{A}\; \mathrm{E}\!\left[T(I_p, A)\right] \;=\; \min_{q}\,\max_{I}\; \mathrm{E}\!\left[T(I, A_q)\right],$$
where $I_p$ is an input drawn from distribution $p$ and $A_q$ is a deterministic algorithm drawn from distribution $q$.

7 Yao’s Inequality
Let $\Pi$ be a problem. For any distribution $p$ over the input instances of $\Pi$ and any Las Vegas randomized algorithm $R$ for $\Pi$, we have
$$\min_{A}\; \mathrm{E}\!\left[T(I_p, A)\right] \;\le\; \max_{I}\; \mathrm{E}\!\left[T(I, R)\right],$$
where $A$ ranges over the deterministic algorithms for $\Pi$.
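The inequality can be sanity-checked numerically: for any input distribution p and any algorithm distribution q over a cost matrix, the left-hand side never exceeds the right-hand side. A sketch with a hypothetical 3×3 cost matrix:

```python
import random

# Hypothetical cost matrix: M[i][j] = cost of deterministic algorithm j
# on input instance i.
M = [[3, 1, 4],
     [1, 5, 9],
     [2, 6, 5]]

random.seed(0)

def rand_dist(k):
    # A random probability distribution over k pure strategies.
    w = [random.random() for _ in range(k)]
    s = sum(w)
    return [x / s for x in w]

# Yao's inequality (weak duality): for every input distribution p and
# every algorithm distribution q,
#   min_j E_{i~p}[M[i][j]]  <=  max_i E_{j~q}[M[i][j]].
for _ in range(1000):
    p, q = rand_dist(3), rand_dist(3)
    lhs = min(sum(p[i] * M[i][j] for i in range(3)) for j in range(3))
    rhs = max(sum(q[j] * M[i][j] for j in range(3)) for i in range(3))
    assert lhs <= rhs + 1e-9
```

The direction checked here is exactly the one used in lower-bound arguments: a good input distribution p forces every deterministic algorithm (hence every randomized one) to be slow.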

8 A comment The two different topics, “probabilistic analysis of deterministic algorithms” and “randomized algorithms”, are connected by Yao’s Principle.

9 How to use Yao’s Lemma? Task 1: Design a probability distribution p for the input instances. Task 2: Obtain a lower bound on the expected running time of any deterministic algorithm running on Ip.

10 A small first test of Yao’s Principle
A lower bound Ω(n^{0.694}) on the expected running time of any Las Vegas algorithm for evaluating NOR circuits

11 Task 1: designing Ip [Figure: a NOR circuit with leaves x1, …, x8 and output y]

12 An Ip
Let each leaf of the NOR circuit be independently set to 1 with probability p = (3 − √5)/2 ≈ 0.382.

13 Interestingly, the output of each NOR gate of such a circuit is also 1 with probability (3 − √5)/2, since (1 − p)² = p for p = (3 − √5)/2. That is, every gate of the circuit, not only the leaves, evaluates to 1 with the same probability p.

14 Depth-first evaluation
Recall that we can focus on deterministic depth-first evaluation algorithms. Let A be an arbitrary algorithm of this kind. Let W(k) be the expected time required for A to evaluate the circuit on Ip with n = 2^k leaves. So, W(k) = W(k − 1) + (1 − p) W(k − 1), implying W(k) = Θ((2 − p)^k) = Θ(n^{0.694}).
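A quick numerical check of this recurrence and the exponent (normalizing W(0) = 1):

```python
import math

# Fixed point of the NOR gate: each leaf is 1 with probability p,
# and a NOR gate then outputs 1 with probability (1 - p)^2 = p.
p = (3 - math.sqrt(5)) / 2          # ~0.382

# Unfold W(k) = W(k-1) + (1 - p) * W(k-1), with W(0) = 1.
W = [1.0]
for k in range(1, 31):
    W.append(W[-1] + (1 - p) * W[-1])

# With n = 2^k leaves, W(k) = (2 - p)^k = n^{log2(2 - p)} ~ n^0.694,
# since 2 - p = (1 + sqrt(5))/2 is the golden ratio.
exponent = math.log2(2 - p)
```

The exponent log2(2 − p) ≈ 0.694 is where the bound Ω(n^{0.694}) comes from.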

15 Homework 2
Problem: Consider the 3-bit majority function and the h-level majority circuit built from it (next two slides). Analyze the expected number of leaves that a randomized evaluation algorithm has to read.

16 Majority function [Figure: a majority gate with inputs x, y, z and output w, and its truth table]

17 h-level majority circuit

18 Median selection in O(n) time

19 The problem definition
Input: a (multi)set X of n numbers. Output: the i-th largest number of X. [Example: X = {4, 3, 5, 8, 1, 9, 2, 6, 7}]

20 How hard is median selection?
[Blum et al. STOC’72 & JCSS’73] A “shining” paper by five authors: Manuel Blum (Turing Award 1995) Robert W. Floyd (Turing Award 1978) Vaughan R. Pratt Ronald L. Rivest (Turing Award 2002) Robert E. Tarjan (Turing Award 1986) The number of comparisons required to find a median is between 1.5n and 5.43n.

21 Number of comparisons
Upper bound: 3n + o(n) by Schönhage, Paterson, and Pippenger (JCSS 1975); 2.95n by Dor and Zwick (SODA 1995, SIAM Journal on Computing 1999). Lower bound: 2n + o(n) by Bent and John (STOC 1985); (2 + 2^{−80})n by Dor and Zwick (FOCS 1996, SIAM Journal on Discrete Math 2001).

22 Selection in O(n) time
We show that selecting the i-th largest number of X can be done using only O(n) comparisons. The algorithm proceeds in the following five steps.

23 (1) Five guys per group

24 (2) A median per group

25 (3) Median of medians (MoM)

26 (4) Partition via MoM [Figure: X split by MoM into X> (numbers > MoM) and X< (numbers < MoM)]

27 (5) Recursion
If i ≤ |X>|, then output the i-th largest number of X> (recursively). If i = |X>| + 1, then output MoM. Otherwise, output the (i − |X>| − 1)-th largest number of X< (recursively).

28 Two recursive steps Step (2) Determining MoM
Step (5) Selection in X< or X>

29 The algorithm
select(X, i):
  partition X into groups of five; for each group h, let M[h] be the median of the h-th group;
  MoM := select(M, ⌈|M|/2⌉);
  partition X into X> and X< via MoM;
  recurse according to step (5).

30 The analysis
In select(X, i): computing the group medians takes O(n) time and partitioning X via MoM takes O(n) time; the costs of the two recursive calls remain to be determined.
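The five steps can be sketched as runnable Python (a sketch, not the lecture's exact pseudocode: it assumes distinct numbers and, for concreteness, returns the i-th smallest element; the i-th largest is symmetric):

```python
def select(X, i):
    # Return the i-th smallest (1-indexed) number of X, assuming the
    # numbers of X are distinct. Runs in O(n) time.
    if len(X) <= 25:                        # base case: sort small inputs
        return sorted(X)[i - 1]
    # Steps (1)+(2): a median per group of five.
    medians = [sorted(X[k:k + 5])[(len(X[k:k + 5]) - 1) // 2]
               for k in range(0, len(X), 5)]
    # Step (3): median of medians.
    mom = select(medians, (len(medians) + 1) // 2)
    # Step (4): partition via MoM.
    lo = [x for x in X if x < mom]
    hi = [x for x in X if x > mom]
    # Step (5): recursion.
    if i <= len(lo):
        return select(lo, i)
    if i == len(lo) + 1:
        return mom
    return select(hi, i - len(lo) - 1)
```

The base-case cutoff (25) is an arbitrary small constant; any constant works without affecting the O(n) bound.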

31 T(n) = running time of select(X, i) for |X| = n
We have T(n) ≤ T(n/5) + max(T(|X>|), T(|X<|)) + O(n). If we can show that the maximum is at most T(7n/10), then we have T(n) ≤ T(n/5) + T(7n/10) + O(n).

32 Recurrence relation
From T(n) ≤ T(n/5) + T(7n/10) + O(n), we can prove T(n) = O(n). Intuition: the sizes of the two subproblems sum to 9n/10 < n.
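The intuition can be checked by unfolding the recurrence numerically, taking the O(n) term to be exactly n (an assumption for illustration):

```python
# Unfold T(n) <= T(n/5) + T(7n/10) + n, with T(n) = 1 for n <= 1.
# Since 1/5 + 7/10 = 9/10 < 1, the per-level work forms a geometric
# series, so T(n) = O(n).
def T(n):
    if n <= 1:
        return 1.0
    return T(n / 5) + T(7 * n / 10) + n

# The ratio T(n)/n stays bounded by a constant as n grows.
ratios = [T(n) / n for n in (10, 100, 1000, 10000)]
```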

33 The remaining question:
We have T(n) ≤ T(n/5) + max(T(|X>|), T(|X<|)) + O(n). Can we show that the maximum is at most T(7n/10)?

34 See it from the picture?

35 Deleting at least 3n/10 guys

36 Conclusion Selecting the i-th largest number for any i can be done in linear time.

37 Median selection by random sampling
The basic idea: random sampling The algorithm Math tools Analysis

38 Median selection
Input: a set S of n distinct numbers. Output: the 0.5n-th smallest number x* of S.

39 RandSelect
RandSelect runs in expected O(n) time. However, a small expected running time does not always imply a small deviation of the running time. We usually prefer a stronger result, such as running in O(n) time with probability 1 − o(1).

40 Goal: We will show that the random-sampling algorithm takes 1.5n + o(n) comparisons with probability 1 − o(1). Exercise: This implies a Las Vegas algorithm whose expected number of comparisons is 1.5n + o(n). Significance: Better than the lower bound 2n [Bent and John, STOC 1985] on the number of comparisons required by any deterministic algorithm.

41 Part 1: The basic idea Random sampling

42 Step 1: sampling
Original input: a set S of n numbers. Take a sample R of n^{3/4} numbers of S. Intuitively, we get one representative sample for every n^{1/4} numbers.

43 Step 2: sort R and find R’s median x in o(n) time.

44 Step 3: identify two nearby “guards” y and z for x in the sorted R: y is the (0.5n^{3/4} − n^{1/2})-th smallest number of R and z is the (0.5n^{3/4} + n^{1/2})-th smallest number of R.

45 Step 4: partition S using y and z, using 2n comparisons, into L = {s ∈ S : s < y}, M = {s ∈ S : y ≤ s ≤ z}, and H = {s ∈ S : s > z}. Hopefully, the median x* of S is bracketed between y and z, with |M| = O(n^{3/4}).

46 Step 5: find the (0.5n − |L|)-th smallest number in M by sorting M.

47 Part 2: The algorithm SampSelect

48 The algorithm
Assumption for brevity: 0.5n, 0.5n^{3/4}, and n^{0.5} are integers.
algorithm SampSelect(S):
  repeat
    R := n^{3/4} numbers sampled from S with replacement; sort R;
    y := the (0.5n^{3/4} − n^{0.5})-th smallest number of R;
    z := the (0.5n^{3/4} + n^{0.5})-th smallest number of R;
    L := {s ∈ S : s < y}; M := {s ∈ S : y ≤ s ≤ z}
  until |L| < 0.5n ≤ |L| + |M| and |M| ≤ 4n^{3/4};
  sort M; output the (0.5n − |L|)-th smallest number of M.
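A runnable sketch of SampSelect (the function name, the rounding of n^{3/4} and n^{1/2}, and the small-input cutoff are our assumptions; distinct numbers are assumed as in the slides):

```python
import random

def samp_select(S):
    # Median of a list S of distinct numbers via random sampling:
    # sample n^{3/4} numbers, pick two guards sqrt(n) ranks around the
    # sample median, and retry whenever the check fails.
    n = len(S)
    if n < 100:
        return sorted(S)[(n - 1) // 2]
    target = (n + 1) // 2                        # rank of the median of S
    r = max(1, round(n ** 0.75))                 # sample size
    g = max(1, round(n ** 0.5))                  # guard offset
    while True:
        R = sorted(random.choice(S) for _ in range(r))   # with replacement
        y = R[max(0, r // 2 - g)]
        z = R[min(r - 1, r // 2 + g)]
        L = [s for s in S if s < y]
        M = [s for s in S if y <= s <= z]
        # Retry unless the median is bracketed and M is small enough.
        if len(L) < target <= len(L) + len(M) and len(M) <= 4 * r:
            return sorted(M)[target - len(L) - 1]
```

Whenever the loop exits, the answer is exactly the median; the analysis below shows the first iteration already succeeds with probability 1 − O(n^{−1/4}).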

49 Comment 1: sampling with replacement
R is a multiset, although S is not.

50 Comment 2: this step runs in o(n) time
Reason: since |R| ≤ n^{3/4}, sorting R takes o(n) time.

51 Comment 3: the partition of S takes at most 2n comparisons.

52 Comment 4: this step runs in o(n) time
Reason: since |M| ≤ 4n^{3/4}, sorting M takes o(n) time.

53 Comment 5: the algorithm may run forever
The repeat loop has no deterministic bound on its number of iterations.

54 The dominating steps
The partition of S: at most 2n comparisons.

55 Our objective We will show that the random-sampling algorithm takes 1.5n + o(n) comparisons with probability 1 − o(1). Task 1: With probability 1 − O(n^{−1/4}), RandSelect requires only one iteration. Therefore, 2n + o(n) comparisons with probability 1 − o(1). Task 2: A clever implementation of SampSelect further reduces the number of comparisons to 1.5n + o(n) with probability 1 − o(1).

56 Part 3 The math tools

57 Bernoulli trial
Let X be a binary random variable equal to 1 (success) with probability p and 0 otherwise. Then E[X] = p and Var(X) = p(1 − p).
(The Swiss mathematician Jacob Bernoulli proved the law of large numbers; three generations of his family produced more than a dozen mathematicians and physicists, e.g., Daniel Bernoulli.)

58 Bernoulli process
Let X = Σ_{i=1}^{n} X_i, where X_1, …, X_n are independent Bernoulli random variables, each with success probability p. Then E[X] = np and Var(X) = np(1 − p).

59 Chebyshev’s Inequality
Let X be a random variable with E[X] = μ and Var(X) = σ². Then, for any positive λ, we have Pr[|X − μ| ≥ λσ] ≤ 1/λ².
(Chebyshev: a 19th-century Russian mathematician.)
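A quick empirical sanity check of Chebyshev's bound for a Bernoulli process (the parameters n, p, and λ are arbitrary choices for illustration):

```python
import random

# Bernoulli process: X = X_1 + ... + X_n with i.i.d. indicators,
# so E[X] = n*p and Var(X) = n*p*(1-p).
random.seed(7)
n, p, lam = 500, 0.3, 3.0
mu = n * p
sigma = (n * p * (1 - p)) ** 0.5

# Chebyshev: Pr[|X - mu| >= lam * sigma] <= 1 / lam^2.
trials = 1000
bad = sum(
    abs(sum(random.random() < p for _ in range(n)) - mu) >= lam * sigma
    for _ in range(trials)
)
freq = bad / trials   # empirical tail frequency, well below 1/lam^2
```

Chebyshev is loose (for sums of independent indicators the true tail is much smaller), but its O(n^{−1/4}) rate is all the analysis below needs.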

60 Part 4: The analysis
Always the most challenging part of designing a randomized algorithm…

61 The algorithm
Goal: the stopping condition of SampSelect holds in the first iteration with probability 1 − O(n^{−1/4}).

62 Two events
Event 1: |L| < 0.5n ≤ |L| + |M|; that is, the median of S is correctly bracketed by y and z. Event 2: |M| ≤ 4n^{3/4}; that is, M is small enough to be sorted in o(n) time.

63 Step 4 (recalled): partition S using y and z, using 2n comparisons. Hopefully, the median x* of S is bracketed between y and z, with |M| = O(n^{3/4}).

64 Task 1: Pr[Event 1 and Event 2] = 1 − O(n^{−1/4})
It suffices to show Pr[¬Event 1] = O(n^{−1/4}) and Pr[¬Event 2] = O(n^{−1/4}).

65 ¬Event 1 ≡ x* ∈ L or x* ∈ S − (L ∪ M)
It suffices to show Pr[x* ∈ L] = O(n^{−1/4}); the bound on Pr[x* ∈ S − (L ∪ M)] can be proved symmetrically.

66 ¬Event 1a ≡ x* ∈ L
x* ∈ L ≡ x* < y ≡ at least 0.5n^{3/4} + n^{1/2} numbers of R are higher than x*.

67 Each number of R is higher than x* with probability 0.5. Let X be the number of samples of R that are higher than x* (a binomial random variable). By the Bernoulli process, E[X] = 0.5n^{3/4} and Var(X) = 0.25n^{3/4}.

68 Using Chebyshev’s Inequality
For any positive number λ, we have Pr[|X − 0.5n^{3/4}| ≥ λ · 0.5n^{3/8}] ≤ 1/λ². Taking λ = 2n^{1/8} gives Pr[|X − 0.5n^{3/4}| ≥ n^{1/2}] ≤ 0.25n^{−1/4}.

69 Pr[¬Event 1a] = O(n^{−1/4})
By Pr[|X − 0.5n^{3/4}| ≥ n^{1/2}] ≤ 0.25n^{−1/4}, we have Pr[¬Event 1a] ≤ 0.25n^{−1/4} = O(n^{−1/4}).

70 Pr[¬Event 1b] = Pr[x* ∈ S − (L ∪ M)] = O(n^{−1/4})
This bound can be proved symmetrically.

71 Task 1: Pr[Event 1 and Event 2] = 1 − O(n^{−1/4})
Pr[¬Event 1] = O(n^{−1/4}): done!

72 Event 2 ≡ |M| ≤ 4n^{3/4}
Let y* be the (0.5n − 2n^{3/4})-th smallest number in S. Let z* be the (0.5n + 2n^{3/4})-th smallest number in S. [Figure: the sorted S with y*, x*, z*, each consecutive pair separated by 2n^{3/4} numbers]

73 ¬Event 2 ≡ |M| > 4n^{3/4} ⟹ y < y* or z > z*
It suffices to show Pr[y < y*] = O(n^{−1/4}); the bound Pr[z > z*] = O(n^{−1/4}) can be proved symmetrically.

74 ¬Event 2a ≡ y < y*
y < y* ≡ at most 0.5n^{3/4} + n^{1/2} numbers of R are higher than or equal to y*.

75 Each number of R is higher than or equal to y* with probability 0.5 + 2n^{−1/4}. Let Y be the number of such samples of R (a binomial random variable). By the Bernoulli process, E[Y] = 0.5n^{3/4} + 2n^{1/2} and Var(Y) ≤ 0.25n^{3/4}.

76 Using Chebyshev’s Inequality
For any positive number λ, we have Pr[|Y − (0.5n^{3/4} + 2n^{1/2})| ≥ λ·√Var(Y)] ≤ 1/λ². We choose λ = n^{1/2}/√Var(Y), so that 1/λ² = Var(Y)/n.

77 Pr[¬Event 2a] = O(n^{−1/4})
By Pr[Y ≤ 0.5n^{3/4} + n^{1/2}] ≤ Var(Y)/n ≤ 0.25n^{−1/4}, we know Pr[¬Event 2a] = O(n^{−1/4}).

78 Pr[¬Event 2b] = Pr[z > z*] = O(n^{−1/4})
This bound can be proved symmetrically.

79 Task 1: Pr[Event 1 and Event 2] = 1 − O(n^{−1/4})
Pr[¬Event 2] = O(n^{−1/4}): done!

80 Our objective We will show that the random-sampling algorithm takes 1.5n + o(n) comparisons with probability 1 − o(1). Task 1: With probability 1 − O(n^{−1/4}), RandSelect requires only one iteration. Therefore, 2n + o(n) comparisons with probability 1 − o(1). Task 2: A clever implementation of SampSelect further reduces the number of comparisons to 1.5n + o(n) with probability 1 − o(1). Done!

81 Question How do we further reduce the number of comparisons down to 1.5n + o(n)?

82 The dominating steps
The partition of S via y and z: at most 2n comparisons.

83 The trick To determine which “block” a number s of S belongs to, we randomly compare s to y or z first, each with probability ½. If we are lucky (i.e., s < y or s > z), then one comparison suffices. Only the O(n^{3/4}) numbers in M always have “bad luck”; all the other numbers are lucky with probability ½. Therefore, the expected number of comparisons becomes 1.5n + o(n) if Event 2 occurs.
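The trick's accounting can be simulated (a sketch; the input and the guards y, z are arbitrary placeholders with |M| = o(n)):

```python
import random

def partition_comparisons(S, y, z):
    # For each s, flip a fair coin to decide whether to compare s with y
    # or with z first; a second comparison is needed only on "bad luck".
    comparisons = 0
    for s in S:
        if random.random() < 0.5:
            comparisons += 1          # first comparison: s vs y
            if s >= y:                # not below y: must also compare with z
                comparisons += 1
        else:
            comparisons += 1          # first comparison: s vs z
            if s <= z:                # not above z: must also compare with y
                comparisons += 1
    return comparisons

random.seed(11)
n = 100000
S = list(range(n))
y, z = n // 2 - 1000, n // 2 + 1000   # middle block M has o(n) numbers
cost = partition_comparisons(S, y, z)
# Expected cost: 1.5 * (n - |M|) + 2 * |M| = 1.5n + o(n).
```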

84 Making a stronger statement
Exercise: If Event 2 happens, then with probability 1 − o(1) the number of comparisons using “the trick” is at most 1.5n + o(n).

85 Conclusion Done! We have shown that the random-sampling algorithm takes 1.5n + o(n) comparisons with probability 1 − o(1). Exercise: This implies a Las Vegas algorithm whose expected number of comparisons is 1.5n + o(n). Significance: Better than the lower bound 2n [Bent and John, STOC 1985] on the number of comparisons required by any deterministic algorithm.

