The Chernoff bound Speaker: Chuang-Chieh Lin Coworker: Chih-Chieh Hung


1 The Chernoff bound Speaker: Chuang-Chieh Lin Coworker: Chih-Chieh Hung
Advisor: Maw-Shang Chang National Chung-Cheng University and National Chiao-Tung University

2 Outline
Introduction
The Chernoff Bound
Markov's Inequality
Moment Generating Functions
The Chernoff Bound for a Sum of Poisson Trials
The Chernoff Bound for Special Cases
Set Balancing
References

3 Introduction Goal: The Chernoff bound is used to analyze the tail of the distribution of a sum of independent random variables, with some extensions to dependent or correlated random variables. We will need Markov's inequality and moment generating functions, which we introduce first.

4 Math tool Professor Herman Chernoff's bound,
Annals of Mathematical Statistics, 1952.

5 Chernoff bounds In its most general form, the Chernoff bound for a random variable X states: for any t > 0,
P[X ≥ a] ≤ E[e^{tX}] / e^{ta}.
The term E[e^{tX}] is a moment generating function. (It may be unfamiliar at first, but it will become clearer as we proceed.)
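As a quick sanity check of the generic bound P[X ≥ a] ≤ E[e^{tX}]/e^{ta}, we can compare it against an exact binomial tail; the values of n, p, a, and t below are arbitrary example choices, not from the slides:

```python
import math

# X ~ Binomial(n, p), whose MGF is E[e^{tX}] = (p*e^t + 1 - p)^n.
n, p, a, t = 20, 0.5, 15, 0.5

# Exact upper tail P[X >= a].
tail = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(a, n + 1))

# Chernoff bound for this particular choice of t > 0.
mgf = (p * math.exp(t) + 1 - p) ** n
bound = mgf / math.exp(t * a)

assert tail <= bound  # the bound holds (it is not tight for an arbitrary t)
```

The bound holds for every t > 0, so one usually optimizes over t to get the tightest version, as the later slides do.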

6 Markov's Inequality For any nonnegative random variable X and any a > 0,
P[X ≥ a] ≤ E[X] / a.
We can use Markov's inequality to prove Chebyshev's inequality: applying it to Y = (X − E[X])² gives
P[|X − E[X]| ≥ a] ≤ Var[X] / a².
(Doesn't this look similar to the Chernoff bound above?)
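A minimal numeric illustration of Markov's inequality, using a small discrete distribution chosen for this sketch (X uniform on {0, …, 9}):

```python
# Markov's inequality: P[X >= a] <= E[X]/a for nonnegative X.
values = list(range(10))                 # X uniform on {0, 1, ..., 9}
ex = sum(values) / len(values)           # E[X] = 4.5

a = 8
tail = sum(1 for v in values if v >= a) / len(values)   # P[X >= 8] = 0.2

assert tail <= ex / a                    # 0.2 <= 0.5625
```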

7 Proof of the Chernoff bound
It follows directly from Markov's inequality: for any t > 0,
P[X ≥ a] = P[e^{tX} ≥ e^{ta}] ≤ E[e^{tX}] / e^{ta}.
So, how do we calculate this term?

8 Moment Generating Functions
M_X(t) = E[e^{tX}]. This function captures all the moments of the random variable X, which is why it is called the moment generating function.

9 Moment Generating Functions (cont'd)
We can easily see why this is the moment generating function: differentiating n times and evaluating at t = 0,
M_X^{(n)}(0) = E[X^n].
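The identity M_X^{(n)}(0) = E[X^n] can be checked numerically; the sketch below uses X ~ Bernoulli(p), whose MGF is M_X(t) = p e^t + (1 − p), and finite differences in place of exact derivatives (p and the step size h are example choices):

```python
import math

p = 0.3
M = lambda t: p * math.exp(t) + (1 - p)   # MGF of a Bernoulli(p) variable

h = 1e-4
first = (M(h) - M(-h)) / (2 * h)            # central difference ~ M'(0)
second = (M(h) - 2 * M(0) + M(-h)) / h**2   # ~ M''(0)

assert abs(first - p) < 1e-6     # M'(0)  = E[X]   = p
assert abs(second - p) < 1e-4    # M''(0) = E[X^2] = p for a 0-1 variable
```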

10 Moment Generating Functions (cont'd)
Theorem: If M_X(t) = M_Y(t) for all t ∈ (−δ, δ) for some δ > 0, then X and Y have the same distribution.
Moreover, if X and Y are independent, then M_{X+Y}(t) = M_X(t) · M_Y(t).

11 Chernoff bound for the sum of Poisson trials
Poisson trials: the distribution of a sum of independent 0-1 random variables, which need not be identically distributed.
Bernoulli trials: the same as above, except that all the random variables are identically distributed.

12 Chernoff bound for the sum of Poisson trials (cont'd)
Let X_1, …, X_n be mutually independent 0-1 random variables with P[X_i = 1] = p_i, and let X = X_1 + ⋯ + X_n and μ = E[X] = Σ_i p_i. Then
M_{X_i}(t) = p_i e^t + (1 − p_i) = 1 + p_i(e^t − 1) ≤ e^{p_i(e^t − 1)}   (since 1 + y ≤ e^y).
Since the X_i are independent,
M_X(t) = Π_{i=1}^n M_{X_i}(t) ≤ exp((e^t − 1) Σ_i p_i) = e^{(e^t − 1)μ}.
We will use this result later.
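The MGF bound M_X(t) ≤ e^{(e^t − 1)μ} can be verified directly; the success probabilities p_i and the value of t below are arbitrary example values:

```python
import math

ps = [0.1, 0.5, 0.3, 0.8]     # example p_i for four Poisson trials
t = 0.7
mu = sum(ps)                  # mu = E[X]

mgf_X = 1.0
for p in ps:                  # independence: the MGF of the sum is the product
    mgf_X *= 1 + p * (math.exp(t) - 1)        # = p*e^t + (1 - p)

bound = math.exp((math.exp(t) - 1) * mu)      # e^{(e^t - 1) mu}
assert mgf_X <= bound         # each factor obeys 1 + y <= e^y
```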

13 Chernoff bound for the sum of Poisson trials (cont'd)
Theorem 1: Let X = X_1 + ⋯ + X_n, where the X_i are independent Poisson trials, and let μ = E[X]. Then:
(1) for any δ > 0, P[X ≥ (1 + δ)μ] ≤ (e^δ / (1 + δ)^{1+δ})^μ;
(2) for 0 < δ ≤ 1, P[X ≥ (1 + δ)μ] ≤ e^{−μδ²/3};
(3) for R ≥ 6μ, P[X ≥ R] ≤ 2^{−R}.
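The two upper-tail bounds above can be checked numerically in the binomial special case, where X ~ Binomial(n, p) is a sum of identical Poisson trials with μ = np; the values of n, p, and δ below are arbitrary example choices:

```python
import math

n, p, delta = 30, 0.2, 0.5
mu = n * p                                   # mu = 6
thr = math.ceil((1 + delta) * mu)            # (1 + delta)*mu = 9

# Exact tail P[X >= (1 + delta)*mu] for the binomial.
tail = sum(math.comb(n, k) * p**k * (1 - p)**(n - k) for k in range(thr, n + 1))

bound1 = (math.exp(delta) / (1 + delta) ** (1 + delta)) ** mu   # bound (1)
bound2 = math.exp(-mu * delta**2 / 3)                           # bound (2)

# (1) is the sharpest; (2) is a weaker but simpler consequence of it.
assert tail <= bound1 <= bound2
```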

14 Proof of Theorem 1: By Markov's inequality, for any t > 0 we have
P[X ≥ (1 + δ)μ] = P[e^{tX} ≥ e^{t(1+δ)μ}] ≤ E[e^{tX}] / e^{t(1+δ)μ} ≤ e^{(e^t − 1)μ} / e^{t(1+δ)μ},
using the bound on M_X(t) from slide 12. Setting t = ln(1 + δ) > 0 yields (1). Bounds (2) and (3) can be derived from (1) by a calculus argument and by simple algebra, respectively, so the important part is knowing how to derive (1).

15 Similarly, we have:
Theorem: Let X = Σ_{i=1}^n X_i, where the X_i are independent Poisson trials, and let μ = E[X]. Then for 0 < δ < 1:
(1) P[X ≤ (1 − δ)μ] ≤ (e^{−δ} / (1 − δ)^{1−δ})^μ;
(2) P[X ≤ (1 − δ)μ] ≤ e^{−μδ²/2}.
Corollary: For 0 < δ < 1, P[|X − μ| ≥ δμ] ≤ 2e^{−μδ²/3}.

16 Example: Let X be the number of heads in n independent fair coin flips. Applying the above corollary with δ = √(6 ln n / n), we have
P[|X − n/2| ≥ (1/2)√(6 n ln n)] ≤ 2 exp(−(1/3)(n/2)(6 ln n / n)) = 2/n.
By Chebyshev's inequality, the same deviation is bounded only by Var[X]/a² = 1/(6 ln n). Better!!

17 Better bounds for special cases
Theorem: Let X = X_1 + ⋯ + X_n, where the X_i are independent random variables with P[X_i = 1] = P[X_i = −1] = 1/2. Then for any a > 0,
P[X ≥ a] ≤ e^{−a²/2n}.
Proof sketch: E[e^{tX_i}] = (e^t + e^{−t})/2 ≤ e^{t²/2}, so E[e^{tX}] ≤ e^{t²n/2}; applying Markov's inequality and setting t = a/n gives the claim.
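The ±1 special case can be checked against the exact distribution: writing H for the number of +1's, X = 2H − n with H ~ Binomial(n, 1/2). The values of n and a below are arbitrary example choices:

```python
import math

n, a = 40, 10

thr = math.ceil((n + a) / 2)                  # X >= a  <=>  H >= (n + a)/2
tail = sum(math.comb(n, k) for k in range(thr, n + 1)) / 2**n   # P[X >= a]

bound = math.exp(-a**2 / (2 * n))             # the special-case bound
assert tail <= bound
```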

18 Better bounds for special cases (cont'd)
Corollary: Let X = X_1 + ⋯ + X_n, where the X_i are independent random variables with P[X_i = 1] = P[X_i = −1] = 1/2. Then for any a > 0,
P[|X| ≥ a] ≤ 2e^{−a²/2n}.

19 Better bounds for special cases (cont'd)
Corollary: Let Y = Y_1 + ⋯ + Y_n, where the Y_i are independent random variables with P[Y_i = 1] = P[Y_i = 0] = 1/2, and let μ = E[Y] = n/2. Then for any a > 0,
P[Y ≥ μ + a] ≤ e^{−2a²/n} and P[Y ≤ μ − a] ≤ e^{−2a²/n}.
Note: The details are left as exercises. (See [MU05].)

20 Exercise Let X be a random variable such that X ∼ Geometric(p). Prove a Chernoff bound for X: for any a > 0, derive an upper bound on P[X ≥ a].

21 An Application: Set Balancing
Given an n × m matrix A with entries in {0, 1}, write Av = c and
‖Av‖_∞ = max_{i = 1, …, n} |c_i|.
Suppose we are looking for a vector v with entries in {−1, 1} that minimizes ‖Av‖_∞. (The smaller ‖Av‖_∞ is, the closer each feature is to being split evenly between the + side and the − side, so the partition v is balanced for that feature.)

22 Set Balancing (cont'd)
The problem arises in designing statistical experiments. Each column of the matrix A represents a subject in the experiment and each row represents a feature. The vector v partitions the subjects into two disjoint groups, so that each feature is split as evenly as possible between the two groups.

23 Set Balancing (cont'd)
For example, take subjects zebra, tiger, whale, and penguin, and features carnivorous, terrestrial, mammal, and egg-laying: each entry of A records whether a subject has a feature, and v assigns each subject to one of the two groups. We want every feature to be split as evenly as possible. For the partition shown on the slide, we obtain ‖Av‖_∞ = 2.

24 Set Balancing (cont'd)
Theorem: Given an n × m matrix A with entries in {0, 1}, let v be a vector whose m entries are chosen independently and uniformly at random from {−1, 1}. Then
P[‖Av‖_∞ ≥ √(4m ln n)] ≤ 2/n.

25 Proof of Set Balancing:
Consider the i-th row a_i = (a_{i,1}, …, a_{i,m}), and suppose it has k non-zero entries.
If k ≤ √(4m ln n), then clearly |a_i · v| ≤ k ≤ √(4m ln n).
If k > √(4m ln n), let Z_i = a_i · v = Σ_j a_{i,j} v_j, a sum of k independent ±1 random variables. By the corollary on slide 18 (and since k ≤ m),
P[|Z_i| > √(4m ln n)] ≤ 2 e^{−4m ln n / (2k)} ≤ 2 e^{−2 ln n} = 2/n².
By the union bound over the n rows,
P[⋃_{i=1}^n {|Z_i| > √(4m ln n)}] ≤ n · (2/n²) = 2/n.
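A small Monte-Carlo sketch of the theorem: for a random ±1 vector v, ‖Av‖_∞ rarely exceeds √(4m ln n). The matrix dimensions, trial count, and seed are arbitrary example choices:

```python
import math
import random

random.seed(0)
n, m, trials = 30, 30, 200
A = [[random.randint(0, 1) for _ in range(m)] for _ in range(n)]
threshold = math.sqrt(4 * m * math.log(n))

exceed = 0
for _ in range(trials):
    v = [random.choice((-1, 1)) for _ in range(m)]
    # ||Av||_inf = max over rows of |a_i . v|
    norm = max(abs(sum(a * x for a, x in zip(row, v))) for row in A)
    if norm >= threshold:
        exceed += 1

# The theorem guarantees a failure probability of at most 2/n; the bound is
# quite loose, so the empirical rate is typically far below it.
assert exceed / trials <= 2 / n
```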

26 References
[MR95] Rajeev Motwani and Prabhakar Raghavan, Randomized Algorithms, Cambridge University Press, 1995.
[MU05] Michael Mitzenmacher and Eli Upfal, Probability and Computing: Randomized Algorithms and Probabilistic Analysis, Cambridge University Press, 2005.
Lecture slides of Professor 蔡錫鈞 (Shi-Chun Tsai).

