The UNIVERSITY of NORTH CAROLINA at CHAPEL HILL Review of Final Part I Sections 2.2 -- 4.5 Jiaping Wang Department of Mathematics 02/29/2013, Monday.

Outline
1. Sample Space and Events
2. Definition of Probability
3. Counting Rules
4. Conditional Probability and Independence
5. Probability Distribution and Expected Values
6. Bernoulli, Binomial and Geometric Distributions
7. Negative Binomial, Poisson, Hypergeometric Distributions and MGF

Part 1. Sample Space and Events

Definition 2.1: A sample space S is a set that includes all possible outcomes for a random experiment, listed in a mutually exclusive and exhaustive way. Mutually exclusive means the outcomes of the set do not overlap; exhaustive means the list contains all possible outcomes.

Definition 2.2: An event is any subset of a sample space.

Event Operators and Venn Diagram
There are three operations on events:
Intersection: A∩B (also written AB) is the event consisting of the outcomes common to A and B.
Union: A∪B is the event consisting of all outcomes in A or in B.
Complement: Aᶜ is the event consisting of all outcomes in S that are not in A.
(Venn diagrams illustrating A∪B, A∩B, and Aᶜ within S.)

Some Laws
Commutative laws: A∪B = B∪A and A∩B = B∩A.
Associative laws: (A∪B)∪C = A∪(B∪C) and (A∩B)∩C = A∩(B∩C).
Distributive laws: A∩(B∪C) = (A∩B)∪(A∩C) and A∪(B∩C) = (A∪B)∩(A∪C).
DeMorgan's laws: (A∪B)ᶜ = Aᶜ∩Bᶜ and (A∩B)ᶜ = Aᶜ∪Bᶜ.

Part 2. Definition of Probability

Suppose that a random experiment has an associated sample space S. A probability is a numerically valued function that assigns a number P(A) to every event A so that the following axioms hold:
(1) P(A) ≥ 0.
(2) P(S) = 1.
(3) If A1, A2, … is a sequence of mutually exclusive events (that is, Ai∩Aj = ∅ for any i ≠ j), then P(A1∪A2∪…) = P(A1) + P(A2) + …

Some Basic Properties
1. P(∅) = 0 and P(S) = 1.
2. 0 ≤ P(A) ≤ 1 for any event A.
3. P(A∪B) = P(A) + P(B) if A and B are mutually exclusive.
4. P(A∪B) = P(A) + P(B) − P(A∩B) for general events A and B.
5. If A is a subset of B, then P(A) ≤ P(B).
6. P(Aᶜ) = 1 − P(A).
7. P(A∩Bᶜ) = P(A) − P(A∩B).

Inclusion-Exclusion Principle
Theorem 2.1. For events A1, A2, …, An from the sample space S,
P(A1∪A2∪…∪An) = ∑i P(Ai) − ∑i<j P(Ai∩Aj) + ∑i<j<k P(Ai∩Aj∩Ak) − … + (−1)^(n+1) P(A1∩A2∩…∩An).
This can be proved by induction.
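As a quick sanity check of Theorem 2.1, one can evaluate both sides on a small equally likely sample space. The three events below (on a single fair die) are chosen just for illustration:

```python
# Check inclusion-exclusion for three events on the sample space of a fair die.
from itertools import combinations

S = set(range(1, 7))           # sample space of one die roll
P = lambda E: len(E) / len(S)  # equally likely outcomes

A1 = {1, 2, 3}   # "at most 3"
A2 = {2, 4, 6}   # "even"
A3 = {3, 6}      # "divisible by 3"
events = [A1, A2, A3]

# Left side: probability of the union.
lhs = P(A1 | A2 | A3)

# Right side: alternating sum over all non-empty intersections.
rhs = 0.0
for k in range(1, len(events) + 1):
    for combo in combinations(events, k):
        rhs += (-1) ** (k + 1) * P(set.intersection(*combo))

print(lhs, rhs)  # both equal 5/6, since the union misses only the outcome 5
```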

Determine the Probability Values
The definition of probability only tells us the axioms that the probability function must obey; it doesn't tell us what values to assign to specific events. The values are usually based on empirical evidence or on careful thought about the experiment.
For example, if a die is balanced, we may take P(Ai) = 1/6 for Ai = {i}, i = 1, 2, 3, 4, 5, 6. However, if the die is not balanced, to determine the probabilities we need to run many experiments and use the observed frequency of each outcome.

Part 3. Counting Rules

Theorem 2.2 (Fundamental Principle of Counting). If the first task of an experiment can result in n1 possible outcomes and, for each such outcome, the second task can result in n2 possible outcomes, then there are n1·n2 possible outcomes for the two tasks together. The principle extends to any number of tasks in sequence.

Order and Replacement (selecting r items from n)

                          Order is important    Order is not important
With replacement          n^r                   C(n+r−1, r)
Without replacement       P(n, r)               C(n, r)
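The four entries of the table can be computed directly with Python's standard library (math.comb and math.perm require Python 3.8+); the function name below is just for illustration:

```python
# The four counting formulas for choosing r items from n.
import math

def counts(n, r):
    return {
        "ordered, with replacement":      n ** r,
        "ordered, without replacement":   math.perm(n, r),         # P(n, r)
        "unordered, without replacement": math.comb(n, r),         # C(n, r)
        "unordered, with replacement":    math.comb(n + r - 1, r), # C(n+r-1, r)
    }

print(counts(6, 2))
# n=6, r=2: 36 ordered with replacement, 30 ordered without,
#           15 unordered without, 21 unordered with replacement
```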

Theorem 2.5 (Partitions). The number of ways of partitioning n distinct objects into k groups containing n1, n2, …, nk objects, respectively, is n!/(n1!·n2!·…·nk!), where n1 + n2 + … + nk = n.

Example: If we roll a die 12 times, how many ways are there to get two 1's, two 2's, three 3's, two 4's, two 5's and one 6?
Solution: First choose the 2 positions for the 1's out of 12, which gives 12!/(2!10!) ways. With two positions filled by 1's, the 2's are chosen from the remaining 10 positions, giving 10!/(2!8!) ways, and similarly for the remaining faces. The final result is
12!/(2!10!) × 10!/(2!8!) × 8!/(3!5!) × 5!/(2!3!) × 3!/(2!1!) × 1!/(1!0!) = 12!/(2!·2!·3!·2!·2!·1!).
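The telescoping product in the example collapses to a single multinomial coefficient, which a short helper (the name is ours) can evaluate:

```python
# Partition count from Theorem 2.5: n!/(n1! n2! ... nk!).
import math

def multinomial(n, group_sizes):
    assert sum(group_sizes) == n
    result = math.factorial(n)
    for k in group_sizes:
        result //= math.factorial(k)
    return result

# Twelve die rolls: two 1's, two 2's, three 3's, two 4's, two 5's, one 6.
print(multinomial(12, [2, 2, 3, 2, 2, 1]))  # 4989600
```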

Part 4. Conditional Probability and Independence

Definition 3.1. If A and B are any two events, the conditional probability of A given B, denoted P(A|B), is
P(A|B) = P(A∩B)/P(B), provided that P(B) > 0.
Notice that P(A∩B) = P(A|B)P(B) = P(B|A)P(A).

This definition also satisfies the three axioms of probability:
(1) A∩B is a subset of B, so P(A∩B) ≤ P(B), and hence 0 ≤ P(A|B) ≤ 1.
(2) P(S|B) = P(S∩B)/P(B) = P(B)/P(B) = 1.
(3) If A1, A2, … are mutually exclusive, then so are A1∩B, A2∩B, …, and
P(∪Ai | B) = P((∪Ai)∩B)/P(B) = P(∪(Ai∩B))/P(B) = ∑P(Ai∩B)/P(B) = ∑P(Ai|B).
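Definition 3.1 can be exercised on the familiar two-dice sample space; the particular events below are chosen only as an illustration:

```python
# P(A|B) = P(A ∩ B) / P(B) on the 36-point two-dice sample space.
from itertools import product

S = list(product(range(1, 7), repeat=2))   # 36 equally likely outcomes
P = lambda E: len(E) / len(S)

A = {s for s in S if s[0] + s[1] == 7}     # "the sum is 7"
B = {s for s in S if s[0] == 3}            # "the first die shows 3"

p_given = P(A & B) / P(B)
print(p_given)  # 1/6: given the first die is 3, the sum is 7 only when the second is 4
```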

Definition 3.2 and Theorem 3.2
Definition 3.2. Two events A and B are said to be independent if P(A∩B) = P(A)P(B). This is equivalent to stating that P(A|B) = P(A) and P(B|A) = P(B), provided the conditional probabilities exist.

Theorem 3.2 (Multiplicative Rule). If A and B are any two events, then
P(A∩B) = P(A)P(B|A) = P(B)P(A|B).
If A and B are independent, then P(A∩B) = P(A)P(B).

Theorem of Total Probability. If B1, B2, …, Bk is a collection of mutually exclusive and exhaustive events, then for any event A,
P(A) = ∑ P(A|Bi)P(Bi), summing over i = 1, …, k.

Bayes' Rule. If the events B1, B2, …, Bk form a partition of the sample space S, and A is any event in S with P(A) > 0, then
P(Bj|A) = P(A|Bj)P(Bj) / ∑ P(A|Bi)P(Bi).
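Both formulas can be traced through on a worked example; the numbers below (a diagnostic test with made-up sensitivity, specificity and prevalence) are purely illustrative:

```python
# Total probability and Bayes' rule with a two-event partition {sick, healthy}.
priors = {"sick": 0.01, "healthy": 0.99}       # P(B_j): 1% prevalence (assumed)
likelihood = {"sick": 0.99, "healthy": 0.05}   # P(A | B_j), A = "test positive"

# Theorem of Total Probability: P(A) = sum_j P(A|B_j) P(B_j)
p_A = sum(likelihood[b] * priors[b] for b in priors)

# Bayes' rule: P(B_j | A) = P(A|B_j) P(B_j) / P(A)
posterior = {b: likelihood[b] * priors[b] / p_A for b in priors}
print(posterior["sick"])  # 1/6: most positives are false positives at 1% prevalence
```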

Part 5. Probability Distribution and Expected Value

A random variable is a real-valued function whose domain is a sample space.

A random variable X is said to be discrete if it can take on only a finite number, or a countably infinite number, of possible values x. The probability function of X, denoted by p(x), assigns probability to each value x of X so that the following conditions hold:
1. P(X = x) = p(x) ≥ 0;
2. ∑ P(X = x) = 1, where the sum is over all possible values of x.

The distribution function F(b) for a random variable X is F(b) = P(X ≤ b). If X is discrete,
F(b) = ∑ p(x), summing over all x ≤ b,
where p(x) is the probability function. The distribution function is often called the cumulative distribution function (CDF).

Any function satisfying the following 4 properties is a distribution function:
1. lim F(b) = 0 as b → −∞.
2. lim F(b) = 1 as b → ∞.
3. The distribution function is non-decreasing: if a < b, then F(a) ≤ F(b). It can remain constant, but it cannot decrease as we increase from a to b.
4. The distribution function is right-hand continuous: lim F(b + h) = F(b) as h → 0⁺.
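For a discrete variable the CDF is just a running sum of the pmf; here is a sketch using the sum of two fair dice as the example distribution:

```python
# Build F(b) = sum_{x <= b} p(x) for the sum of two fair dice.
from itertools import product
from collections import Counter

rolls = Counter(a + b for a, b in product(range(1, 7), repeat=2))
pmf = {x: n / 36 for x, n in rolls.items()}   # p(x) for x = 2, ..., 12

def F(b):
    """Cumulative distribution function: P(X <= b)."""
    return sum(p for x, p in pmf.items() if x <= b)

print(F(1), F(7), F(12))  # 0.0, then 21/36, then 1.0
```

Note that F is flat between integer values and jumps by p(x) at each x, matching properties 3 and 4 above.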

Definition 4.4. The expected value of a discrete random variable X with probability distribution p(x) is
E(X) = ∑ x·p(x),
where the sum is over all values of x for which p(x) > 0. We sometimes use the notation E(X) = μ.
Note: Not all expected values exist; the sum above must converge absolutely, ∑|x|p(x) < ∞.

Theorem 4.1. If X is a discrete random variable with probability function p(x) and g(x) is any real-valued function of X, then E(g(X)) = ∑ g(x)p(x).

Definitions 4.5 and 4.6
The variance of a random variable X with expected value μ is given by V(X) = E[(X − μ)²]. We sometimes use the notation σ² = E[(X − μ)²].

The standard deviation is a measure of variation that maintains the original units of measure. The standard deviation of a random variable is the square root of the variance: σ = √V(X).

Theorem 4.2. For any random variable X and constants a and b:
1. E(aX + b) = aE(X) + b.
2. V(aX + b) = a²V(X).

Standardized random variable: If X has mean μ and standard deviation σ, then Y = (X − μ)/σ has E(Y) = 0 and V(Y) = 1; thus Y can be called the standardized random variable of X.

Theorem 4.3. If X is a random variable with mean μ, then V(X) = E(X²) − μ².

Tchebysheff's Theorem. Let X be a random variable with mean μ and standard deviation σ. Then for any positive k,
P(|X − μ| < kσ) ≥ 1 − 1/k².
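Theorems 4.2 and 4.3 and Tchebysheff's bound can all be verified numerically on a small pmf; a fair-die roll and the constants a = 2, b = 5 are arbitrary choices for the check:

```python
# Numeric check of Theorems 4.2/4.3 and Tchebysheff's bound for a fair die X.
pmf = {x: 1/6 for x in range(1, 7)}

E = lambda g: sum(g(x) * p for x, p in pmf.items())
mu = E(lambda x: x)                        # 3.5
var = E(lambda x: (x - mu) ** 2)           # 35/12; equals E(X^2) - mu^2 (Thm 4.3)

a, b = 2, 5                                # arbitrary constants
assert abs(E(lambda x: a*x + b) - (a*mu + b)) < 1e-12              # E(aX+b)
assert abs(E(lambda x: (a*x + b - (a*mu + b))**2) - a*a*var) < 1e-12  # V(aX+b)

# Tchebysheff: P(|X - mu| < k*sigma) >= 1 - 1/k^2
sigma = var ** 0.5
k = 2
prob = sum(p for x, p in pmf.items() if abs(x - mu) < k * sigma)
print(prob, 1 - 1/k**2)  # 1.0 versus the bound 0.75
```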

Part 6. Bernoulli, Binomial and Geometric Distributions

Bernoulli Distribution
Let the probability of success be p; then the probability of failure is 1 − p. The distribution of X is given by
p(x) = p^x (1 − p)^(1−x), x = 0 or 1,
where p(x) denotes the probability that X = x.
E(X) = ∑x·p(x) = 0·p(0) + 1·p(1) = 0(1 − p) + p = p, so E(X) = p.
V(X) = E(X²) − [E(X)]² = ∑x²p(x) − p² = 0(1 − p) + 1(p) − p² = p − p² = p(1 − p), so V(X) = p(1 − p).

Binomial Distribution
Suppose we conduct n independent Bernoulli trials, each with probability p of success. Let the random variable X be the number of successes in these n trials; the distribution of X is called the binomial distribution. Let Yi = 1 if the ith trial is a success and Yi = 0 if it is a failure. Then X = ∑Yi denotes the number of successes in the n independent trials, so X can take values {0, 1, 2, 3, …, n}. For example, when n = 3 and the probability of success is p, what is the distribution of X?

Cont. A random variable X has a binomial distribution if:
1. The experiment consists of a fixed number n of identical trials.
2. Each trial has only two possible outcomes; that is, the trials are Bernoulli trials.
3. The probability p is constant from trial to trial.
4. The trials are independent.
5. X is the number of successes in the n trials.

The binomial probability function is
P(X = x) = C(n, x) p^x (1 − p)^(n−x), x = 0, 1, …, n,
with E(X) = np and V(X) = np(1 − p).
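The standard binomial probability function C(n, x) p^x (1−p)^(n−x) can be computed directly, and the n = 3 case asked about above falls out immediately:

```python
# Binomial pmf: P(X = x) = C(n, x) p^x (1-p)^(n-x).
import math

def binom_pmf(x, n, p):
    return math.comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 3, 0.5
probs = [binom_pmf(x, n, p) for x in range(n + 1)]
print(probs)       # [0.125, 0.375, 0.375, 0.125]
print(sum(probs))  # 1.0, as a probability function must satisfy
print(sum(x * binom_pmf(x, n, p) for x in range(n + 1)))  # mean np = 1.5
```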

Geometric Distribution: Probability Function
The geometric probability function: P(X = x) = p(x) = (1 − p)^x p = q^x p, x = 0, 1, 2, …, where q = 1 − p.
Note that P(X = x) = q^x p = q[q^(x−1) p] = qP(X = x − 1) ≤ P(X = x − 1) since q ≤ 1, for x = 1, 2, …
(Plot of a geometric probability function with p = 0.5.)

Geometric Series and CDF
By the geometric series, ∑ q^x p over x = 0, 1, 2, … equals p/(1 − q) = 1, so the probabilities sum to 1. The CDF is
F(x) = P(X ≤ x) = ∑ q^y p over y = 0, …, x = 1 − q^(x+1), x = 0, 1, 2, …

Mean and Variance
For the geometric distribution as defined here (X counts the failures before the first success),
E(X) = q/p and V(X) = q/p².
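For the pmf p(x) = q^x p with x = 0, 1, 2, … (failures before the first success), the mean q/p and variance q/p² can be checked by summing a long but finite prefix of the series, since the tail is negligible:

```python
# Check E(X) = q/p and V(X) = q/p^2 for the geometric pmf p(x) = q^x p.
p = 0.5
q = 1 - p
xs = range(200)                  # the tail beyond x = 200 is negligible for p = 0.5
pmf = [q**x * p for x in xs]

mean = sum(x * px for x, px in zip(xs, pmf))
var = sum((x - mean)**2 * px for x, px in zip(xs, pmf))
print(mean, var)  # approximately 1.0 and 2.0, i.e. q/p = 1 and q/p^2 = 2
```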

Part 7. Negative Binomial, Poisson, Hypergeometric Distributions and MGF

Negative Binomial Distribution
What if we were interested in the number of failures prior to the second success, or the third success, or (in general) the r-th success? Let X denote the number of failures prior to the r-th success, where p denotes the common probability of success. Then
P(X = x) = C(x + r − 1, r − 1) p^r q^x, x = 0, 1, 2, …, q = 1 − p.
If r = 1, the negative binomial distribution becomes the geometric distribution.

Poisson Distribution
The Poisson probability function is p(x) = λ^x e^(−λ)/x!, x = 0, 1, 2, … Recall that λ denotes the mean number of occurrences in one time period; if there are t non-overlapping time periods, then the mean would be λt. The Poisson distribution is often referred to as the distribution of rare events. E(X) = V(X) = λ for a Poisson random variable.
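The fact that the Poisson mean and variance are both λ can be verified by direct summation of the pmf over a sufficiently long range; λ = 3 here is an arbitrary choice:

```python
# Poisson pmf p(x) = lambda^x e^(-lambda) / x!, with E(X) = V(X) = lambda.
import math

def poisson_pmf(x, lam):
    return lam**x * math.exp(-lam) / math.factorial(x)

lam = 3.0
xs = range(60)                   # far enough into the tail for lam = 3
mean = sum(x * poisson_pmf(x, lam) for x in xs)
var = sum((x - mean)**2 * poisson_pmf(x, lam) for x in xs)
print(mean, var)  # both approximately 3.0
```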

Hypergeometric Distribution
Now we consider a general case: suppose a lot consists of N items, of which k are of one type (called successes) and N − k are of another type (called failures). Now n items are sampled randomly and sequentially without replacement. Let X denote the number of successes among the n sampled items. Then
P(X = x) = C(k, x) C(N − k, n − x) / C(N, n),
for integers x with max(0, n − (N − k)) ≤ x ≤ min(k, n).
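The hypergeometric pmf is a ratio of binomial coefficients; the lot sizes below (5 successes in a lot of 20, sampling 4) are an illustrative choice:

```python
# Hypergeometric pmf: P(X = x) = C(k, x) C(N-k, n-x) / C(N, n).
import math

def hypergeom_pmf(x, N, k, n):
    return math.comb(k, x) * math.comb(N - k, n - x) / math.comb(N, n)

N, k, n = 20, 5, 4               # e.g. 5 "successes" in a lot of 20, sample 4
probs = [hypergeom_pmf(x, N, k, n) for x in range(min(k, n) + 1)]
print(sum(probs))                               # 1.0
print(sum(x * p for x, p in enumerate(probs)))  # mean n*k/N = 1.0
```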

Moment Generating Function
The moment generating function (MGF) of a random variable X is M(t) = E(e^(tX)); its k-th derivative evaluated at t = 0 gives the k-th moment, M^(k)(0) = E(X^k). It often is easier to evaluate M(t) and its derivatives than to find the moments of the random variable directly.
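As a small illustration, the MGF of a Bernoulli(p) variable is M(t) = q + p·e^t, and numerically differentiating it at t = 0 recovers the moments (both E(X) and E(X²) equal p here, since X only takes the values 0 and 1):

```python
# Recover moments from the Bernoulli MGF M(t) = (1-p) + p e^t by finite differences.
import math

p = 0.3
M = lambda t: (1 - p) + p * math.exp(t)

h = 1e-6
first_moment = (M(h) - M(-h)) / (2 * h)            # M'(0)  ~ E(X)   = p
second_moment = (M(h) - 2*M(0) + M(-h)) / h**2     # M''(0) ~ E(X^2) = p
print(first_moment, second_moment)  # both approximately 0.3
```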