
CIS 2033, based on Dekking et al., A Modern Introduction to Probability and Statistics (2007). Slides by Michael Maurizi. Instructor: Longin Jan Latecki. C9: Joint Distributions and Independence

9.1 – Joint Distributions of Discrete Random Variables

- Joint distribution: the combined distribution of two or more random variables defined on the same sample space Ω.

- The joint distribution of two discrete random variables X and Y is obtained from the probabilities of all possible values of the pair (X, Y).

- Joint probability mass function p of two discrete random variables X and Y: $p(a, b) = P(X = a, Y = b)$.

- Joint distribution function F of two random variables X and Y: $F(a, b) = P(X \le a, Y \le b)$. In a joint pmf table this is the sum of all entries $p(s, t)$ with $s \le a$ and $t \le b$, i.e. the block of entries between the upper-left corner of the table and the cell (a, b).

9.1 – Joint Distributions of Discrete Random Variables

Example: the joint distribution of S and M, where S = the sum of two dice and M = the maximum of two dice. Each cell gives p(a, b) = P(S = a, M = b); the margins give p_S(a) (right column) and p_M(b) (bottom row):

          b=1    b=2    b=3    b=4    b=5    b=6   | p_S(a)
  a=2    1/36    0      0      0      0      0    |  1/36
  a=3     0     2/36    0      0      0      0    |  2/36
  a=4     0     1/36   2/36    0      0      0    |  3/36
  a=5     0      0     2/36   2/36    0      0    |  4/36
  a=6     0      0     1/36   2/36   2/36    0    |  5/36
  a=7     0      0      0     2/36   2/36   2/36  |  6/36
  a=8     0      0      0     1/36   2/36   2/36  |  5/36
  a=9     0      0      0      0     2/36   2/36  |  4/36
  a=10    0      0      0      0     1/36   2/36  |  3/36
  a=11    0      0      0      0      0     2/36  |  2/36
  a=12    0      0      0      0      0     1/36  |  1/36
  p_M(b) 1/36   3/36   5/36   7/36   9/36  11/36  |   1

Quick exercise 9.1: List the elements of the event {S = 7, M = 4} and compute its probability.

9.1 – Joint Distributions of Discrete Random Variables

Example, continued: S = the sum of two dice, M = the maximum of two dice (joint distribution table as on the previous slide).

Quick exercise 9.1: List the elements of the event {S = 7, M = 4} and compute its probability.

Answer: the only possibilities with the sum equal to 7 and the maximum equal to 4 are the combinations (3, 4) and (4, 3). They both have probability 1/36, so that P(S = 7, M = 4) = 2/36.

9.1 – Marginal Distributions of Discrete Random Variables

- Marginal distribution: obtained by adding up the rows or columns of a joint probability mass function table. It is literally written in the margins.

- Let p(a, b) be the joint pmf of the random variables S and M from the example above (S = the sum of two dice, M = the maximum of two dice). The marginal pmfs are then given by

$p_S(a) = \sum_b p(a, b)$ and $p_M(b) = \sum_a p(a, b)$,

which appear as the right column and the bottom row of the table on the previous slides.

9.1 – Joint Distributions of Discrete Random Variables: Examples

Example, continued: S = the sum of two dice, M = the maximum of two dice (joint distribution table as above).

Exercise: compute the joint distribution function F(5, 3); see the sketch below.
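To make these computations concrete, here is a minimal Python sketch (not part of the original slides) that rebuilds the joint pmf of S and M by enumerating the 36 equally likely outcomes, recovers both marginals, and computes the two exercise answers:

```python
from collections import defaultdict
from fractions import Fraction

# Build the joint pmf of S = sum and M = maximum of two fair dice
# by enumerating the 36 equally likely outcomes.
joint = defaultdict(Fraction)
for d1 in range(1, 7):
    for d2 in range(1, 7):
        joint[(d1 + d2, max(d1, d2))] += Fraction(1, 36)

# Marginal pmfs p_S and p_M: add up the rows/columns of the joint table.
p_S, p_M = defaultdict(Fraction), defaultdict(Fraction)
for (a, b), p in joint.items():
    p_S[a] += p
    p_M[b] += p

# Quick exercise 9.1: P(S = 7, M = 4), from the outcomes (3,4) and (4,3).
print(joint[(7, 4)])  # 1/18, i.e. 2/36

# Joint distribution function F(5, 3) = P(S <= 5, M <= 3): sum the
# table entries with a <= 5 and b <= 3.
print(sum(p for (a, b), p in joint.items() if a <= 5 and b <= 3))  # 2/9
```

F(5, 3) adds the five nonzero table entries with $a \le 5$ and $b \le 3$: 1/36 + 2/36 + 1/36 + 2/36 + 2/36 = 8/36 = 2/9.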

9.4 – Independent Random Variables

Tests for independence: two random variables X and Y are independent if and only if every event involving X is independent of every event involving Y. Equivalently, for all values a and b,

$P(X \le a, Y \le b) = P(X \le a)\,P(Y \le b)$, i.e. $F(a, b) = F_X(a)\,F_Y(b)$.

For discrete X and Y this is the same as $p(a, b) = p_X(a)\,p_Y(b)$ for all a and b; for jointly continuous X and Y, the same as $f(x, y) = f_X(x)\,f_Y(y)$.

Example 3.6 (Baron book): A program consists of two modules. The number of errors X in the first module and the number of errors Y in the second module have the joint distribution P(0, 0) = P(0, 1) = P(1, 0) = 0.2, P(1, 1) = P(1, 2) = P(1, 3) = 0.1, and P(0, 2) = P(0, 3) = 0.05 (the remaining probability mass). Find (a) the marginal distributions of X and Y, (b) the probability of no errors in the first module, (c) the distribution of the total number of errors in the program, and (d) whether errors in the two modules occur independently.

Solution to Example 3.6, p. 48, in the Baron book:
(a) Marginals: $p_X(0) = p_X(1) = 0.5$; $p_Y(0) = 0.4$, $p_Y(1) = 0.3$, $p_Y(2) = p_Y(3) = 0.15$.
(b) P(X = 0) = 0.5.
(c) For the total Z = X + Y: P(Z = 0) = 0.2, P(Z = 1) = 0.4, P(Z = 2) = 0.15, P(Z = 3) = 0.15, P(Z = 4) = 0.1.
(d) The errors are not independent: for example P(X = 0, Y = 1) = 0.2, while $p_X(0)\,p_Y(1) = 0.5 \cdot 0.3 = 0.15$.
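The same computation as a short Python sketch; the pmf values are taken from the example above (with 0.05 for P(0, 2) and P(0, 3), the value forced by the probabilities summing to 1):

```python
# Joint pmf of (X, Y) from Baron's Example 3.6.
joint = {(0, 0): 0.20, (0, 1): 0.20, (0, 2): 0.05, (0, 3): 0.05,
         (1, 0): 0.20, (1, 1): 0.10, (1, 2): 0.10, (1, 3): 0.10}
assert abs(sum(joint.values()) - 1.0) < 1e-12

# (a) Marginal pmfs: sum out the other variable.
p_X, p_Y = {}, {}
for (xv, yv), p in joint.items():
    p_X[xv] = p_X.get(xv, 0.0) + p
    p_Y[yv] = p_Y.get(yv, 0.0) + p
print(p_X, p_Y)  # {0: 0.5, 1: 0.5}, {0: 0.4, 1: 0.3, 2: 0.15, 3: 0.15}
                 # (printed values may carry float rounding)

# (b) Probability of no errors in the first module.
print(p_X[0])  # 0.5

# (c) Distribution of the total number of errors Z = X + Y.
p_Z = {}
for (xv, yv), p in joint.items():
    p_Z[xv + yv] = p_Z.get(xv + yv, 0.0) + p
print(p_Z)  # {0: 0.2, 1: 0.4, 2: 0.15, 3: 0.15, 4: 0.1}

# (d) Independence: the joint pmf must factor for *every* (x, y) pair.
print(all(abs(joint[(xv, yv)] - p_X[xv] * p_Y[yv]) < 1e-12
          for (xv, yv) in joint))  # False: P(0,1) = 0.20 != 0.5 * 0.3
```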

Table 4.2 (Baron book): joint and marginal distributions in the discrete and continuous cases. In summary: in the discrete case the joint pmf is $P(x, y) = P(X = x, Y = y)$ with marginal $P_X(x) = \sum_y P(x, y)$; in the continuous case the joint density is $f(x, y)$ with marginal $f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy$.

9.2 – Joint Distributions of Continuous Random Variables

- Joint continuous distribution: as for a single continuous random variable, probabilities are attached to ranges of values rather than to individual points. X and Y have a jointly continuous distribution if there exists a function f (the joint probability density function) such that for all a and b

$F(a, b) = P(X \le a, Y \le b) = \int_{-\infty}^{a} \int_{-\infty}^{b} f(x, y)\,dy\,dx$,

where $f(x, y) \ge 0$ everywhere and $\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} f(x, y)\,dx\,dy = 1$.

- Marginal distribution function of X: $F_X(a) = \lim_{b \to \infty} F(a, b)$. Marginal distribution function of Y: $F_Y(b) = \lim_{a \to \infty} F(a, b)$.

9.2 – Joint Distributions of Continuous Random Variables

- Joint distribution function: F(a, b) can be constructed given f(x, y), and vice versa:

$F(a, b) = \int_{-\infty}^{a} \int_{-\infty}^{b} f(x, y)\,dy\,dx$, $\qquad f(x, y) = \frac{\partial^2}{\partial x\,\partial y} F(x, y)$.

- Marginal probability density function: integrate out the unwanted random variable to get the marginal density:

$f_X(x) = \int_{-\infty}^{\infty} f(x, y)\,dy$, $\qquad f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx$.

The marginal distribution function of X is obtained by letting b go to infinity: $F_X(a) = \lim_{b \to \infty} F(a, b)$.

We can also determine $f_Y(y)$ directly from $f(x, y)$ (Quick Exercise 9.5): for y between 1 and 2, integrate $f(x, y)$ over all x, i.e. $f_Y(y) = \int_{-\infty}^{\infty} f(x, y)\,dx$, taken over the x-range on which the density is positive.
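The transcript omits the textbook's running example, so here is a sketch with a hypothetical joint density (an illustrative assumption, not Dekking's example): f(x, y) = x + y on the unit square, showing how marginal densities and the joint distribution function are obtained by integration:

```python
import sympy as sp

x, y, a, b = sp.symbols('x y a b', nonnegative=True)

# Hypothetical joint density: f(x, y) = x + y for 0 <= x, y <= 1,
# and 0 elsewhere (not the textbook's running example).
f = x + y

# A valid joint density must integrate to 1 over its support.
assert sp.integrate(f, (x, 0, 1), (y, 0, 1)) == 1

# Marginal densities: integrate out the unwanted variable.
f_X = sp.integrate(f, (y, 0, 1))  # x + 1/2, for 0 <= x <= 1
f_Y = sp.integrate(f, (x, 0, 1))  # y + 1/2, for 0 <= y <= 1

# Joint distribution function F(a, b) for (a, b) inside the square.
F = sp.integrate(f, (x, 0, a), (y, 0, b))  # a*b*(a + b)/2
print(f_X, f_Y, sp.factor(F))
```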

9.3 – More than Two Random Variables

For n random variables $X_1, X_2, \ldots, X_n$ defined on the same sample space, the joint distribution function is

$F(a_1, a_2, \ldots, a_n) = P(X_1 \le a_1, X_2 \le a_2, \ldots, X_n \le a_n)$,

and, in the discrete case, the joint probability mass function is

$p(a_1, a_2, \ldots, a_n) = P(X_1 = a_1, X_2 = a_2, \ldots, X_n = a_n)$.

9.4 – Independent Random Variables

Test for independence of more than two random variables: $X_1, X_2, \ldots, X_n$ are independent if and only if, for all values $a_1, \ldots, a_n$,

$F(a_1, a_2, \ldots, a_n) = F_{X_1}(a_1)\,F_{X_2}(a_2) \cdots F_{X_n}(a_n)$,

that is, the joint distribution function factors into the product of the marginal distribution functions (and likewise for the joint pmf or joint density).

9.5 – Propagation of Independence

Independence after a change of variable: if $X_1, \ldots, X_n$ are independent random variables and $h_1, \ldots, h_n$ are functions, each applied to its own variable, then the random variables $h_1(X_1), \ldots, h_n(X_n)$ are also independent.
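A quick simulation sketch of this fact; the choice of variables (two independent die rolls), the functions h1(x) = x^2 and h2(y) = e^y, and the sample size are illustrative assumptions, not from the slides:

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.integers(1, 7, size=200_000)  # die roll
Y = rng.integers(1, 7, size=200_000)  # independent die roll

# Apply a separate function to each variable.
U, V = X**2, np.exp(Y)

# If U and V are independent, P(U = u, V = v) = P(U = u) * P(V = v).
# Empirical check for one value pair (U = 9 iff X = 3; V = e^3 iff Y = 3).
u, v = 9, np.exp(3)
p_joint = np.mean((U == u) & np.isclose(V, v))
p_prod = np.mean(U == u) * np.mean(np.isclose(V, v))
print(p_joint, p_prod)  # both close to 1/36 ~ 0.0278
```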