Chapter 9: Joint distributions and independence CIS 3033.


9.1 Joint distributions: discrete So far we have considered one random variable at a time; now we consider two or more, especially when they are defined on the same sample space. [Otherwise, consider products of sample spaces, as in Section 2.4.] What is new is the possible influence between the variables, expressed as a relation among events. For example, consider two random variables S and M, the sum and the maximum of two throws of a die.
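As a small illustrative sketch (not part of the original slides), the joint distribution of S and M can be tabulated by enumerating all 36 equally likely outcomes of the two throws:

```python
from fractions import Fraction
from collections import defaultdict

# Enumerate the 36 equally likely outcomes of two die throws and
# tabulate the joint pmf of S = sum and M = maximum.
joint = defaultdict(Fraction)
for d1 in range(1, 7):
    for d2 in range(1, 7):
        s, m = d1 + d2, max(d1, d2)
        joint[(s, m)] += Fraction(1, 36)

# P(S = 7, M = 4): only the outcomes (3, 4) and (4, 3) qualify.
print(joint[(7, 4)])   # 1/18
```

Exact fractions avoid any floating-point noise, so entries such as P(S = 7, M = 4) = 2/36 = 1/18 come out exactly.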

9.1 Joint distributions: discrete The joint probability mass function of discrete random variables X and Y (defined on the same sample space Ω) is the function p: ℝ² → [0, 1] defined by p(a, b) = P(X = a, Y = b) for −∞ < a, b < ∞. The joint distribution function of X and Y is the function F: ℝ² → [0, 1] defined by F(a, b) = P(X ≤ a, Y ≤ b) for −∞ < a, b < ∞. The marginal probability mass function of X (or of Y) is obtained from p(a, b) by summing over all values of the other variable: p_X(a) = Σ_b p(a, b) and p_Y(b) = Σ_a p(a, b).
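Marginalization can be sketched in code, reusing the S (sum) and M (maximum) example from above; summing p(a, b) over all values of S yields the marginal pmf of M:

```python
from fractions import Fraction
from collections import defaultdict

# Joint pmf of S (sum) and M (max) of two die throws, then the
# marginal pmf of M obtained by summing p(a, b) over all values a of S.
joint = defaultdict(Fraction)
for d1 in range(1, 7):
    for d2 in range(1, 7):
        joint[(d1 + d2, max(d1, d2))] += Fraction(1, 36)

p_M = defaultdict(Fraction)
for (s, m), p in joint.items():
    p_M[m] += p          # marginalize out S

print(p_M[3])  # P(M = 3) = 5/36
```

The five outcomes (1,3), (2,3), (3,3), (3,1), (3,2) have maximum 3, so P(M = 3) = 5/36.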


In general, the joint probability mass function of X and Y cannot be recovered from the marginal probability mass functions p_X and p_Y alone: different joint distributions can have the same marginals. The same holds for the distribution functions.
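A minimal counterexample (an assumed illustration, not from the slides): two different joint pmfs on {0, 1} × {0, 1} with identical marginals, one independent, one perfectly dependent.

```python
from fractions import Fraction

# Two different joint pmfs on {0,1} x {0,1} with identical marginals.
half, quarter = Fraction(1, 2), Fraction(1, 4)
joint_indep = {(x, y): quarter for x in (0, 1) for y in (0, 1)}
joint_dep = {(0, 0): half, (1, 1): half, (0, 1): 0, (1, 0): 0}

def marginal_X(joint):
    # Sum p(a, b) over b for each value a of X.
    return {x: sum(p for (a, _), p in joint.items() if a == x) for x in (0, 1)}

print(marginal_X(joint_indep) == marginal_X(joint_dep))  # True
```

Both marginals are uniform on {0, 1}, yet the joints differ, so the marginals alone cannot determine the joint.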

9.2 Joint distributions: continuous Random variables X and Y have a joint continuous distribution if there is a function f: ℝ² → [0, ∞), the joint probability density function, such that F(a, b) = P(X ≤ a, Y ≤ b) = ∫_{−∞}^{a} ∫_{−∞}^{b} f(x, y) dy dx for −∞ < a, b < ∞. The marginal density of X is obtained by integrating out the other variable: f_X(x) = ∫_{−∞}^{∞} f(x, y) dy. For the distribution functions, the relation between joint and marginal is the same as in the discrete case, as given in formulas (9.1) and (9.2).
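The relation between F and f can be sanity-checked numerically. This sketch assumes the example density f(x, y) = x + y on the unit square (a valid density, since it integrates to 1 there) and approximates the double integral with a midpoint Riemann sum:

```python
# Numerical sanity check that F(a, b) is the double integral of the
# joint density; example density f(x, y) = x + y on [0, 1]^2.
def f(x, y):
    return x + y

def F(a, b, n=400):
    # Midpoint Riemann-sum approximation of the integral over [0,a] x [0,b].
    return sum(f((i + 0.5) * a / n, (j + 0.5) * b / n)
               for i in range(n) for j in range(n)) * (a / n) * (b / n)

def exact(a, b):
    # Closed form: integral of (x + y) over [0,a] x [0,b] = a*b*(a + b)/2.
    return 0.5 * a * b * (a + b)

print(abs(F(0.7, 0.5) - exact(0.7, 0.5)) < 1e-6)  # True
```

Because the integrand is linear, the midpoint rule is essentially exact here; for general densities the approximation error shrinks as n grows.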

9.3 More than two random variables The joint distribution function F of X_1, X_2, ..., X_n (all defined on the same sample space Ω) is defined by F(a_1, a_2, ..., a_n) = P(X_1 ≤ a_1, X_2 ≤ a_2, ..., X_n ≤ a_n) for −∞ < a_1, a_2, ..., a_n < ∞. A joint probability mass function p can be defined for discrete random variables, and a joint density function f for continuous random variables, just as in the two-variable case.

9.3 More than two random variables Suppose a vase contains N balls numbered 1, 2, ..., N, and we draw n balls without replacement; let X_i be the number on the i-th ball drawn. Since there are N(N−1) · · · (N−n+1) equally likely ordered outcomes for the values of X_1, X_2, ..., X_n, the joint probability mass function is p(a_1, a_2, ..., a_n) = P(X_1 = a_1, X_2 = a_2, ..., X_n = a_n) = 1 / [N(N−1) · · · (N−n+1)] for all distinct values a_1, a_2, ..., a_n with 1 ≤ a_j ≤ N. The marginal distribution of each X_i is uniform: p_{X_i}(k) = 1/N for k = 1, ..., N.
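The vase example can be verified exhaustively for small values, here assuming N = 5 and n = 3: every ordered draw is equally likely, and each marginal is uniform.

```python
from fractions import Fraction
from itertools import permutations

# Vase example with N = 5 balls, n = 3 draws without replacement:
# each ordered draw of distinct numbers has probability 1/(N(N-1)(N-2)).
N, n = 5, 3
draws = list(permutations(range(1, N + 1), n))
p_each = Fraction(1, len(draws))          # 1 / (5 * 4 * 3) = 1/60

print(p_each)                              # 1/60
# Marginal of X_2: total probability of draws whose second ball equals k.
p_X2 = {k: sum(p_each for d in draws if d[1] == k) for k in range(1, N + 1)}
print(p_X2[4])                             # 1/5
```

For each k there are 4 · 3 = 12 of the 60 draws with second ball k, so p_{X_2}(k) = 12/60 = 1/5 = 1/N, matching the claim.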

9.4 Independent random variables Random variables X and Y are independent if every event involving only X is independent of every event involving only Y. Random variables that are not independent are called dependent. Equivalently, X and Y are independent if P(X ≤ a, Y ≤ b) = P(X ≤ a) P(Y ≤ b), that is, if the joint distribution function factorizes: F(a, b) = F_X(a) F_Y(b) for all values of a and b. The same factorization criterion applies to the probability mass function, p(a, b) = p_X(a) p_Y(b), and to the density function, f(x, y) = f_X(x) f_Y(y).
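The factorization criterion can be sketched for two independent fair die throws X and Y, checking F(a, b) = F_X(a) F_Y(b) on the whole grid of values:

```python
from fractions import Fraction

# Two independent fair die throws X and Y: verify that the joint
# distribution function factorizes, F(a, b) = F_X(a) * F_Y(b).
p = Fraction(1, 6)

def F_single(a):
    return sum(p for k in range((1, 7)[0], 7) if k <= a)

def F_joint(a, b):
    # Joint cdf built from the joint pmf p(x, y) = 1/36 of the two throws.
    return sum(p * p for x in range(1, 7) for y in range(1, 7)
               if x <= a and y <= b)

ok = all(F_joint(a, b) == F_single(a) * F_single(b)
         for a in range(1, 7) for b in range(1, 7))
print(ok)  # True
```

The check passes by construction, since the joint pmf 1/36 is exactly the product of the two marginal pmfs 1/6 · 1/6; a dependent pair (such as S and M above) would fail it.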

9.5 Propagation of independence