Distribution laws of random variables. The binomial distribution law.

Sensor Model The prior p(θ | I); the sensor reliability P(λ | I); the likelihood p(y | θ, λ, I).

Outline Random experiments Samples and Sample Space Events Probability Space Axioms of Probability Conditional Probability Bayes’ Rule Independence of Events

Random Experiments A random experiment is an experiment in which the outcome varies in an unpredictable fashion when the experiment is repeated under the same conditions. A random experiment is specified by stating an experimental procedure and a set of one or more measurements or observations. Examples: E1: Toss a coin three times and note the sides facing up (heads or tails). E2: Pick a number at random between zero and one. E3: Poses done by a rookie dancer.

Samples and Sample Space A sample point (ζ), or an outcome of a random experiment, is defined as a result that cannot be decomposed into other results. Sample space (S): defined as the set of all possible outcomes from a random experiment. Countable or discrete sample space: one-to-one correspondence between outcomes and integers. Uncountable or continuous sample space.

Events An event is a subset of the sample space S, i.e. a set of sample points. Two special events: the certain event S, and the impossible or null event ∅.

Probability Space {S, E, P} S: sample space, the set of outcomes {ζ} of a random experiment. E: event space, a collection of subsets {A} of the sample space. P: probability measure of an event, P(A), ranging over [0, 1] and encoding how likely an event is to happen.

Axioms of Probability (i) 0 ≤ P(A); (ii) P(S) = 1; (iii) if A ∩ B = ∅, then P(A ∪ B) = P(A) + P(B); (iii') given a sequence of events Ai with Ai ∩ Aj = ∅ for all i ≠ j, P(∪_{i=1}^{∞} Ai) = Σ_{i=1}^{∞} P(Ai), referred to as countable additivity.

Some Corollaries P(Ac) = 1 − P(A); P(∅) = 0; P(A) ≤ 1; given events A1, …, An with Ai ∩ Aj = ∅ for all i ≠ j, P(∪_{i=1}^{n} Ai) = Σ_{i=1}^{n} P(Ai); P(A ∪ B) = P(A) + P(B) − P(A ∩ B). (Figure: Venn diagram of A, B and A ∩ B.)

Conditional Probability P(A | B) = P(A ∩ B) / P(B), for P(B) > 0. Imagine that the probability of an event is proportional to the size of its area: once B is known to have occurred, A can occur only through A ∩ B, rescaled by the area of B. (Figure: Venn diagram of A, B and A ∩ B inside S.)

Theorem of Total Probability Let {B1, …, B7} be a partition of the sample space S. Then P(A) = Σ_{i=1}^{7} P(A ∩ Bi) = Σ_{i=1}^{7} P(A | Bi) P(Bi). (Figure: S partitioned into B1 … B7, with A overlapping several of the Bi.)
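
A minimal numerical sketch of the theorem in Python; the partition size and all probabilities below are hypothetical, chosen only for illustration:

```python
# Total probability over a hypothetical partition {B1, B2, B3} of S;
# all numbers are illustrative, not taken from the slides.
P_B = [0.5, 0.3, 0.2]          # P(Bi), must sum to 1
P_A_given_B = [0.1, 0.6, 0.9]  # P(A | Bi)

# P(A) = sum_i P(A | Bi) * P(Bi)
P_A = sum(pa * pb for pa, pb in zip(P_A_given_B, P_B))
print(P_A)  # 0.05 + 0.18 + 0.18 ~ 0.41
```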

Bayes’ Rule Let {Bi} be a partition of the sample space S. Suppose that event A occurs; what is the probability of the event Bj? By the definition of conditional probability, P(Bj | A) = P(A ∩ Bj) / P(A) = P(A | Bj) P(Bj) / Σ_i P(A | Bi) P(Bi). Here P(Bj) is the a priori probability and P(Bj | A) the a posteriori probability.
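
Continuing the sketch above, each posterior term is one summand of the total-probability sum divided by P(A) (same hypothetical numbers):

```python
# Bayes' rule with the same hypothetical numbers as the previous sketch:
# P(Bj | A) = P(A | Bj) * P(Bj) / sum_i P(A | Bi) * P(Bi).
P_B = [0.5, 0.3, 0.2]          # a priori P(Bi)
P_A_given_B = [0.1, 0.6, 0.9]  # P(A | Bi)

P_A = sum(pa * pb for pa, pb in zip(P_A_given_B, P_B))
posterior = [pa * pb / P_A for pa, pb in zip(P_A_given_B, P_B)]
print(posterior)  # a posteriori P(Bi | A), sums to 1: ~[0.122, 0.439, 0.439]
```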

Independence of Events If knowledge of the occurrence of an event B does not alter the probability of some other event A, i.e. P(A | B) = P(A), then it is natural to say that A is independent of B; equivalently, P(A ∩ B) = P(A) P(B). The most common application of the independence concept is the assumption that the events of separate experiments are independent; such experiments are referred to as independent experiments.

An Example of Independent Events Experiment: toss a coin twice. E1: the first toss is a head. E2: the second toss is a tail. Consider the experiment constructed by concatenating two separate random experiments, each a single coin toss. (Figure: the four outcomes (H,H), (H,T), (T,H), (T,T) arranged on a grid.)
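
The claim can be checked by brute-force enumeration of the four equally likely outcomes; a sketch assuming a fair coin:

```python
from itertools import product

# The four equally likely outcomes of two tosses of a fair coin.
outcomes = list(product("HT", repeat=2))   # ('H','H'), ('H','T'), ...

E1 = {o for o in outcomes if o[0] == "H"}  # first toss is a head
E2 = {o for o in outcomes if o[1] == "T"}  # second toss is a tail

def prob(A):
    return len(A) / len(outcomes)

# P(E1 and E2) = P(E1) * P(E2) = 0.25: the events are independent.
print(prob(E1 & E2), prob(E1) * prob(E2))
```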

Last time … Random experiments Samples and Sample Space Events Probability Space Axioms of Probability Conditional Probability Bayes’ Rule Independence of Events

Random Variables A random variable X is a function that assigns a real number, X(ζ), to each outcome ζ in the sample space of a random experiment. (Figure: X maps each ζ in S to a point x on the real line; the set of values is SX.)

Random Variables Let SX be the set of values taken by X. X(ζ) can be considered as the outcome of a new random experiment, with X(ζ) a function of ζ, the outcome of the original experiment. (Figure: the same mapping from S to the real line.)

Examples E1: Toss a coin three times. S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}. X(ζ) = number of heads in three coin tosses. Note that sometimes several outcomes ζ share the same value of X(ζ). SX = {0, 1, 2, 3}; X is then a random variable taking on the values in the set SX. If the outcome ζ of some experiment is already a numerical value, we can immediately reinterpret the outcome as a random variable defined by X(ζ) = ζ.
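
This example is small enough to enumerate directly; a sketch in Python (fair coin assumed, so all eight outcomes are equally likely):

```python
from collections import Counter
from itertools import product

# Sample space of three tosses of a fair coin: 8 equally likely outcomes.
S = list(product("HT", repeat=3))

def X(zeta):
    """X(zeta) = number of heads in the outcome zeta."""
    return zeta.count("H")

# Several outcomes share one value, e.g. HHT, HTH, THH all map to 2.
counts = Counter(X(zeta) for zeta in S)
pmf = {x: counts[x] / len(S) for x in sorted(counts)}
print(pmf)  # {0: 0.125, 1: 0.375, 2: 0.375, 3: 0.125}
```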

Probabilities and Random Variables Let A be an event of the original experiment. There is a corresponding equivalent event B in the new experiment, with X(ζ) as outcome, such that A = {ζ ∈ S : X(ζ) ∈ B} or B = {X(ζ) ∈ SX : ζ ∈ A}. (Figure: the event B on the real line pulled back to the event A in S.)

Probabilities and Random Variables P(B) = P(A) = P({ζ : X(ζ) ∈ B}). Two typical events: B: {X = x} and B: {X ∈ I} for an interval I.

The Cumulative Distribution Function The cumulative distribution function (cdf) of a random variable X is defined as the probability of the event {X ≤ x}: FX(x) = P(X ≤ x) for −∞ < x < +∞. In terms of the underlying random experiment, FX(x) = P(X ≤ x) = P({ζ : X(ζ) ≤ x}). The cdf is simply a convenient way of specifying the probability of all semi-infinite intervals (−∞, x].

Major Properties of the cdf 0 ≤ FX(x) ≤ 1; lim_{x→+∞} FX(x) = 1; lim_{x→−∞} FX(x) = 0; FX(x) is a non-decreasing function of x, that is, if a < b, then FX(a) ≤ FX(b).

Probability of an Event Let A = {a < X ≤ b} with b > a. Then P(A) = P(a < X ≤ b) = FX(b) − FX(a).

The Probability Mass Function Discrete random variables are those random variables taking values in a countable set of points. The probability mass function (pmf) is the set of probabilities pX(xk) = P(X = xk) of the elements in SX. For the three-toss example, S = {HHH, HHT, HTH, HTT, THH, THT, TTH, TTT}, SX = {0, 1, 2, 3}, and pX(0) = 1/8, pX(1) = 3/8, pX(2) = 3/8, pX(3) = 1/8. (Figure: the pmf as stems of height 1/8, 3/8, 3/8, 1/8 at x = 0, 1, 2, 3, and the cdf as a staircase through 1/8, 1/2, 7/8, 1.)
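
These probabilities are an instance of the binomial law from the title, with n = 3 tosses and success probability p = 1/2; a quick check:

```python
from itertools import accumulate
from math import comb

# pmf of the number of heads in n fair tosses: the binomial law
# pX(k) = C(n, k) * p**k * (1 - p)**(n - k), here with n = 3, p = 1/2.
n, p = 3, 0.5
pmf = [comb(n, k) * p**k * (1 - p) ** (n - k) for k in range(n + 1)]
print(pmf)                    # [0.125, 0.375, 0.375, 0.125]

# The cdf is the running sum of the pmf: 1/8, 1/2, 7/8, 1.
print(list(accumulate(pmf)))  # [0.125, 0.5, 0.875, 1.0]
```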

The Probability Density Function A continuous random variable is defined as a random variable whose cdf is continuous everywhere and, in addition, sufficiently smooth that it can be written as an integral of some nonnegative function f(x): FX(x) = ∫_{−∞}^{x} f(t) dt. The probability density function (pdf) of X is defined as the derivative of FX(x): pX(x) = dFX(x)/dx.

The Probability Density Function pX(x) a b When a b, P(aX b) pX((a+b)/2) |b-a| X Support of X

An Example pX(x) = 1 when x ∈ [0, 1], otherwise 0; the cdf is FX(x) = x on [0, 1]. (Figure: the flat pdf of height 1 on [0, 1], and the cdf rising linearly from 0 to 1.)
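
For this pdf the fraction of samples at or below x should land near x itself; a quick empirical sketch using only the standard library:

```python
import random

# Empirical check that FX(x) = x for X ~ U[0, 1].
random.seed(0)
samples = [random.random() for _ in range(100_000)]
for x in (0.25, 0.5, 0.9):
    F_hat = sum(s <= x for s in samples) / len(samples)
    print(x, round(F_hat, 3))  # estimates close to 0.25, 0.5, 0.9
```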

FX|Yy(x)=P(Xx|Yy)=P(Xx,Y y)/P(Yy) Multiple R.V. Joint cdf : FX,Y(x,y)=P(Xx,Y y) Conditional cdf: if P(Yy) > 0, FX|Yy(x)=P(Xx|Yy)=P(Xx,Y y)/P(Yy) Joint pdf: pX,Y(x,y) the 2nd order derivatives of the joint cdf, usually independent of the order when pdfs are continuous. Marginal pdf: pY (y) = X pX,Y(x,y)dx Conditional pdf: pX|Y=y(x)=pX,Y(x,y)/ pY (y)

Expectation The expectation of a random variable is the weighted average of the values in its support: E{X} = Σk xk pX(xk) for a discrete X, and E{X} = ∫X x pX(x) dx for a continuous X.

Smoothing Property of Conditional Expectation EY|X{Y | X = x} = g(x), a function of x; smoothing: E{Y} = EX{EY|X{Y | X}} = EX{g(X)}.

Fundamental Theorem of Expectations Let Y = g(X). Then E{Y} = E{g(X)} = ∫X g(x) pX(x) dx, so E{Y} can be computed without first deriving the distribution of Y. Recall that E{Y} = EX{EY|X{Y | X = x}}.
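
A Monte Carlo sketch of the theorem; X ~ U[0, 1] and g(x) = x² are illustrative choices, for which E{g(X)} = ∫ x² dx = 1/3:

```python
import random

# Estimate E{g(X)} by averaging g over samples of X, without ever
# deriving the distribution of Y = g(X).
random.seed(0)
xs = [random.random() for _ in range(100_000)]
estimate = sum(x**2 for x in xs) / len(xs)
print(estimate)  # ~0.333
```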

Var(X)=X (x-E(X))2 pX(x) dx Variance of a R.V. Weighted difference from the expectation Var(X)=X (x-E(X))2 pX(x) dx

Last Time … Random variables: cdf, pmf, pdf, expectation, variance.

Correlation and Covariance of Two R.V. Let X and Y be two random variables. The correlation between X and Y is given by E{XY} = ∫∫ x y pX,Y(x, y) dx dy. The covariance of X and Y is given by COV(X, Y) = E{(X − E{X})(Y − E{Y})} = ∫∫ (x − E{X})(y − E{Y}) pX,Y(x, y) dx dy = E{XY} − E{X}E{Y}. When COV(X, Y) = 0, i.e. E{XY} = E{X}E{Y}, X and Y are (linearly) uncorrelated. When E{XY} = 0, X and Y are orthogonal.

Correlation coefficient ρX,Y = COV(X, Y) / (σX σY), where σX and σY are the standard deviations of X and Y; ρX,Y always lies in [−1, 1].
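
A sample-based sketch of this formula, using an illustrative pair with a built-in linear part, Y = 2X + noise, for which the population value is ρ = 2/√5 ≈ 0.894:

```python
import random

# Sample correlation coefficient rho = COV(X, Y) / (sigma_X * sigma_Y).
random.seed(0)
xs = [random.gauss(0, 1) for _ in range(50_000)]
ys = [2 * x + random.gauss(0, 1) for x in xs]

def mean(v):
    return sum(v) / len(v)

mx, my = mean(xs), mean(ys)
cov = mean([(x - mx) * (y - my) for x, y in zip(xs, ys)])
sx = mean([(x - mx) ** 2 for x in xs]) ** 0.5
sy = mean([(y - my) ** 2 for y in ys]) ** 0.5
print(cov / (sx * sy))  # near 2 / sqrt(5) ~ 0.894
```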

Independent R.V.s If pX|Y=y(x) = pX,Y(x, y) / pY(y) = pX(x) for all x and y, X and Y are independent random variables, i.e. independence ⇔ pX,Y(x, y) = pX(x) pY(y) ∀ x, y.

Independent vs. Uncorrelated Independent ⇒ uncorrelated, but uncorrelated does not imply independent. Example of uncorrelated but dependent random variables: let Θ ~ U[0, 2π], X = cos(Θ) and Y = sin(Θ). Then E{X} = (1/2π) ∫_{0}^{2π} cos(θ) dθ = 0 and E{Y} = 0; COV(X, Y) = (1/2π) ∫_{0}^{2π} cos(θ) sin(θ) dθ = (1/4π) ∫_{0}^{2π} sin(2θ) dθ = 0. Yet X² + Y² = 1, so knowing X constrains Y: the variables are dependent. If X and Y are jointly Gaussian, independent ⇔ uncorrelated.
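
The example can be confirmed by simulation; a sketch drawing Θ uniformly on [0, 2π]:

```python
import math
import random

# X = cos(Theta), Y = sin(Theta) with Theta ~ U[0, 2*pi]:
# the covariance is ~0 (uncorrelated), yet X**2 + Y**2 = 1 (dependent).
random.seed(0)
thetas = [random.uniform(0, 2 * math.pi) for _ in range(200_000)]
xs = [math.cos(t) for t in thetas]
ys = [math.sin(t) for t in thetas]

n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
print(round(cov, 4))            # ~0.0: uncorrelated
print(xs[0] ** 2 + ys[0] ** 2)  # ~1.0 for every sample: dependent
```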

Covariance Matrix of a Random Vector X = (X1, X2, …, Xn)T. The covariance matrix of the random vector X is given by CX = E{(X − E{X})(X − E{X})T}, with CX(i, j) = COV(Xi, Xj). Properties of CX: symmetric, CX = CXT, i.e. CX(i, j) = CX(j, i); positive semi-definite, ∀ α ∈ Rn, αT CX α ≥ 0, since VAR(αT X) = αT CX α ≥ 0.
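
A NumPy sketch with illustrative data in which X2 is built from X1; np.cov estimates CX, and both properties can be checked numerically:

```python
import numpy as np

# Covariance matrix CX of a random vector X = (X1, X2, X3)^T,
# estimated from samples; the data here are purely illustrative.
rng = np.random.default_rng(0)
x1 = rng.normal(size=10_000)
x2 = 0.8 * x1 + rng.normal(size=10_000)
x3 = rng.normal(size=10_000)
X = np.stack([x1, x2, x3])   # rows are variables, columns are samples

C = np.cov(X)                # C[i, j] estimates COV(Xi, Xj)
print(np.allclose(C, C.T))                      # True: symmetric
print(np.all(np.linalg.eigvalsh(C) >= -1e-12))  # True: positive semi-definite
```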

First-Order Markov Process Let {X1, X2, …, Xn} be a sequence of random variables (or vectors), e.g. the joint-angle vectors in a gait cycle over a period of time. We say this process is a first-order Markov process if the following Markov property holds: P(Xn = xn | Xn−1 = xn−1, …, X1 = x1) = P(Xn = xn | Xn−1 = xn−1), i.e. P(future | present, past) = P(future | present): a process with memory limited to one step only.

Chain Rule P(Xn = xn, Xn−1 = xn−1, …, X1 = x1) = P(Xn = xn | Xn−1 = xn−1, …, X1 = x1) P(Xn−1 = xn−1, …, X1 = x1) = P(Xn = xn | Xn−1 = xn−1) P(Xn−1 = xn−1, …, X1 = x1) = P(Xn = xn | Xn−1 = xn−1) P(Xn−1 = xn−1 | Xn−2 = xn−2) … P(X1 = x1) = P(X1 = x1) ∏_{k=2}^{n} P(Xk = xk | Xk−1 = xk−1).

Dynamic Systems System (state) and observation equations, e.g. Xk = f(Xk−1, Wk) and Yk = g(Xk, Vk); by the chain rule and the Markov property, the joint distribution factorizes into the initial state distribution, the one-step state transitions P(Xk | Xk−1), and the observation likelihoods P(Yk | Xk).

Discrete State Markov Chains Given a finite discrete set S of states, a Markov chain process occupies one of these states at each unit of time. The process either stays in the same state or moves to some other state in S. (Figure: a three-state chain S1, S2, S3 with transition probabilities 1/2, 1/3, 2/3, 1/6 on its arrows.)
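
A simulation sketch; note the transition matrix below is hypothetical, since the diagram lists only some arrow probabilities (1/2, 1/3, 2/3, 1/6) and does not determine a full matrix:

```python
import random

# Simulate a three-state Markov chain; each row of P holds
# P(next state | current state) and sums to 1.
states = ["S1", "S2", "S3"]
P = [
    [1/2, 1/2, 0.0],   # from S1: stay, or move to S2
    [0.0, 1/3, 2/3],   # from S2: stay, or move to S3
    [1/6, 0.0, 5/6],   # from S3: move to S1, or stay
]

random.seed(0)
i, path = 0, ["S1"]
for _ in range(10):
    i = random.choices(range(3), weights=P[i])[0]
    path.append(states[i])
print(path)
```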

Good luck