Basic Ideas in Probability and Statistics for Experimenters, Part I: Qualitative Discussion
Shivkumar Kalyanaraman, Rensselaer Polytechnic Institute
shivkuma@ecse.rpi.edu, http://www.ecse.rpi.edu/Homepages/shivkuma
Based in part upon slides of Prof. Raj Jain (OSU)
"He uses statistics as a drunken man uses lamp-posts – for support rather than for illumination." – A. Lang

Overview
- Why probability and statistics: the empirical design method
- Qualitative understanding of essential probability and statistics
- Especially the notion of inference and statistical significance
- Key distributions and why we care about them
- Reference: Chap. 12-13 (Jain), Chap. 2-3 (Box, Hunter, Hunter), and http://mathworld.wolfram.com/topics/ProbabilityandStatistics.html

Why Care About Prob. & Statistics?
- How to make this empirical design process EFFICIENT?
- How to avoid pitfalls in inference?

Probability
- Think of probability as modeling an experiment.
- The set of all possible outcomes is the sample space: S.
- Classic "experiment": tossing a die: S = {1, 2, 3, 4, 5, 6}.
- Any subset A of S is an event: A = {the outcome is even} = {2, 4, 6}.

Probability of Events: Axioms
P is a probability (mass) function if it maps each event A into a real number P(A) such that:
i.) 0 ≤ P(A) ≤ 1 for every event A
ii.) P(S) = 1
iii.) If A and B are mutually exclusive events, then P(A ∪ B) = P(A) + P(B)

Probability of Events
In fact, for any sequence of pairwise mutually exclusive events A1, A2, ..., we have
P(A1 ∪ A2 ∪ ...) = P(A1) + P(A2) + ...

Other Properties
Standard consequences of the axioms, for example:
- P(Aᶜ) = 1 - P(A)
- P(∅) = 0
- If A ⊆ B, then P(A) ≤ P(B)
- P(A ∪ B) = P(A) + P(B) - P(A ∩ B)
Derived by breaking up the sets involved into mutually exclusive pieces and comparing to the fundamental axioms!

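A minimal Python sketch (added for illustration, not part of the original slides) that checks these consequences numerically on the fair-die sample space, defining P(A) = |A| / |S|:

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}                      # sample space for one fair die

def P(event):
    """Probability of an event under the equally-likely (fair die) model."""
    return Fraction(len(event & S), len(S))

A = {2, 4, 6}       # "outcome is even"
B = {4, 5, 6}       # "outcome is at least 4"

assert P(S - A) == 1 - P(A)                 # complement rule
assert P(set()) == 0                        # impossible event
assert P(A) <= P(S)                         # monotonicity: A subset of S
assert P(A | B) == P(A) + P(B) - P(A & B)   # inclusion-exclusion
print("all checks pass:", P(A), P(B), P(A | B))
```
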
Conditional Probability
- P(A|B) = P(A ∩ B) / P(B) = the (conditional) probability that the outcome is in A given that we know the outcome is in B (defined when P(B) > 0).
- Example: toss one die (a concrete calculation is sketched below).

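A short sketch (added for illustration; the particular events A and B are assumptions chosen to make the die example concrete):

```python
from fractions import Fraction

S = {1, 2, 3, 4, 5, 6}
A = {4, 5, 6}   # hypothetical event: "outcome greater than 3"
B = {2, 4, 6}   # hypothetical event: "outcome is even"

def P(event):
    return Fraction(len(event & S), len(S))

def P_given(A, B):
    """P(A|B) = P(A n B) / P(B), defined for P(B) > 0."""
    return P(A & B) / P(B)

print("P(A)   =", P(A))            # 1/2
print("P(A|B) =", P_given(A, B))   # 2/3: knowing the toss is even raises the probability
```
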
Independence
- Events A and B are independent if P(AB) = P(A)P(B).
- Equivalently: P(A|B) = P(A) and P(B|A) = P(B).
- Example: a card is selected at random from an ordinary deck of cards.
  - A = event that the card is an ace.
  - B = event that the card is a diamond.
  - P(AB) = 1/52 = (4/52)(13/52) = P(A)P(B), so A and B are independent.

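A quick numeric check of the card example (Python sketch added for illustration), enumerating the 52-card deck:

```python
from fractions import Fraction
from itertools import product

ranks = ["A", "2", "3", "4", "5", "6", "7", "8", "9", "10", "J", "Q", "K"]
suits = ["spades", "hearts", "diamonds", "clubs"]
deck = set(product(ranks, suits))           # 52 equally likely outcomes

A = {card for card in deck if card[0] == "A"}          # card is an ace
B = {card for card in deck if card[1] == "diamonds"}   # card is a diamond

def P(event):
    return Fraction(len(event), len(deck))

print(P(A & B), "==", P(A) * P(B))          # 1/52 == 1/52, so A and B are independent
assert P(A & B) == P(A) * P(B)
```
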
Random Variable as a Measurement
- We cannot always give an exact description of the sample space, but we can still describe specific measurements on it:
  - the temperature change produced,
  - the number of photons emitted in one millisecond,
  - the time of arrival of the packet.

Random Variable as a Measurement (continued)
- Thus a random variable can be thought of as a measurement on an experiment: a function X(s) that assigns a real number to each outcome s of the sample space.

Probability Mass Function for a Random Variable
- The probability mass function (PMF) for a (discrete-valued) random variable X is P_X(x) = P(X = x).
- Note that 0 ≤ P_X(x) ≤ 1 for every x.
- Also, for a (discrete-valued) random variable X, Σ_x P_X(x) = 1.

Probability Distribution Function (pdf), a.k.a. frequency histogram or p.m.f. (for a discrete r.v.)

Cumulative Distribution Function
- The cumulative distribution function (CDF) for a random variable X is F_X(x) = P(X ≤ x).
- Note that F_X(x) is non-decreasing in x, i.e. x1 ≤ x2 implies F_X(x1) ≤ F_X(x2).
- Also, F_X(x) → 0 as x → -∞ and F_X(x) → 1 as x → +∞.

PMF and CDF: Example

Expectation of a Random Variable
- The expectation (average) of a (discrete-valued) random variable X is E(X) = Σ_x x P(X = x).
- Three coins example: with X = the number of heads in three fair coin tosses, E(X) = 0(1/8) + 1(3/8) + 2(3/8) + 3(1/8) = 1.5 (worked in the sketch below).

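A small Python sketch (added for illustration; the slide's own worked numbers are not in the scraped text) computing the PMF and expectation of X = number of heads in three fair coin tosses:

```python
from fractions import Fraction
from itertools import product

outcomes = list(product("HT", repeat=3))     # 8 equally likely outcomes
pmf = {}
for o in outcomes:
    x = o.count("H")                          # X = number of heads
    pmf[x] = pmf.get(x, Fraction(0)) + Fraction(1, len(outcomes))

print("PMF:", pmf)                            # probabilities 1/8, 3/8, 3/8, 1/8
E_X = sum(x * p for x, p in pmf.items())
print("E(X) =", E_X)                          # 3/2
assert sum(pmf.values()) == 1                 # PMF sums to one
```
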
Expectation (Mean) = Center of Gravity

Median, Mode
- Median = F⁻¹(0.5), where F = CDF.
  - a.k.a. the 50th-percentile element.
  - I.e., order the values and pick the middle element.
  - Used when the distribution is skewed.
- Mode: the most frequent or highest-probability value.
  - Multiple modes are possible.
  - Need not be the "central" element.
  - The mode may not exist (e.g., uniform distribution).
  - Used with categorical variables.

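For concreteness, a sketch (not part of the original deck) using Python's standard library on a hypothetical, skewed sample:

```python
import statistics

data = [1, 2, 2, 3, 3, 3, 4, 50]     # hypothetical skewed sample (one large outlier)

print("mean   =", statistics.mean(data))       # 8.5, dragged up by the outlier
print("median =", statistics.median(data))     # 3.0, robust to the outlier
print("mode   =", statistics.mode(data))       # 3, the most frequent value
print("modes  =", statistics.multimode(data))  # all values tied for most frequent
```
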
Measures of Spread/Dispersion: Why Care?
You can drown in a river of average depth 6 inches!

Standard Deviation, Coeff. of Variation, SIQR
- Variance: the second moment around the mean: σ² = E((X - μ)²).
- Standard deviation = σ.
- Coefficient of Variation (C.o.V.) = σ/μ.
- SIQR = Semi-Inter-Quartile Range (used with the median, i.e. the 50th percentile)
  = (75th percentile - 25th percentile) / 2.

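A Python sketch (illustrative, with made-up data) computing all four spread measures:

```python
import numpy as np

x = np.array([2.0, 4.0, 4.0, 4.0, 5.0, 5.0, 7.0, 9.0])   # hypothetical sample

mu = x.mean()
var = x.var()                      # population variance, E((X - mu)^2)
sigma = x.std()                    # standard deviation
cov = sigma / mu                   # coefficient of variation
q75, q25 = np.percentile(x, [75, 25])
siqr = (q75 - q25) / 2             # semi-inter-quartile range

print(f"mean={mu}, var={var}, std={sigma}, CoV={cov:.3f}, SIQR={siqr}")
```
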
Covariance and Correlation: Measures of Dependence
- Covariance: Cov(X_i, X_j) = E[(X_i - μ_i)(X_j - μ_j)].
- For i = j, covariance = variance!
- Independence => covariance = 0 (not vice-versa!).
- Correlation (coefficient) is a normalized (scale-less) form of covariance:
  ρ(X_i, X_j) = Cov(X_i, X_j) / (σ_i σ_j).
- Between -1 and +1.
- Zero => no correlation (uncorrelated).
- Note: uncorrelated DOES NOT mean independent!

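A sketch (added illustration) estimating covariance and correlation from samples, plus the classic uncorrelated-but-dependent case Y = X²:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(size=100_000)
y = 2.0 * x + rng.normal(size=100_000)           # strongly dependent on x

print("cov(x, y)  ~", np.cov(x, y)[0, 1])        # roughly 2
print("corr(x, y) ~", np.corrcoef(x, y)[0, 1])   # close to +0.9

z = x**2                                          # dependent on x, yet uncorrelated with it
print("corr(x, x^2) ~", np.corrcoef(x, z)[0, 1])  # close to 0: uncorrelated != independent
```
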
Continuous-valued Random Variables
- So far we have focused on discrete(-valued) random variables, e.g. X(s) must be an integer.
- Examples of discrete random variables: number of arrivals in one second, number of attempts until success.
- A continuous-valued random variable takes on a range of real values, e.g. X(s) ranges from 0 to ∞ as s varies.
- Examples of continuous(-valued) random variables: time when a particular arrival occurs, time between consecutive arrivals.

Continuous-valued Random Variables (continued)
- Thus, for a continuous random variable X, we can define its probability density function (pdf) f_X(x) = dF_X(x)/dx.
- Note that since F_X(x) is non-decreasing in x, we have f_X(x) ≥ 0 for all x.

Properties of Continuous Random Variables
- From the Fundamental Theorem of Calculus, we have F_X(x) = ∫_{-∞}^{x} f_X(t) dt.
- In particular, ∫_{-∞}^{∞} f_X(x) dx = 1.
- More generally, P(a ≤ X ≤ b) = F_X(b) - F_X(a) = ∫_{a}^{b} f_X(x) dx.

Expectation of a Continuous Random Variable
- The expectation (average) of a continuous random variable X is given by E(X) = ∫_{-∞}^{∞} x f_X(x) dx.
- Note that this is just the continuous equivalent of the discrete expectation E(X) = Σ_x x P(X = x).

Important (Discrete) Random Variable: Bernoulli
- The simplest possible measurement on an experiment: success (X = 1) or failure (X = 0).
- Usual notation: P(X = 1) = p, P(X = 0) = 1 - p.
- E(X) = 1·p + 0·(1 - p) = p.

Important (Discrete) Random Variable: Binomial
- Let X = the number of successes in n independent Bernoulli experiments (or trials). Then:
  - P(X = 0) = (1 - p)^n
  - P(X = 1) = n p (1 - p)^(n-1)
  - P(X = 2) = [n(n - 1)/2] p^2 (1 - p)^(n-2)
  - In general, P(X = x) = C(n, x) p^x (1 - p)^(n-x), where C(n, x) = n! / (x!(n - x)!).
- Binomial variables are useful for proportions (of successes/failures) over a small number of repeated experiments. For a larger number of trials n, under certain conditions (p is small), the Poisson distribution is used instead.

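A sketch (illustrative) computing the binomial PMF directly from the formula above:

```python
from math import comb

def binom_pmf(x, n, p):
    """P(X = x) = C(n, x) * p^x * (1 - p)^(n - x)."""
    return comb(n, x) * p**x * (1 - p)**(n - x)

n, p = 10, 0.3                       # hypothetical: 10 trials, success probability 0.3
pmf = [binom_pmf(x, n, p) for x in range(n + 1)]
print("P(X = 0..3):", [round(v, 4) for v in pmf[:4]])
print("sum of PMF :", round(sum(pmf), 10))                       # 1.0
print("mean       :", sum(x * v for x, v in enumerate(pmf)))     # ~3.0 = n*p
```
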
Binomial Can Be Skewed or Normal
Depends upon p and n!

Important Random Variable: Poisson
- A Poisson random variable X is defined by its PMF: P(X = k) = e^(-λ) λ^k / k!, for k = 0, 1, 2, ..., where λ > 0 is a constant.
- Exercise: show that Σ_k P(X = k) = 1 and E(X) = λ.
- Poisson random variables are good for counting frequency of occurrence: like the number of customers that arrive at a bank in one hour, or the number of packets that arrive at a router in one second.

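A numeric sketch of the exercise (added for illustration), building the PMF iteratively and truncating the infinite sum at a large k:

```python
from math import exp

lam = 4.0                                    # hypothetical rate, e.g. packets per second

# Build P(X = k) = e^{-lam} * lam^k / k! iteratively to avoid huge factorials.
pmf = [exp(-lam)]                            # P(X = 0)
for k in range(1, 100):
    pmf.append(pmf[-1] * lam / k)            # P(X = k) = P(X = k-1) * lam / k

print("sum of PMF ~", round(sum(pmf), 12))                                # ~1.0
print("E(X)       ~", round(sum(k * p for k, p in enumerate(pmf)), 12))   # ~lam = 4.0
```
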
Important Continuous Random Variable: Exponential
- Used to represent time, e.g. until the next arrival.
- Has PDF f_X(x) = λ e^(-λx) for x ≥ 0 (and 0 otherwise), for some λ > 0.
- Properties: ∫_0^∞ f_X(x) dx = 1, E(X) = 1/λ, Var(X) = 1/λ².
- Need to use integration by parts!

Memoryless Property of the Exponential
- An exponential random variable X has the property that "the future is independent of the past", i.e. the fact that it hasn't happened yet tells us nothing about how much longer it will take.
- In math terms: P(X > s + t | X > s) = P(X > t) for all s, t ≥ 0.

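A simulation sketch (illustrative, not from the slides) checking both E(X) = 1/λ and the memoryless property:

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 2.0
x = rng.exponential(scale=1 / lam, size=1_000_000)    # exponential samples, mean 1/lam

print("sample mean ~", x.mean())                       # ~0.5 = 1/lam

s, t = 0.7, 0.4
p_tail = (x > t).mean()                                # P(X > t)
survivors = x[x > s]
p_cond = (survivors > s + t).mean()                    # P(X > s+t | X > s)
print("P(X > t)           ~", p_tail)
print("P(X > s+t | X > s) ~", p_cond)                  # approximately equal: memoryless
```
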
Important Random Variables: Normal

Normal Distribution: PDF & CDF
- PDF: f(x) = (1 / (σ √(2π))) exp(-(x - μ)² / (2σ²)).
- With the transformation z = (x - μ)/σ (a.k.a. the unit normal deviate):
- z-normal PDF: f(z) = (1 / √(2π)) exp(-z²/2).

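A sketch (illustrative) using Python's standard library to evaluate the PDF/CDF and the z-transformation:

```python
from statistics import NormalDist

mu, sigma = 10.0, 2.0
X = NormalDist(mu, sigma)
Z = NormalDist(0, 1)                       # unit normal

x = 13.0
z = (x - mu) / sigma                       # unit normal deviate, here 1.5
print("pdf at x  :", X.pdf(x))
print("cdf at x  :", X.cdf(x))
print("cdf via z :", Z.cdf(z))             # same value: standardization works
```
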
Why Is the Gaussian Important?
- The uniform distribution looks nothing like the bell-shaped (Gaussian) curve, and has a large spread (σ)!
- Yet the sample mean of the uniform distribution (a.k.a. its sampling distribution), after very few samples, looks remarkably Gaussian, with decreasing σ!
- CENTRAL LIMIT TENDENCY! (simulated in the sketch below)

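A simulation sketch (added illustration) of this central limit tendency: averaging even a handful of uniform samples produces a near-Gaussian, much narrower distribution.

```python
import numpy as np

rng = np.random.default_rng(2)
n_samples, n_means = 12, 100_000

u = rng.uniform(0, 1, size=n_means)                                 # raw uniform draws
means = rng.uniform(0, 1, size=(n_means, n_samples)).mean(axis=1)   # sample means

print("uniform:      std =", round(u.std(), 4))       # ~0.289 = 1/sqrt(12)
print("sample means: std =", round(means.std(), 4))   # ~0.289 / sqrt(12) ~ 0.083
print("fraction of means within 2 std of 0.5:",
      round(np.mean(abs(means - 0.5) < 2 * means.std()), 4))  # ~0.95, Gaussian-like
```
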
Other Interesting Facts About the Gaussian
- Uncorrelated r.v.s + (jointly) Gaussian => INDEPENDENT!
- Important in random processes (i.e. sequences of random variables).
- Random variables that are independent and have exactly the same distribution are called IID (independent & identically distributed).
- IID and normal with zero mean and variance σ² => IIDN(0, σ²).

Height & Spread of Gaussian Can Vary!

Rapidly Dropping Tail Probability!
- The sample mean is a Gaussian r.v., with mean μ and standard deviation s = σ/√n.
- With a larger number of samples, the average of the sample means is an excellent estimate of the true mean.
- If the original σ is known, invalid mean estimates can be rejected with HIGH confidence!

Confidence Interval
- Probability that a measurement will fall within a closed interval [a, b] (mathworld definition) = (1 - α).
- Jain: the interval [a, b] = the "confidence interval";
- the probability level, 100(1 - α)%, = the "confidence level";
- α = the "significance level".
- The sampling distribution for means leads to high confidence levels, i.e. small confidence intervals (see the sketch below).

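A sketch (illustrative, with made-up data) computing a 95% confidence interval for a mean, using the normal quantile since the sample is large:

```python
import numpy as np
from statistics import NormalDist

rng = np.random.default_rng(3)
data = rng.normal(loc=5.0, scale=2.0, size=200)    # hypothetical measurements

alpha = 0.05
mean = data.mean()
std_err = data.std(ddof=1) / np.sqrt(len(data))    # estimated sigma / sqrt(n)
z = NormalDist().inv_cdf(1 - alpha / 2)            # ~1.96 for 95% confidence

lo, hi = mean - z * std_err, mean + z * std_err
print(f"{100 * (1 - alpha):.0f}% confidence interval for the mean: [{lo:.3f}, {hi:.3f}]")
```
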
Meaning of Confidence Interval

Statistical Inference: Is A = B?
- Note: the sample mean ȳ_A is not the true mean μ_A, but its estimate!
- Is this difference statistically significant?
- Is the null hypothesis that the true means are equal (μ_A = μ_B) false?

Step 1: Plot the Samples

Compare to an (External) Reference Distribution (if available)
Since the observed difference of 1.30 falls out in the tail of the reference distribution, the difference between the means IS statistically significant!

Random Sampling Assumption!
Under the random sampling assumption, and the null hypothesis that the true means are equal, we can view the 20 samples as drawn from a common population and construct a reference distribution from the samples themselves (sketched below)!

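A sketch of this idea (illustrative; the data values are hypothetical, though the slide mentions 20 samples): repeatedly reshuffle the pooled samples into two groups of 10 and record the difference of means, building a reference distribution under the null hypothesis.

```python
import numpy as np

rng = np.random.default_rng(4)
y_a = rng.normal(84.0, 3.0, size=10)     # hypothetical method-A measurements
y_b = rng.normal(85.5, 3.0, size=10)     # hypothetical method-B measurements

observed = y_b.mean() - y_a.mean()
pooled = np.concatenate([y_a, y_b])

# Reference distribution: difference of means under random re-labelling.
diffs = []
for _ in range(10_000):
    shuffled = rng.permutation(pooled)
    diffs.append(shuffled[10:].mean() - shuffled[:10].mean())
diffs = np.array(diffs)

p_value = np.mean(np.abs(diffs) >= abs(observed))   # two-sided tail fraction
print(f"observed difference = {observed:.2f}, randomization p-value = {p_value:.3f}")
```
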
t-distribution: Create a Reference Distribution from the Samples Themselves!

t-distribution

Statistical Significance with Various Inference Techniques
- Normal population assumption not required.
- Random sampling assumption required.
- Std. dev. estimated from the samples themselves!
- The t-distribution is an approximation for the Gaussian!

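For completeness, a sketch (illustrative, on the same kind of hypothetical data as above) of the t-based comparison of two sample means, with the standard deviation estimated from the samples themselves:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
y_a = rng.normal(84.0, 3.0, size=10)   # hypothetical method-A measurements
y_b = rng.normal(85.5, 3.0, size=10)   # hypothetical method-B measurements

# Two-sample t-test: std. dev. estimated from the samples themselves.
t_stat, p_value = stats.ttest_ind(y_a, y_b, equal_var=True)
print(f"t = {t_stat:.2f}, p-value = {p_value:.3f}")
# A small p-value (e.g. < 0.05) => the difference in means is statistically significant.
```
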
Normal, χ², & t-distributions: Useful for Statistical Inference

Relationship between Confidence Intervals and Comparisons of Means
