© 1998, Geoff Kuenning Introduction to Statistics Concentration on applied statistics Statistics appropriate for measurement Today’s lecture will cover basic concepts –You should already be familiar with these

© 1998, Geoff Kuenning Independent Events Occurrence of one event doesn’t affect probability of other Examples: –Coin flips –Inputs from separate users –“Unrelated” traffic accidents What about second basketball free throw after the player misses the first?

© 1998, Geoff Kuenning Random Variable Variable that takes values with a specified probability Variable usually denoted by capital letters, particular values by lowercase Examples: –Number shown on dice –Network delay What about disk seek time?

© 1998, Geoff Kuenning Cumulative Distribution Function (CDF) Maps a value a to the probability that the outcome is less than or equal to a: F_x(a) = P(x ≤ a) Valid for discrete and continuous variables Monotonically increasing Easy to specify, calculate, measure

© 1998, Geoff Kuenning CDF Examples Coin flip (T = 1, H = 2): F(x) = 0 for x < 1, 1/2 for 1 ≤ x < 2, 1 for x ≥ 2 Exponential packet interarrival times: F(x) = 1 - e^(-λx) for x ≥ 0

© 1998, Geoff Kuenning Probability Density Function (pdf) Derivative of (continuous) CDF: f(x) = dF(x)/dx Usable to find the probability of a range: P(x_1 < x ≤ x_2) = F(x_2) - F(x_1)
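For concreteness, here is a minimal Python sketch of using the CDF to get the probability of a range of exponential interarrival times; the rate λ = 2 and the interval (0.5, 1.5] are illustrative values, not taken from the slides.

```python
import math

lam = 2.0  # illustrative rate parameter (arrivals per unit time), not from the slides

def exp_cdf(x, lam):
    """CDF of the exponential distribution: F(x) = 1 - e^(-lambda * x) for x >= 0."""
    return 1.0 - math.exp(-lam * x) if x >= 0 else 0.0

# Probability of a range: P(x1 < x <= x2) = F(x2) - F(x1)
x1, x2 = 0.5, 1.5
print(exp_cdf(x2, lam) - exp_cdf(x1, lam))  # probability the interarrival time falls in (0.5, 1.5]
```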

© 1998, Geoff Kuenning Examples of pdf Exponential interarrival times: f(x) = λe^(-λx), x ≥ 0 Gaussian (normal) distribution: f(x) = (1/(σ√(2π))) e^(-(x-μ)²/(2σ²))

© 1998, Geoff Kuenning Probability Mass Function (pmf) CDF not differentiable for discrete random variables pmf serves as a replacement: f(x_i) = p_i, where p_i is the probability that x will take on the value x_i

© 1998, Geoff Kuenning Examples of pmf Coin flip: f(1) = f(2) = 1/2 Typical CS grad class size: (figure: histogram of class sizes)

© 1998, Geoff Kuenning Expected Value (Mean) Mean: E(x) = μ Summation if discrete: E(x) = Σ_i p_i x_i Integration if continuous: E(x) = ∫ x f(x) dx

© 1998, Geoff Kuenning Variance & Standard Deviation Var(x) = E[(x - μ)²] Usually denoted σ² Square root σ is called the standard deviation
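As a small worked example (a fair six-sided die, which is not on the slide but is standard), the mean and variance can be computed directly from the pmf:

```python
# pmf of a fair die: each face 1..6 with probability 1/6
values = [1, 2, 3, 4, 5, 6]
probs = [1 / 6] * 6

mean = sum(p * x for p, x in zip(probs, values))               # E(x) = sum of p_i * x_i
var = sum(p * (x - mean) ** 2 for p, x in zip(probs, values))  # Var(x) = E[(x - E(x))^2]
std = var ** 0.5                                               # standard deviation sigma

print(mean, var, std)  # 3.5, ~2.917, ~1.708
```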

© 1998, Geoff Kuenning Coefficient of Variation (C.O.V. or C.V.) Ratio of standard deviation to mean: C.O.V. = σ/μ Indicates how well the mean represents the variable

© 1998, Geoff Kuenning Covariance Given x, y with means μ_x and μ_y, their covariance is: Cov(x,y) = σ_xy = E[(x - μ_x)(y - μ_y)] = E(xy) - E(x)E(y) –typos on p.181 of book High covariance implies y departs from its mean whenever x does

© 1998, Geoff Kuenning Covariance (cont’d) For independent variables, E(xy) = E(x)E(y) so Cov(x,y) = 0 Reverse isn’t true: Cov(x,y) = 0 doesn’t imply independence If y = x, covariance reduces to variance

© 1998, Geoff Kuenning Correlation Coefficient Normalized covariance: ρ_xy = Cov(x,y) / (σ_x σ_y) Always lies between -1 and 1 Correlation of 1 ⇒ x ~ y; correlation of -1 ⇒ x ~ -y
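A short numpy sketch, using made-up paired observations, of computing the covariance and the normalized correlation coefficient:

```python
import numpy as np

# Made-up paired observations; any two equal-length samples would do.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.1, 3.9, 6.2, 8.1, 9.8])

cov_xy = np.cov(x, y, ddof=1)[0, 1]                          # sample covariance Cov(x, y)
corr_xy = cov_xy / (np.std(x, ddof=1) * np.std(y, ddof=1))   # normalized: always in [-1, 1]

print(cov_xy, corr_xy)
print(np.corrcoef(x, y)[0, 1])                               # same correlation via numpy's built-in
```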

© 1998, Geoff Kuenning Mean and Variance of Sums For any random variables, E(a_1x_1 + … + a_kx_k) = a_1E(x_1) + … + a_kE(x_k) For independent variables, Var(a_1x_1 + … + a_kx_k) = a_1²Var(x_1) + … + a_k²Var(x_k)

© 1998, Geoff Kuenning Quantile The x value at which the CDF takes a value α is called the α-quantile or 100α-percentile, denoted by x_α: P(x ≤ x_α) = α If the 90th-percentile score on the GRE was 1500, then 90% of the population got 1500 or less

© 1998, Geoff Kuenning Quantile Example (figure: CDF with the α-quantile and the 0.5-quantile marked)

© 1998, Geoff Kuenning Median 50-percentile (0.5-quantile) of a random variable Alternative to mean By definition, 50% of population is sub-median, 50% super-median –Lots of bad (good) drivers –Lots of smart (stupid) people

© 1998, Geoff Kuenning Mode Most likely value, i.e., x i with highest probability p i, or x at which pdf/pmf is maximum Not necessarily defined (e.g., tie) Some distributions are bi-modal (e.g., human height has one mode for males and one for females)

© 1998, Geoff Kuenning Examples of Mode (figures: pmf of dice throws; adult human weight distribution with mode and sub-mode marked)

© 1998, Geoff Kuenning Normal (Gaussian) Distribution Most common distribution in data analysis pdf is: f(x) = (1/(σ√(2π))) e^(-(x-μ)²/(2σ²)), -∞ ≤ x ≤ +∞ Mean is μ, standard deviation σ

© 1998, Geoff Kuenning Notation for Gaussian Distributions Often denoted N(μ, σ) Unit normal is N(0,1) If x has N(μ, σ), then (x - μ)/σ has N(0,1) The α-quantile of the unit normal z ~ N(0,1) is denoted z_α, so that P(z ≤ z_α) = α
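A brief scipy sketch of standardizing a normal variable and looking up a z quantile; the values of μ, σ, x, and α below are illustrative assumptions, not from the slides.

```python
from scipy.stats import norm

mu, sigma = 170.0, 10.0    # illustrative parameters, not from the slides
x = 185.0

z = (x - mu) / sigma       # if x ~ N(mu, sigma), then z ~ N(0, 1)
print(z, norm.cdf(z))      # how far x sits above the mean, and P(Z <= z)

alpha = 0.95
z_alpha = norm.ppf(alpha)  # the alpha-quantile z_alpha of the unit normal: P(Z <= z_alpha) = alpha
print(z_alpha)             # ~1.645
```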

© 1998, Geoff Kuenning Why Is Gaussian So Popular? If x_i ~ N(μ_i, σ_i) and all x_i are independent, then Σ_i a_i x_i is normal with mean Σ_i a_i μ_i and variance Σ_i a_i² σ_i² Sum of a large number of independent observations from any distribution is itself normal (Central Limit Theorem) ⇒ Experimental errors can be modeled as a normal distribution.

© 1998, Geoff Kuenning Central Limit Theorem Sum of 2 coin flips (H=1, T=0): Sum of 8 coin flips:
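A quick simulation sketch of the effect pictured on this slide: the distribution of the sum of coin flips looks more bell-shaped as the number of flips grows. The trial count and seed are arbitrary choices for illustration.

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(0)

def sum_of_flips(n_flips, n_trials=100_000):
    """Sum of n_flips fair coin flips (H=1, T=0), repeated n_trials times."""
    flips = rng.integers(0, 2, size=(n_trials, n_flips))
    return flips.sum(axis=1)

for n in (2, 8):
    sums = sum_of_flips(n)
    counts = Counter(sums.tolist())
    # Crude text histogram: the shape of the sum's distribution approaches a normal curve as n grows.
    for value in sorted(counts):
        print(f"n={n} sum={value:2d} {'#' * (counts[value] // 2000)}")
    print()
```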

© 1998, Geoff Kuenning Measured Data But, we don’t know F(x) – all we have is a bunch of observed values – a sample. What is a sample? –Example: How tall is a human? Could measure every person in the world (actually even that’s a sample) Or could measure every person in this room –Population has parameters –Sample has statistics Drawn from population Inherently erroneous

© 1998, Geoff Kuenning Central Tendency Sample mean – x̄ (arithmetic mean) –Take sum of all observations and divide by the number of observations Sample median –Sort the observations in increasing order and take the observation in the middle of the series Sample mode –Plot a histogram of the observations and choose the midpoint of the bucket where the histogram peaks
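A minimal sketch of the three sample indices of central tendency on a made-up sample; the most frequent value stands in for the histogram-bucket mode.

```python
import statistics

sample = [3.1, 4.7, 4.7, 5.0, 5.2, 6.8, 7.4, 9.9]  # made-up observations

mean = statistics.fmean(sample)     # sum of observations divided by the number of observations
median = statistics.median(sample)  # middle observation after sorting
mode = statistics.mode(sample)      # most frequent value (stand-in for the histogram-peak bucket)

print(mean, median, mode)
```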

© 1998, Geoff Kuenning Indices of Dispersion Measures of how much a data set varies –Range –Sample variance –And derived from sample variance: Square root – standard deviation, s Ratio of standard deviation to sample mean – C.O.V. = s / x̄ –Percentiles Specification of how observations fall into buckets

© 1998, Geoff Kuenning Interquartile Range Yet another measure of dispersion The difference between Q3 and Q1 Semi-interquartile range – SIQR = (Q3 - Q1) / 2 Often an interesting measure of what’s going on in the middle of the range
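A companion sketch of the indices of dispersion from the last two slides (range, sample variance, standard deviation, C.O.V., quartiles, IQR, SIQR), again on a made-up sample:

```python
import statistics

sample = [3.1, 4.7, 4.7, 5.0, 5.2, 6.8, 7.4, 9.9]  # made-up observations

data_range = max(sample) - min(sample)          # range
s2 = statistics.variance(sample)                # sample variance (divides by n - 1)
s = statistics.stdev(sample)                    # sample standard deviation
cov = s / statistics.fmean(sample)              # coefficient of variation, s / x-bar

q1, q2, q3 = statistics.quantiles(sample, n=4)  # quartiles; q2 is the median
iqr = q3 - q1                                   # interquartile range
siqr = iqr / 2                                  # semi-interquartile range

print(data_range, s2, s, cov, iqr, siqr)
```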

© 1998, Geoff Kuenning Which Index of Dispersion to Use? Is the variable bounded? If yes, use the range If not, is the distribution unimodal and symmetrical? If yes, use the C.O.V.; if not, use percentiles or SIQR But always remember what you’re looking for

© 1998, Geoff Kuenning If a data set has a common distribution, that’s the best way to summarize it –Saying a data set is uniformly distributed is more informative than just giving its sample mean and standard deviation So how do you determine if your data set fits a distribution? –Plot a histogram –Quantile-quantile plot –Statistical methods Determining a Distribution for a Data Set

© 1998, Geoff Kuenning Quantile-Quantile Plots Most suitable for small data sets Basically -- guess a distribution Plot where quantiles of data should fall in that distribution –Against where they actually fall in the sample If plot is close to linear, data closely matches that distribution

© 1998, Geoff Kuenning Obtaining Theoretical Quantiles We need to determine where the quantiles should fall for a particular distribution Requires inverting the CDF for that distribution: q_i = F(x_i) ⇒ x_i = F⁻¹(q_i) –Then determining quantiles for observed points –Then plugging in quantiles to inverted CDF

© 1998, Geoff Kuenning Inverting a Distribution Many common distributions have already been inverted (how convenient…) For others that are hard to invert, tables and approximations are often available (nearly as convenient)

© 1998, Geoff Kuenning Is Our Example Data Set Normally Distributed? Our example data set was -17, -10, -4.8, 2, 5.4, 27, 84.3, 92, 445, 2056 Does this match the normal distribution? The normal distribution doesn’t invert nicely –But there is an approximation for N(0,1): x_i ≈ 4.91[q_i^0.14 - (1 - q_i)^0.14] –Or invert numerically
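A sketch of the whole quantile-quantile recipe for this example data set. Two details are assumed rather than stated on the slide: the common convention q_i = (i - 0.5)/n for the quantile of the i-th sorted observation, and the approximation above for inverting the unit-normal CDF.

```python
# Normal quantile-quantile check for the slide's example data.
data = sorted([-17, -10, -4.8, 2, 5.4, 27, 84.3, 92, 445, 2056])
n = len(data)

def approx_normal_quantile(q):
    """Approximate inverse CDF of N(0, 1): 4.91 * [q^0.14 - (1 - q)^0.14]."""
    return 4.91 * (q ** 0.14 - (1 - q) ** 0.14)

for i, y in enumerate(data, start=1):
    q = (i - 0.5) / n                  # assumed quantile convention for the i-th sorted point
    x = approx_normal_quantile(q)      # where the quantile *should* fall if the data were N(0, 1)
    print(f"{i:2d}  q={q:.2f}  theoretical={x:+.2f}  observed={y:+8.1f}")

# Plotting observed vs. theoretical quantiles: a roughly straight line would suggest normality;
# here the huge upper-tail values (445, 2056) bend the plot, so the data is clearly not normal.
```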

© 1998, Geoff Kuenning Data For Example Normal Quantile-Quantile Plot (table of i, q_i, observed y_i, and theoretical x_i not reproduced)

© 1998, Geoff Kuenning Example Normal Quantile-Quantile Plot Definitely not normal –Because it isn’t linear –Tail at high end is too long for normal

© 1998, Geoff Kuenning Estimating Population from Samples How tall is a human? –Measure everybody in this room –Calculate the sample mean x̄ –Assume the population mean μ equals x̄ What is the error in our estimate?

© 1998, Geoff Kuenning Estimating Error Sample mean is a random variable ⇒ Mean has some distribution ⇒ Multiple sample means have a “mean of means” Knowing the distribution of means, we can estimate the error

Confidence Intervals Sample mean value is only an estimate of the true population mean Bounds c_1 and c_2 such that there is a high probability, 1 - α, that the population mean is in the interval (c_1, c_2): Prob{c_1 < μ < c_2} = 1 - α, where α is the significance level and 100(1 - α) is the confidence level Overlapping confidence intervals are interpreted as “not statistically different”

© 1998, Geoff Kuenning Confidence Intervals How tall is Fred? –Suppose average human height is 170 cm ⇒ Fred is 170 cm tall Yeah, right –Suppose 90% of humans are between 155 and 190 cm ⇒ Fred is between 155 and 190 cm We are 90% confident that Fred is between 155 and 190 cm

© 1998, Geoff Kuenning Confidence Interval of Sample Mean Knowing where 90% of sample means fall, we can state a 90% confidence interval Key is the Central Limit Theorem: –Sample means are normally distributed –Only if independent –Mean of sample means is the population mean μ –Standard deviation (standard error) is σ/√n

© 1998, Geoff Kuenning Estimating Confidence Intervals Two formulas for confidence intervals –Over 30 samples from any distribution: z-distribution –Small sample from normally distributed population: t-distribution Common error: using t-distribution for non-normal population –Central Limit Theorem often saves us

© 1998, Geoff Kuenning The z Distribution Interval on either side of the mean: x̄ ∓ z_(1-α/2) · s/√n Significance level α is small for large confidence levels Tables of z are tricky: be careful!
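A small sketch of the z-interval computation; the sample statistics below are made up for illustration, and scipy is used only to obtain the z quantile.

```python
from math import sqrt
from scipy.stats import norm

# Made-up sample statistics for illustration (not the numbers from the slide's example).
x_bar, s, n = 110.5, 12.3, 35
alpha = 0.10                                   # 90% confidence level

z = norm.ppf(1 - alpha / 2)                    # z_(1 - alpha/2), about 1.645 for 90%
half_width = z * s / sqrt(n)                   # standard error s/sqrt(n) scaled by z
print(x_bar - half_width, x_bar + half_width)  # the 90% confidence interval for the mean
```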

© 1998, Geoff Kuenning Example of z Distribution 35 samples: Sample mean = Standard deviation s = n = 35 90% confidence interval is

© 1998, Geoff Kuenning Graph of z Distribution Example

© 1998, Geoff Kuenning The t Distribution Formula is almost the same: x̄ ∓ t_([1-α/2; n-1]) · s/√n Usable only for normally distributed populations! But works with small samples

© 1998, Geoff Kuenning Example of t Distribution 10 height samples: Sample mean = Standard deviation s = 25.1, n = 10 90% confidence interval is 99% interval is (144.7, 196.3)
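A matching sketch for the t-interval. It reuses the slide's s = 25.1 and n = 10, but the sample mean of 170 cm is an assumed stand-in, since the slide's mean is not reproduced here.

```python
from math import sqrt
from scipy.stats import t

x_bar = 170.0    # assumed sample mean for illustration; not given on the slide
s, n = 25.1, 10  # sample standard deviation and sample size from the slide

for conf in (0.90, 0.99):
    alpha = 1 - conf
    t_crit = t.ppf(1 - alpha / 2, df=n - 1)  # t_(1 - alpha/2) with n - 1 degrees of freedom
    half_width = t_crit * s / sqrt(n)
    print(f"{conf:.0%} interval: ({x_bar - half_width:.1f}, {x_bar + half_width:.1f})")
```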

© 1998, Geoff Kuenning Graph of t Distribution Example

© 1998, Geoff Kuenning Getting More Confidence Asking for a higher confidence level widens the confidence interval –Counterintuitive? How tall is Fred? –90% sure he’s between 155 and 190 cm –We want to be 99% sure we’re right –So we need more room: 99% sure he’s between 145 and 200 cm

For Discussion Next Tuesday Project Proposal 1. Statement of hypothesis 2. Workload decisions 3. Metrics to be used 4. Method Reading for Next Time Elson, Girod, Estrin, “Fine-grained Network Time Synchronization using Reference Broadcasts,” OSDI 2002 – see readings.html

For Discussion Today Bring in either one notoriously bad or one exceptionally good example of data presentation from your proceedings. The bad ones are more fun. Or if you find something just really different, please show it.