Physics 114, Lecture 7: Uncertainties in Measurement. Dale E. Gary, NJIT Physics Department.



February 08, 2010

Some Terms

- Accuracy: how close the measurements are to the "true" value (note that we may not always know the true value).
- Precision: how close repeated measurements are to each other; a measure of the spread of the data points. One can make measurements that are highly accurate (their mean is close to the true value) even though they may not be very precise (large spread of measurements). Conversely, one can make very precise measurements that are not accurate.
- Errors: deviations of measurements from the "true" value. Error here does not mean a blunder! Also referred to as uncertainties.
  - Systematic errors: deviations from the "true" value that are very reproducible, generally due to some uncorrected effect of an instrument or measurement technique. An example is reading a scale slightly off the vertical, which may systematically give a too-high or too-low reading.
  - Statistical, or random, errors: fluctuations that scatter measurements both too high and too low, set by how precisely the measurement can be made, and which are amenable to reduction by doing repeated measurements.
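The accuracy/precision distinction can be illustrated with a short simulation. This is a sketch in Python rather than the MATLAB used elsewhere in the course, and the true value, offset, and spread below are invented for illustration:

```python
import random
import statistics

random.seed(0)
true_value = 10.0

# Precise but inaccurate: small random spread, but a systematic offset
# (like reading a scale slightly off the vertical).
biased = [true_value + 0.5 + random.gauss(0, 0.05) for _ in range(1000)]

# Accurate but imprecise: no systematic offset, large random spread.
noisy = [true_value + random.gauss(0, 1.0) for _ in range(1000)]

print("biased: mean error =", statistics.mean(biased) - true_value,
      "spread =", statistics.stdev(biased))
print("noisy:  mean error =", statistics.mean(noisy) - true_value,
      "spread =", statistics.stdev(noisy))
```

The first set is precise (small spread) yet inaccurate (its mean sits well away from the true value); the second is the reverse. Averaging more measurements reduces the random spread but does nothing to the systematic offset.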

Parent and Sample Distributions

- Imagine a process for manufacturing ball bearings. Although each ball bearing is nominally the same, any process will cause slight deviations in shape, size, or other measure. If we measure the weight, say, of an infinite number of such ball bearings, these weight measurements will spread into a distribution around some mean value. This hypothetical, infinite distribution is called the parent distribution. The parent distribution's spread depends, obviously, on how precise the manufacturing process is.
- We can never measure an infinite number of ball bearings. Instead, we measure a smaller subset, and from this sample we again find that our measurements spread into a distribution around the sample mean. This finite distribution is called the sample distribution.
- In the limit of an infinite sample, of course, the sample distribution should become the parent distribution (assuming we have no systematic errors).

Example Sample & Parent Distributions

- At the left are the sample distributions for a series of 16 sets of 50 measurements each, generated in MATLAB:

    x = -4:0.5:4;
    for i = 1:16
        subplot(4,4,i);
        hist(randn(1,50), x);
        axis([-4,4,0,20]);
    end

- At the right is the sum of these 16 sets (equivalent to 800 measurements). Apparently 800 is close to infinity, since the sample distribution now is quite close to the parent distribution (red line).
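The same experiment can be sketched without MATLAB. Here is a rough Python equivalent using only the standard library; the bin edges and sample sizes match the slide, but the histograms are tallied by hand rather than plotted:

```python
import random

random.seed(1)

# Bin edges from -4 to 4 in steps of 0.5, as in the MATLAB x = -4:0.5:4
edges = [-4 + 0.5 * k for k in range(17)]

def tally(sample, edges):
    """Count how many values fall in each bin [edges[j], edges[j+1])."""
    counts = [0] * (len(edges) - 1)
    for v in sample:
        for j in range(len(counts)):
            if edges[j] <= v < edges[j + 1]:
                counts[j] += 1
                break
    return counts

# 16 sample distributions of 50 standard-normal measurements each
samples = [[random.gauss(0, 1) for _ in range(50)] for _ in range(16)]
histograms = [tally(s, edges) for s in samples]

# Summing all 16 sets gives one 800-measurement sample distribution,
# which lies much closer to the parent (Gaussian) distribution.
combined = tally([v for s in samples for v in s], edges)
print(combined)
```

Each 50-measurement histogram is ragged, while the combined 800-measurement histogram already traces out the bell shape of the parent distribution quite well.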

What To Do When the "True" Value is Unknown: The Mean

- When we do not know the "true" value that we are comparing our sample to, we can take the mean of the N measurements as an approximation of the "true" value:

    \bar{x} = \frac{1}{N} \sum_{i=1}^{N} x_i

- Of course, the mean of the parent population is the limit of this as the number of measurements becomes infinite:

    \mu = \lim_{N \to \infty} \frac{1}{N} \sum_{i=1}^{N} x_i
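As a concrete check of the sample mean, a minimal sketch (the five measurement values are invented for illustration):

```python
import statistics

# Five repeated measurements of the same quantity (hypothetical values)
measurements = [10.2, 9.8, 10.1, 9.9, 10.0]

# The sample mean serves as our estimate of the unknown "true" value
mean = statistics.mean(measurements)
print(mean)  # close to 10.0
```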

Probability and Median

- The spread of values about the mean in the parent population (that is, the histogram) forms a function called a probability density function (PDF). We will be using this term many, many times during the course.
- Its connection to probability is as follows: if you normalize the PDF so that the area under the curve (the integral) is unity, then the integral over a restricted range x_1 to x_2 is the probability that a given measurement will fall in that range:

    P(x_1 < x < x_2) = \int_{x_1}^{x_2} p(x)\,dx

- Notice that we use P(x) for the probability, and p(x) for the probability density (PDF).
- The median \mu_{1/2} is the point where the probability is equal (i.e. 1/2) on each side:

    \int_{-\infty}^{\mu_{1/2}} p(x)\,dx = \int_{\mu_{1/2}}^{\infty} p(x)\,dx = \frac{1}{2}
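For the Gaussian parent distribution these integrals can be evaluated in closed form through the error function. A stdlib-only sketch, assuming a standard normal PDF (mean 0, standard deviation 1):

```python
import math

def normal_cdf(x, mu=0.0, sigma=1.0):
    """Integral of the normal PDF from -infinity to x."""
    return 0.5 * (1.0 + math.erf((x - mu) / (sigma * math.sqrt(2.0))))

def prob_between(x1, x2, mu=0.0, sigma=1.0):
    """P(x1 < x < x2): area under the normalized PDF over [x1, x2]."""
    return normal_cdf(x2, mu, sigma) - normal_cdf(x1, mu, sigma)

# About 68.3% of measurements fall within one standard deviation of the mean
print(prob_between(-1, 1))

# The median is where the CDF reaches 1/2; for a symmetric PDF it is the mean
print(normal_cdf(0.0))  # → 0.5
```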

Most Probable Value (Mode)

- Most probability density functions (PDFs) have a single peak. The value of x at which a PDF peaks is the most probable value, or mode. This is the same as the mean for symmetric PDFs, but the two can be quite different for asymmetric ones.
- The most probable value is called \mu_{max}, and obeys

    p(\mu_{max}) \ge p(x) \quad \text{for all } x \ne \mu_{max}

- Examples of when to use median vs. mean: for a set of measurements (sample distribution) that follows the Gaussian (normal) distribution, the mean and median are basically the same, so long as the sample is large. However, in the presence of outliers the median is often preferred over the mean as an estimate of the true value. Say we have a set of measurements x = randn(1,100); you can check that the mean and median are nearly identical. Now say there was something wrong with the 42nd measurement (x(42) = 300;). The median is nearly unchanged, but the mean is much higher.
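The outlier example on this slide translates directly into Python; a sketch, with the seed chosen arbitrarily and the corrupted index mirroring the MATLAB x(42) = 300:

```python
import random
import statistics

random.seed(2)

# 100 standard-normal measurements, like x = randn(1,100)
x = [random.gauss(0, 1) for _ in range(100)]
mean_before = statistics.mean(x)
median_before = statistics.median(x)

# Corrupt the 42nd measurement (index 41 with 0-based indexing)
x[41] = 300.0
mean_after = statistics.mean(x)
median_after = statistics.median(x)

print(mean_before, mean_after)      # the mean jumps by roughly 3
print(median_before, median_after)  # the median barely moves
```

A single bad point shifts the mean by about (300 - x[41]) / 100, i.e. by roughly 3 standard deviations, while the median moves at most to a neighboring order statistic.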

Deviations and RMS

- If the parent distribution mean is \mu, the deviations from the mean can be written

    d_i = x_i - \mu

- The average of the deviations, by virtue of the definition of the mean, must vanish:

    \lim_{N \to \infty} \frac{1}{N} \sum_{i=1}^{N} (x_i - \mu) = \lim_{N \to \infty} \left( \frac{1}{N} \sum_{i=1}^{N} x_i \right) - \mu = 0

- Still, we may want to know the average absolute deviation, i.e. not considering the sign of the deviation, just its magnitude:

    \alpha = \lim_{N \to \infty} \frac{1}{N} \sum_{i=1}^{N} |x_i - \mu|

- For computational purposes, it is better to work with the square of the deviations, whose mean is called the variance:

    \sigma^2 = \lim_{N \to \infty} \frac{1}{N} \sum_{i=1}^{N} (x_i - \mu)^2

- The standard deviation (also called the RMS, or root-mean-square, deviation) is the square root of the variance, \sigma.
- To calculate the variance of the sample distribution, use the sample mean \bar{x} in place of the unknown \mu and divide by N - 1:

    s^2 = \frac{1}{N - 1} \sum_{i=1}^{N} (x_i - \bar{x})^2
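These definitions can be checked numerically on a small data set. A sketch: the five values are arbitrary, and the sample formulas (with \bar{x} and N - 1) are used since \mu is unknown:

```python
import statistics

data = [3, 5, 7, 10, 10]
xbar = statistics.mean(data)  # sample mean = 7

# Deviations from the sample mean sum to zero by construction
deviations = [x - xbar for x in data]
print(sum(deviations))  # → 0

# Mean absolute deviation ignores the sign of each deviation
mad = sum(abs(d) for d in deviations) / len(data)

# Sample variance divides by N - 1; the standard deviation (RMS)
# is the square root of the variance
variance = sum(d ** 2 for d in deviations) / (len(data) - 1)
sigma = variance ** 0.5

print(variance, statistics.variance(data))  # both give 9.5
```

The hand-rolled sum matches the library's `statistics.variance`, which also uses the N - 1 (sample) convention; `statistics.pvariance` is the divide-by-N parent form.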