
Lecture 3 Calibration and Standards

Least-squares curve fitting (Carl Friedrich Gauss, 1795)

The points (1,2) and (6,5) do not fall exactly on the solid line, but they are too close to the line to show their deviations. The Gaussian curve drawn over the point (3,3) is a schematic indication of the fact that each value of y is normally distributed about the straight line. That is, the most probable value of y falls on the line, but there is a finite probability of measuring y some distance from the line.

Least squares:

y = kx + b (straight-line equation)

k = slope = Δy/Δx, and b is the blank. Let us subtract the blank:

Y = y − b = kx

One standard: Y₁ = kx₁, Y₂ = kx₂
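A minimal sketch of this one-standard calculation in Python (the function name and numbers are hypothetical; Y₁ is taken as the blank-corrected signal of the standard and Y₂ as that of the unknown):

def one_standard_concentration(y_blank, y_standard, y_unknown, x_standard):
    """Unknown concentration x2 from Y1 = k*x1 and Y2 = k*x2."""
    Y1 = y_standard - y_blank   # blank-corrected signal of the standard
    Y2 = y_unknown - y_blank    # blank-corrected signal of the unknown
    k = Y1 / x_standard         # slope of Y = k*x
    return Y2 / k               # x2 = Y2 / k = x1 * Y2 / Y1

# Example: blank 0.05, standard (10 units) reads 0.55, unknown reads 0.30 -> 5.0 units
print(one_standard_concentration(0.05, 0.55, 0.30, 10.0))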

Procedure:
1. Measure blank.
2. Measure standard.
3. Measure unknown.
4. Subtract blank from standard and from unknown.
5. Calculate concentration of unknown.

If you have several (N) standards, do it several (N) times.
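With several standards, one common choice, in the spirit of the least-squares slides above, is to fit the calibration line by least squares and invert it for the unknown. A minimal sketch with hypothetical blank-corrected data:

import numpy as np

# Hypothetical calibration data: concentrations of the standards (x)
# and their blank-corrected signals (y).
x_std = np.array([1.0, 2.0, 4.0, 6.0, 8.0])
y_std = np.array([0.11, 0.20, 0.42, 0.59, 0.81])

# Least-squares straight line y = k*x + b (degree-1 polynomial fit).
k, b = np.polyfit(x_std, y_std, 1)

# Concentration of an unknown from its blank-corrected signal.
y_unknown = 0.50
x_unknown = (y_unknown - b) / k
print(f"slope k = {k:.4f}, intercept b = {b:.4f}, unknown = {x_unknown:.2f}")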

Standard addition: why and when? The matrix (interfering components) can affect the slope, so in the equation Y = kx you do not know k any more!

Use your sample as a new “blank”: add a known amount of standard to your sample.

Sample x alone: intensity I_x
Sample x + standard: intensity I_x+standard

The increase in intensity caused by this addition is I_x+standard − I_x.

Procedure:
1. Measure the unknown.
2. Add a known amount to the unknown and measure this sample.
3. Subtract (1) from (2): this gives the increase I_x+standard − I_x.
4. Calculate the concentration of the unknown.

If you have several (N) standard additions, do it several (N) times.
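A minimal sketch of the single-addition calculation, assuming the addition does not change the sample volume appreciably, so that I_x = k·c_x and I_x+standard = k·(c_x + c_added), which gives c_x = c_added·I_x / (I_x+standard − I_x); the function name and numbers are hypothetical:

def standard_addition_concentration(i_x, i_x_plus_std, c_added):
    """Unknown concentration from a single standard addition.

    Assumes I_x = k*c_x and I_x+std = k*(c_x + c_added), i.e. the
    addition does not dilute the sample appreciably.
    """
    delta = i_x_plus_std - i_x      # increase caused by the addition
    return c_added * i_x / delta    # c_x = c_added * I_x / (I_x+std - I_x)

# Example: I_x = 0.30, I_x+std = 0.75 after adding 5.0 ppm -> about 3.3 ppm
print(standard_addition_concentration(0.30, 0.75, 5.0))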

Internal standard: why and when? All intensities vary from sample to sample:

            target (x)   internal standard (s)
Sample 1       x             s
Sample 2       0.6x          0.6s
Sample 3       1.2x          1.2s

No reproducibility! Let us divide the intensity in the first column by the intensity in the second:

            target (x)   internal standard (s)   ratio
Sample 1       x             s                   x/s
Sample 2       0.6x          0.6s                x/s
Sample 3       1.2x          1.2s                x/s

Now they are the same!

Restriction: you need to measure two values simultaneously. You may prefer to have the same amount of internal standard in all your samples.

Procedure:
1. Add equal amounts of the internal standard to all your standards and analytes.
2. Measure the intensities of your target compound (atom) and of your internal standard in your solutions.
3. For each pair of measurements, divide the intensity coming from your target compound by the intensity of the internal standard.
4. Process these new “normalized” intensities as you did before.
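A minimal sketch of the normalization step with hypothetical intensities (the normalized ratios would then be processed with the calibration methods above):

# Hypothetical raw intensities: (target compound, internal standard) per sample.
samples = {
    "Sample 1": (1.00, 2.00),
    "Sample 2": (0.60, 1.20),   # everything scaled by 0.6
    "Sample 3": (1.20, 2.40),   # everything scaled by 1.2
}

# Divide the target intensity by the internal-standard intensity.
ratios = {name: i_target / i_internal
          for name, (i_target, i_internal) in samples.items()}
print(ratios)   # all ratios are 0.5: the sample-to-sample scaling cancels out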

Good plot! Bad plot: one VERY BAD point. The underlying line is Y = 1.5x + 1, but least squares does not work? A single outlier can drag the fitted line away from all the other points.

The candidate estimates are b = 0.3, 1, 1 and k = 2.33, 1.5, 1.50. Taking the median of each parameter gives Y = 1.5x + 1. A possible solution: a “robust” (median-based) fit.
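One common realization of this median idea is the Theil–Sen estimator (a plausible reading of the slide, not necessarily the exact method intended): take the slope as the median of the slopes through all pairs of points, and the intercept as the median of y − kx. A minimal sketch with hypothetical data:

import itertools
import statistics

def theil_sen_fit(xs, ys):
    """Robust straight-line fit: median of pairwise slopes, then median intercept."""
    slopes = [
        (y2 - y1) / (x2 - x1)
        for (x1, y1), (x2, y2) in itertools.combinations(zip(xs, ys), 2)
        if x2 != x1
    ]
    k = statistics.median(slopes)
    b = statistics.median(y - k * x for x, y in zip(xs, ys))
    return k, b

# Example: points on Y = 1.5x + 1 plus one very bad point.
xs = [1, 2, 3, 4, 5, 6]
ys = [2.5, 4.0, 5.5, 7.0, 20.0, 10.0]   # the 5th point is an outlier
print(theil_sen_fit(xs, ys))            # close to (1.5, 1.0) despite the outlier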

Beyond the straight line: fitting any function, and weighted least squares.
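A minimal sketch of a weighted straight-line fit y = kx + b, assuming each point has a known uncertainty σᵢ and weight wᵢ = 1/σᵢ² (the data and uncertainties are hypothetical):

import numpy as np

def weighted_line_fit(x, y, sigma):
    """Weighted least squares for y = k*x + b with weights w_i = 1/sigma_i**2."""
    w = 1.0 / np.asarray(sigma) ** 2
    A = np.vstack([x, np.ones_like(x)]).T   # design matrix, columns [x, 1]
    AtW = A.T * w                           # A^T W
    k, b = np.linalg.solve(AtW @ A, AtW @ y)
    return k, b

# Hypothetical data: the last point is much less certain than the others.
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.6, 4.1, 5.4, 7.1, 12.0])
sigma = np.array([0.1, 0.1, 0.1, 0.1, 2.0])
print(weighted_line_fit(x, y, sigma))   # close to the line through the first four points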