Slide 1: UNCERTAINTY ANALYSIS: A BASIC OVERVIEW
Presented at CAVS by Glenn Steele, August 31, 2011
Copyright 2011 by Coleman & Steele. Absolutely no reproduction of any portion without explicit written permission.

Slide 2: EXPERIMENTAL UNCERTAINTY REFERENCES
The ISO GUM – the de facto international standard.

Slide 3: EXPERIMENTAL UNCERTAINTY REFERENCES

Slide 4: VALIDATION REFERENCES

Slide 5: VALIDATION REFERENCES

Slide 6: "Degree of Goodness"
When we use experimental results (such as property values) in an analytical solution, we should consider "how good" the data are and what influence that degree of goodness has on the interpretation and usefulness of the solution. When we compare model predictions with experimental data, as in a validation process, we should consider the degree of goodness of both the model results and the data.

Slide 7: Typical comparison of predictions and data, considering no uncertainties. [Figure: result C_D vs. set point Re]

Slide 8: Comparison of predictions and data considering only the likely uncertainty in the experimental result. [Figure: C_D vs. Re with an uncertainty band on the data] Uncertainties set the resolution at which meaningful comparisons can be made.

Slide 9: Validation comparison considering all uncertainties. [Figure: C_D vs. Re with uncertainty bars U_Re, U_S, and U_CD]
S = value from the simulation; D = data value from the experiment; E = comparison error:
E = S − D = δ_S − δ_D, where δ_S = δ_model + δ_input + δ_num

Slide 10: "Degree of Goodness" and Uncertainty Analysis
When an experimental approach to solving a problem is to be used, the question of "how good must the results be?" should be answered at the very beginning of the effort. This required degree of goodness can then be used as guidance in the planning and design of the experiment. We use the concept of uncertainty to describe the "degree of goodness" of a measurement or an experimental result.

Slide 11: ERRORS & UNCERTAINTIES

Slide 12: An error δ is a quantity with a sign and magnitude. (We assume any error whose sign and magnitude are known has been corrected for, so the errors that remain are of unknown sign and magnitude.) An uncertainty u is an estimate of an interval ±u that should contain δ.

Slide 13: Consider making a measurement of a steady variable X (whose true value is designated X_true) that is influenced by errors δ_i from 5 elemental error sources. Postulate that errors δ_1 and δ_2 do not vary as measurements are made, and that δ_3, δ_4, and δ_5 do vary during the measurement period.

Slide 14: The total error (δ) is the sum of
– β (= δ_1 + δ_2), the systematic, or fixed, error
– ε (= δ_3 + δ_4 + δ_5), the random, or repeatability, error
δ = β + ε
[Figure: ε varies from reading to reading; β does not vary]

Slide 15: The k-th measurement of X then appears as X_k = X_true + β + ε_k. The total error (δ_k) is the sum of
– β, the systematic, or fixed, error
– ε_k, the random, or repeatability, error
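
To make the β + ε decomposition concrete, here is a minimal simulation sketch (not from the slides; the true value, error magnitudes, and distributions are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

X_true = 100.0   # assumed true value of the steady variable X
beta = 0.8       # systematic (fixed) error: the same for every reading
N = 50           # number of measurements

# random (repeatability) error: varies from reading to reading
epsilon = rng.normal(0.0, 0.5, N)

# k-th measurement: X_k = X_true + beta + epsilon_k
X = X_true + beta + epsilon

print(f"mean of measurements = {X.mean():.3f}")      # offset from X_true by about beta
print(f"sample std dev s_X   = {X.std(ddof=1):.3f}") # reflects only the random errors
```

Averaging more readings shrinks the scatter due to ε but does nothing to the offset β, which is why systematic uncertainties must be estimated separately.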

Slide 16: Central Limit Theorem. [Equations/figure]
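
The slide's equations were not transcribed; the Central Limit Theorem result it presumably invokes is the standard one, stated here as a hedged reconstruction:

$$\bar{X} = \frac{1}{N}\sum_{k=1}^{N} X_k, \qquad \bar{X} \sim \text{approximately } N\!\left(\mu_X,\ \frac{\sigma_X^2}{N}\right) \text{ for large } N$$

i.e., whatever the parent distribution of the individual readings, the distribution of the sample mean approaches a Gaussian whose standard deviation is σ_X/√N.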

Slide 17: Histogram of temperatures read from a thermometer by 24 students. [Figure: histogram]

Slide 18: Now consider again making the measurements of X. [Figure: ε (varies) and β (does not vary)]

Slide 19: We can calculate the standard deviation s_X of the distribution of N measurements of X, and that will correspond to a standard uncertainty (u) estimate of the range of the ε_i's. We will call s_X the random standard uncertainty.
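
A minimal sketch of that calculation (NumPy; the readings are made-up numbers):

```python
import numpy as np

# hypothetical repeated readings of the steady variable X
X = np.array([9.92, 10.05, 9.98, 10.11, 9.95, 10.02, 10.07, 9.99])
N = len(X)

X_bar = X.mean()
s_X = X.std(ddof=1)          # random standard uncertainty of a single reading:
                             # s_X = sqrt( sum((X_k - X_bar)**2) / (N - 1) )
s_Xmean = s_X / np.sqrt(N)   # random standard uncertainty of the mean value

print(f"X_bar = {X_bar:.4f}, s_X = {s_X:.4f}, s_Xmean = {s_Xmean:.4f}")
```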

Slide 20: [Equations/figure]

Slide 21: We will estimate systematic standard uncertainties corresponding to the elemental systematic errors β_i and use the symbol b_i to denote such an uncertainty. Thus ±b_1 will be an uncertainty interval that should contain β_1, ±b_2 will be an uncertainty interval that should contain β_2, and so on. The systematic standard uncertainty b_i is understood to be an estimate of the standard deviation of the parent population from which the systematic error β_i is a single realization.

Slide 22: The standard uncertainty in X, denoted u_X, is defined such that the interval ±u_X contains the (unknown) combination of the systematic and random errors and, in accordance with the GUM, is given by the root-sum-square of the systematic and random standard uncertainties: u_X = (b_1² + b_2² + s_X²)^1/2.

Slide 23: Categorizing and Estimating Uncertainties in the Measurement of a Variable
GUM categorization by method of evaluation:
– Type A: "method of evaluation of uncertainty by the statistical analysis of series of observations"
– Type B: "method of evaluation of uncertainty by means other than the statistical analysis of series of observations"
Traditional U.S. categorization by effect on the measurement:
– Random (component of) uncertainty: estimate of the effect of the random errors on the measured value
– Systematic (component of) uncertainty: estimate of the effect of the systematic errors on the measured value
Both are useful, and they are not inconsistent. Use of both is illustrated in the examples in this course.

Slide 24: An Additional Uncertainty Categorization
In the fields of Risk Analysis, Reliability Engineering, Systems Safety Assessment, and others, uncertainties are often categorized as
– Aleatory: variability, due to a random process
– Epistemic: incertitude, due to lack of knowledge

Slide 25: Uncertainty Categorization
The key is to identify the significant errors and estimate the corresponding uncertainties – whether one divides them into categories of Random–Systematic, Type A–Type B, Aleatory–Epistemic, or Lemons–Chipmunks for convenience should make no difference in the overall estimate u if one proceeds properly.

Slide 26: OVERALL UNCERTAINTY OF A MEASUREMENT
At the standard-deviation level:
– Systematic standard uncertainty: b_X = (b_1² + b_2²)^1/2 (for 2 elemental systematic errors)
– Random standard uncertainty: s_X (or the standard deviation of the mean, s_X/√N)
– Combined standard uncertainty: u_X
– Overall or expanded uncertainty at C% confidence: U_C = k_C u_X

Slide 27: For large samples, assuming the total errors in the measurements have a roughly Gaussian distribution, and using a 95% confidence level, k_95 = 2 and U_95 = 2 u_X. The true value of the variable will then be within the limits X ± U_95 about 95 times out of 100. To obtain a value of the coverage factor k, an assumption about the form of the distribution of the total errors (the δ's) in X is necessary.
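
Putting slides 26 and 27 together as a small numerical sketch (the elemental systematic standard uncertainties and s_X below are assumed values):

```python
import numpy as np

s_X = 0.04             # random standard uncertainty (from repeated readings)
b_1, b_2 = 0.03, 0.02  # elemental systematic standard uncertainties

b_X = np.sqrt(b_1**2 + b_2**2)    # systematic standard uncertainty
u_X = np.sqrt(b_X**2 + s_X**2)    # combined standard uncertainty
U_95 = 2.0 * u_X                  # expanded uncertainty with k_95 = 2
                                  # (large sample, roughly Gaussian total errors)

print(f"b_X = {b_X:.4f}, u_X = {u_X:.4f}, U_95 = {U_95:.4f}")
# The measurement would then be reported as X_best +/- U_95 at 95% confidence.
```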

Slide 28: RESULT DETERMINED FROM MULTIPLE MEASURED VARIABLES

Slide 29: We usually combine several measured variables using a Data Reduction Equation (DRE) to determine an experimental result. These have the general DRE form r = r(X_1, X_2, ..., X_J). There are two approaches used for propagating uncertainties through the DREs:
– the Taylor Series Method (TSM)
– the Monte Carlo Method (MCM)

Slide 30: TAYLOR SERIES METHOD OF UNCERTAINTY PROPAGATION
For the case where the result r is a function of two variables x and y, r = f(x, y), the combined standard uncertainty of the result, u_r, is given by
u_r² = (∂r/∂x)² b_x² + (∂r/∂y)² b_y² + s_r²
where s_r is calculated from multiple result determinations and the b_x and b_y systematic standard uncertainties are determined from the combination of the elemental systematic uncertainties that affect x and y as
b_x² = Σ_i b_x,i²  and  b_y² = Σ_i b_y,i²
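
A numerical sketch of the TSM for a simple two-variable DRE. The DRE r = x·y, the uncertainty values, and the absence of correlated systematic terms are all illustrative assumptions:

```python
import numpy as np

def r(x, y):          # data reduction equation
    return x * y

def dr_dx(x, y):      # partial derivative of r with respect to x
    return y

def dr_dy(x, y):      # partial derivative of r with respect to y
    return x

x, y = 4.0, 2.5            # measured values
b_x, b_y = 0.02, 0.01      # systematic standard uncertainties of x and y
s_r = 0.03                 # random standard uncertainty of the result
                           # (from multiple result determinations)

b_r = np.sqrt((dr_dx(x, y) * b_x)**2 + (dr_dy(x, y) * b_y)**2)
u_r = np.sqrt(b_r**2 + s_r**2)   # combined standard uncertainty of r

print(f"r = {r(x, y):.3f}, b_r = {b_r:.4f}, u_r = {u_r:.4f}, U_95 ≈ {2*u_r:.4f}")
```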

Slide 31: Monte Carlo Method of Uncertainty Propagation. [Figure/equations]
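
The slide's figures were not transcribed; the sketch below shows the usual MCM workflow for the same r = x·y example – sample each input from an assumed error distribution, evaluate the DRE many times, and characterize the spread of the results (all numbers and distribution choices are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
M = 100_000                     # number of Monte Carlo samples

x_nom, y_nom = 4.0, 2.5         # nominal (measured) values
u_x, u_y = 0.03, 0.02           # standard uncertainties of the inputs (assumed Gaussian)

x = rng.normal(x_nom, u_x, M)   # possible values of x consistent with its uncertainty
y = rng.normal(y_nom, u_y, M)   # possible values of y consistent with its uncertainty

r = x * y                       # evaluate the DRE for every sample

u_r = r.std(ddof=1)             # standard uncertainty of the result
lo, hi = np.percentile(r, [2.5, 97.5])   # ~95% coverage interval

print(f"r = {r.mean():.3f}, u_r = {u_r:.4f}, 95% interval = [{lo:.3f}, {hi:.3f}]")
```

Unlike the TSM, the MCM requires no derivatives and handles nonlinear DREs and non-Gaussian inputs directly, at the cost of many DRE evaluations.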

Slide 32: Applying General Uncertainty Analysis – Experimental Planning Phase

Slide 33: GENERAL UNCERTAINTY ANALYSIS
For a result given by a data reduction equation (DRE) r = r(X_1, X_2, ..., X_J), the uncertainty is given by the propagation equation
U_r² = (∂r/∂X_1)² U_X1² + (∂r/∂X_2)² U_X2² + ... + (∂r/∂X_J)² U_XJ²
[Example DRE shown on slide]
Note that (assuming the large-sample approximation) the U in the propagation equation can be interpreted as the 95%-confidence U_95 = 2u or as the standard uncertainty u, as long as each term in the equation is treated consistently.

Slide 34: Example
It is proposed that the shear modulus, M_S, be determined for an alloy by measuring the angular deformation θ produced when a torque T is applied to a cylindrical rod of the alloy with radius R and length L. We wish to examine the sensitivity of the experimental result to the uncertainties in the variables that must be measured before we proceed with a detailed experimental design. The physical situation shown on the slide (where the torque T is given by aF) is described by the data reduction equation for the shear modulus, M_S = 2LT/(πR⁴θ).

Slide 35: [Worked sensitivity/uncertainty calculations for the shear-modulus example]
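
A planning-phase sensitivity sketch for this example, using the DRE M_S = 2LT/(πR⁴θ) given above and written in the relative (percentage) form of the propagation equation; the assumed percentage uncertainties are illustrative only:

```python
import numpy as np

# assumed fractional uncertainties for the planning phase
U_L_over_L   = 0.005   # length L
U_T_over_T   = 0.010   # torque T (= a*F)
U_R_over_R   = 0.005   # radius R
U_th_over_th = 0.010   # angular deformation theta

# For M_S = 2*L*T / (pi * R**4 * theta) the relative form of the propagation
# equation has exponents 1, 1, -4, -1 on L, T, R, theta respectively:
U_M_over_M = np.sqrt(U_L_over_L**2 + U_T_over_T**2
                     + (4.0 * U_R_over_R)**2 + U_th_over_th**2)

print(f"U_Ms / M_S ≈ {100.0 * U_M_over_M:.2f} %")
# The R**4 dependence multiplies the radius uncertainty by 4, so the radius
# measurement dominates the result unless U_R/R is kept small -- exactly the kind
# of insight a planning-phase general uncertainty analysis is meant to provide.
```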

Slide 36: ESTIMATING RANDOM UNCERTAINTIES

Slide 37: Data sets for determining estimates of standard deviations and random uncertainties should be acquired over a time period that is large relative to the time scales of the factors that have a significant influence on the data and that contribute to the random errors.

Slide 38: Direct Calculation Approach for Random Uncertainty
For a result that is determined M times, the mean value of the result is
r̄ = (1/M) Σ_{k=1..M} r_k
and the random standard uncertainty of a single result is
s_r = [ (1/(M−1)) Σ_{k=1..M} (r_k − r̄)² ]^1/2
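
A minimal sketch of the direct-calculation approach (the result values are made up):

```python
import numpy as np

# M independent determinations of the result r
r = np.array([2.31, 2.28, 2.35, 2.30, 2.33, 2.29, 2.34, 2.32, 2.27, 2.36])
M = len(r)

r_bar = r.mean()              # mean value of the result
s_r = r.std(ddof=1)           # random standard uncertainty of a single result
s_rmean = s_r / np.sqrt(M)    # random standard uncertainty of the mean result

print(f"r_bar = {r_bar:.3f}, s_r = {s_r:.4f}, s_rmean = {s_rmean:.4f}")
```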

Slide 39: ESTIMATING SYSTEMATIC UNCERTAINTIES

Slide 40: Propagation of systematic errors into an experimental result. [Figure]

Slide 41: The systematic standard uncertainties for the elemental error sources are estimated in a variety of ways that were discussed in some detail in the course. Among the ways used to obtain estimates are: use of previous experience, manufacturer's specifications, calibration data, results from specially designed "side" experiments, results from analytical models, and others.

Slide 42: Recall the definition of a systematic standard uncertainty, b. It is not the most likely value of β, nor the maximum value. It is the standard deviation of the assumed parent population of possible values of β.
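
For instance, when the only available information is a manufacturer's specification of symmetric limits ±a, a common GUM-style Type B convention (an assumption here, not stated on the slide) is to assign a parent distribution and take its standard deviation as b:

```python
import math

a = 0.5  # hypothetical spec: +/- 0.5 units

b_rectangular = a / math.sqrt(3)  # limits treated as a rectangular (uniform) distribution
b_gaussian_95 = a / 2.0           # limits treated as a ~95% Gaussian interval

print(f"b (rectangular) = {b_rectangular:.3f}, b (Gaussian, 95% spec) = {b_gaussian_95:.3f}")
```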

Slide 43: SYSTEMATIC STANDARD UNCERTAINTY

Slide 44: [Equations/figure]

Slide 45: Correlated Systematic Errors
These typically occur when different measured variables share one or more elemental error sources:
– multiple variables measured with the same transducer (e.g., a probe traversed across a flow field, or multiple pressures ported sequentially to the same transducer – a scanivalve)
– multiple transducers calibrated against the same standard (e.g., electronically scanned pressure (ESP) systems in use in aerospace ground-test facilities)
Examples: q = m C_p (T_o − T_i); u′v′.

Slide 46: Using the TSM, there is a term in the b_r² equation for each pair of variables in the DRE that might share an error source – for q = m C_p (T_o − T_i), for u′v′, and so on (see the sketch below).
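
A sketch of what the correlated term looks like for the q = m C_p (T_o − T_i) example, assuming the standard TSM form b_r² = Σ(∂r/∂X_i)² b_i² + 2 ΣΣ(∂r/∂X_i)(∂r/∂X_k) b_ik; the slide's own equations were not transcribed, and all numbers below are illustrative:

```python
import numpy as np

# measured values (hypothetical)
m, Cp, To, Ti = 0.5, 4180.0, 330.0, 300.0   # kg/s, J/(kg K), K, K
q = m * Cp * (To - Ti)

# elemental systematic standard uncertainties (hypothetical)
b_m, b_Cp, b_To, b_Ti = 0.005, 20.0, 0.5, 0.5
# If To and Ti are measured with probes calibrated against the same standard,
# the shared calibration error is common to both; its covariance estimate:
b_ToTi = 0.4**2

# partial derivatives of q = m * Cp * (To - Ti)
dq_dm, dq_dCp = Cp * (To - Ti), m * (To - Ti)
dq_dTo, dq_dTi = m * Cp, -m * Cp

b_q2 = ((dq_dm * b_m)**2 + (dq_dCp * b_Cp)**2
        + (dq_dTo * b_To)**2 + (dq_dTi * b_Ti)**2
        + 2.0 * dq_dTo * dq_dTi * b_ToTi)          # correlated term

print(f"q = {q:.0f} W, b_q = {np.sqrt(b_q2):.0f} W")
# Because dq_dTo and dq_dTi have opposite signs, the correlated term is negative:
# a shared calibration error partially cancels in the difference To - Ti.
```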

Slide 47: [Equations/figure]

Slide 48: Some Final Practical Points on Estimating Systematic Uncertainties
– When estimating b, we are not trying to estimate the most probable value nor the maximum possible value of β.
– Always remember to view and use estimates with common sense. For example, a "% of full scale" b should not apply near zero if the instrument is nulled.
– Resources should not be wasted on obtaining good uncertainty estimates for insignificant sources – a practice we have observed too many times.

Slide 49: "V&V" – Verification & Validation: The Process
Preparation
– Specification of validation variables, validation set points, etc. (This specification determines the resource commitment that is necessary.)
– It is critical for modelers and experimentalists to work together in this phase. The experimental and simulation results to be compared must be conceptually identical.
Verification
– Are the equations solved correctly? (MMS for code verification; grid convergence studies, etc., for solution verification to estimate u_num.)
Validation
– Are the correct equations being solved? (Compare with experimental data and attempt to assess δ_model.)
Documentation

Slide 50: A Validation Comparison. [Figure]

Slide 51: V&V Overview – Sources of Error Shown in Ovals. [Figure]

Slide 52: Strategy of the Approach
Isolate the modeling error, having a value or uncertainty for everything else:
E = S − D = δ_model + (δ_input + δ_num − δ_D)
δ_model = E − (δ_input + δ_num − δ_D)
If ±u_val is an interval that includes (δ_input + δ_num − δ_D), then δ_model lies within the interval E ± u_val.

Slide 53: Uncertainty Estimates Necessary to Obtain the Validation Uncertainty u_val
– Uncertainty in the simulation result due to the numerical solution of the equations, u_num (code and solution verification)
– Uncertainty in the experimental result, u_D
– Uncertainty in the simulation result due to uncertainties in code inputs, u_input – propagated by (A) Taylor Series or (B) Monte Carlo
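
The combination itself does not appear on the transcribed slide; for independent error sources it is normally the root-sum-square form used in validation practice – treat the equation below as a reconstruction:

$$u_{val}^2 = u_{num}^2 + u_{input}^2 + u_{D}^2$$

so that the interval E ± u_val is the one expected to contain δ_model (compare slide 52).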

Slide 54: Methodology – Simulation Uncertainty
Modeling error for an uncalibrated model used to make calculations between validation points [equation on slide], where u_sp = the uncertainty contribution from the uncertainty of the input parameters at the simulation calculation point and u_E = the uncertainty in E at the calculation point from the interpolation process.

Slide 55: Uncertainty of Calibrated Models

Slide 56: Methodology – Instrument Calibration Analogy
Uncalibrated instrumentation system: [equation on slide], where u_t = uncertainty of the transducer and u_m = uncertainty of the meter.
Calibrated instrumentation system: [equation on slide], where u_c is the calibration uncertainty.

Slide 57: Methodology – Instrument Calibration Analogy
If a curve fit is used to develop a relationship between the meter reading and the calibrated output value, then [equation on slide], where u_cf = the curve-fit uncertainty.
If the meter used in testing (m_2) is different from the meter used in calibration (m_1), then [equation on slide].
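
The equations on slides 56 and 57 were not transcribed; the sketch below shows the root-sum-square combinations they presumably represent – treat the specific forms as assumptions:

```python
import math

def rss(*components):
    """Root-sum-square combination of standard uncertainties."""
    return math.sqrt(sum(c**2 for c in components))

# Uncalibrated system: transducer and meter uncertainties combine directly
u_t, u_m = 0.010, 0.006
u_uncalibrated = rss(u_t, u_m)

# Calibrated system: the end-to-end calibration uncertainty u_c replaces the
# component uncertainties; a curve fit adds its own contribution u_cf
u_c, u_cf = 0.004, 0.002
u_calibrated = rss(u_c, u_cf)

# If the meter used in testing (m2) differs from the meter used in calibration (m1),
# both meter uncertainties re-enter the combination
u_m1, u_m2 = 0.006, 0.006
u_calibrated_new_meter = rss(u_c, u_cf, u_m1, u_m2)

print(f"u (uncalibrated)                = {u_uncalibrated:.4f}")
print(f"u (calibrated, same meter)      = {u_calibrated:.4f}")
print(f"u (calibrated, different meter) = {u_calibrated_new_meter:.4f}")
```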

Slide 58: Methodology – Instrument Calibration Analogy
The uncertainties, u, in the previous expressions are standard uncertainties, at the standard-deviation level. To express the uncertainty at a given confidence level, such as 95%, the standard uncertainty is multiplied by an expansion factor. For most engineering applications, the expansion factor is 2 for 95% confidence.

Slide 59: Methodology – Calibrated Model
To calibrate a model, the simulation results are compared with a set of data and corrections are applied to the model to make it match the data. The simulation uncertainty is then [equation on slide]. As with the curve-fit uncertainty in the calibration of a transducer, there will be additional uncertainty in the calibrated model based on the error between the corrected simulation results and the data.

Methodology Calibrated Model would apply for simulation results over the range of the input parameter values used in the calibration of the model with the assumption that the input parameters in the simulation have the same uncertainties that they had in the calibration process. If the input parameter sources or transducers change for a simulation result, then 60Copyright 2011 by Coleman & Steele. Absolutely no reproduction of any portion without explicit written permission.