
Elec 471 Embedded Computer Systems
Chapter 1, Basic Concepts
By Prof. Tim Johnson, PE, Wentworth Institute of Technology, Boston, MA
Text: Theory and Design for Mechanical Measurements by Richard Figliola

Content
- Basic Concepts of Measurement Methods
- General Measurement System
- Experimental Test Plan: variables, parameters, noise & interference, random tests, replication and repetition, concomitant methods
- Calibration: static & dynamic calibration, static sensitivity, range, resolution, accuracy/error, types of errors/uncertainty, sequential and random tests
- Standards: base dimensions and their units, derived units, hierarchy of standards, test standards and codes
- Presenting Data: coordinate formats, significant digits

Basic Concepts of Measurement Methods
We all have a general familiarity with ordinary measurements, but specific and modern needs demand greater attention to the methods of measurement. Measurement methods seek to answer two questions: How does one establish the relationship between the real value of a variable and the value actually measured? Can a measurement be devised so that the measurement system provides unambiguous information?

General Measurement System
The analog-to-digital converter (ADC) on the 16-bit MSP430 microcontroller contains the signal-conditioning and output stages which, along with a C program, provide a control stage.

Quick Definitions
- Sensor: a physical device that is sensitive to the process being measured. Changes within the sensor can be detected because it is connected to the transducer.
- Transducer: converts the sensed information into a signal (electrical, mechanical, optical, or other).
- Signal conditioning: amplifies the transducer signal so it can be read and minimizes noise by filtering. If the signal is digitized, this stage writes a data value to memory.
- Output: converts the conditioned signal into a format useful for display.
- Control: evaluates the conditioned signal (the data value) to control the process, either manually or automatically via a computer program.
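
As a rough illustration, the stages can be modeled as a chain of C functions. This is only a sketch; the 10 mV-per-unit sensor, the gain of 10, and the 2.0 V control threshold are assumed values, not anything from the text or from the MSP430 libraries.

#include <stdio.h>

/* Sensor + transducer: convert the sensed quantity to a voltage
   (assume a linear 10 mV-per-unit device). */
static double transduce(double physical_input) {
    return 0.010 * physical_input;
}

/* Signal conditioning: amplify (assumed gain of 10) so the signal is
   readable; a real stage would also filter to minimize noise. */
static double condition(double volts) {
    return 10.0 * volts;
}

int main(void) {
    double reading = condition(transduce(21.5));
    printf("Output stage: display %.2f V\n", reading);
    if (reading > 2.0)  /* control stage: act on the data value */
        printf("Control stage: above threshold, adjust the process\n");
    return 0;
}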

Complete Measurement System Example

Experimental Test Plan
Answers a question such as: what is the optimum speed to minimize fuel consumption? Here are the steps once the question is formulated:
1. Parameter design plan
   - Identify process variables and limits (each parameter's allowed range)
   - Identify a means for control
2. System and tolerance design plan
   - Selection of a measurement technique
   - Identification of equipment and test procedure
   - Tolerance desired
3. Data reduction design plan
   - Analysis of data
   - Data collection (sample frequency, frequency response)
   - Information display
   - System and data evaluation

Process Variables
Variables are entities that influence the test; the measured variable is the target variable. If a change in one variable does not affect another variable, the variables are independent. A variable that is affected by changes in one or more other variables is a dependent variable. A variable may be continuous, able to take on any value within its range, or quantized, changing between discrete values.

Variables continued
A variable is controlled if it can be held at a constant value during a measurement. The relationship (a mathematical model or equation) between independent variables and a dependent variable can be determined by taking measurements while stepping the independent variable's value. Variables that affect the value of the target variable but cannot be controlled are called extraneous variables. These variables can introduce differences in repeated tests.

Example of an Extraneous Variable

Parameter Defined
A parameter is a functional grouping of variables. Examples:
- Moment of inertia: an object's resistance to a change in its rotation; for a point mass the scalar formula is I = mr².
- Volume: V = length × width × height.
A control parameter has an effect on the behavior of the measured variable. A parameter is controlled if its value can be maintained during a set of measurements.

How to develop a parameter
Fan flow rate, Q, is a function of the rotational speed, n (in rpm), and the diameter, d: Q = nd³ (text formula). The fan flow coefficient, C, is a parameter that indicates efficiency: C = Q/nd³. The variables d and n can then be held constant one at a time to observe their effects on C. Lesson: given a formula, a parameter can be developed by dividing the outcome of the equation by its inputs.
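
A quick numeric check of the lesson, with made-up values (a 0.5 m fan at 1200 rpm moving 2.6 m³/min; none of these numbers come from the text):

#include <stdio.h>

int main(void) {
    double n = 1200.0;               /* rotational speed, rpm       */
    double d = 0.5;                  /* fan diameter, m             */
    double Q = 2.6;                  /* measured flow rate, m^3/min */
    double C = Q / (n * d * d * d);  /* flow coefficient C = Q/nd^3 */
    printf("C = %.5f\n", C);         /* prints C = 0.01733 */
    return 0;
}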

Noise and Interference
Noise is a random variation in the measured signal caused by extraneous variables. Noise increases the scatter of the data about its average value (the variance of the readings). Noise has a short time duration and affects only a few data points. Interference is an undesired deterministic trend that shifts the measured signal away from its ideal value. Interference has a longer time duration than noise and follows a deterministic trend.

Random Tests
A random test is a form of randomization. It is needed when a target variable depends on several independent variables that may also be affected by extraneous variables unknown to the system designers. Examples include the use of different instruments, test operators, and test operating conditions. A random test is defined as a measurement matrix that sets a random order for the changes in the value of the applied independent variable. Additionally, a test matrix can be organized in random blocks: a data set is developed with the controlled variable varied while an extraneous variable is held fixed, then repeated with the extraneous variable stepped through a range.
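
One way to build such a measurement matrix is to shuffle the order of the set points before running the test. A minimal sketch in C using a Fisher-Yates shuffle; the five set-point values are illustrative:

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

int main(void) {
    double setpoint[] = {10.0, 20.0, 30.0, 40.0, 50.0};
    int n = 5;
    srand((unsigned)time(NULL));        /* seed the random order */
    for (int i = n - 1; i > 0; i--) {   /* Fisher-Yates shuffle  */
        int j = rand() % (i + 1);
        double tmp = setpoint[i];
        setpoint[i] = setpoint[j];
        setpoint[j] = tmp;
    }
    for (int i = 0; i < n; i++)
        printf("Trial %d: apply input %.1f\n", i + 1, setpoint[i]);
    return 0;
}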

Replication and Repetition
Rule: the estimated value of a measured variable improves with the number of measurements. Repeated measurements during a single test run are called repetition. Repetition helps to quantify the variation in a measured variable while the operating conditions are held under nominal control, but it does not permit an assessment of how exactly the operating conditions can be set. Changing the operating conditions, such as machines and/or operators, and running the test again is called replication.

Concomitant Methods
Are the results good? Using different measurement methodologies that each provide an estimate of the target variable allows the estimates to be compared, showing the degree of agreement in the final result. Analysis of the differences can reveal hidden extraneous variables in one or the other measurement test. This methodology amounts to a double-check policy using different equipment.

Calibration
Calibration is the application of a known input value to a measurement system for the purpose of observing the system's output value; it establishes the relationship between the input and output values. Calibration against an acknowledged known value (called a standard) can verify a system's output. In a calibration the input value is usually a controlled independent variable, and the output is the dependent variable of the calibration.

Static Calibration
A static calibration is one in which the values of the input variable remain constant: not varying in time or space. Only the input and output magnitudes are considered in the evaluation. By stepping the input through a set of values and recording the output at each step, you can develop a calibration curve.

Calibration Curve
A calibration curve describes the static input-output relationship for a measurement system. A formal equation modeling the input-output relationship can be developed from the calibration curve, and the errors of measured data points relative to the curve can be computed using, for example, Excel's statistical functions. The correlation can then be used to ascertain an unknown input value from an output value using the calibration curve or the mathematical model. The slope of the curve is the static sensitivity, K = dy/dx.
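
A least-squares linear fit is one standard way to turn calibration data into such an equation; the fitted slope is the static sensitivity K. A minimal sketch in C with made-up calibration data:

#include <stdio.h>

int main(void) {
    double x[] = {0.0, 1.0, 2.0, 3.0, 4.0};  /* known input values */
    double y[] = {0.1, 1.9, 4.1, 5.9, 8.1};  /* measured outputs   */
    int n = 5;
    double sx = 0, sy = 0, sxx = 0, sxy = 0;
    for (int i = 0; i < n; i++) {
        sx  += x[i];
        sy  += y[i];
        sxx += x[i] * x[i];
        sxy += x[i] * y[i];
    }
    double K = (n * sxy - sx * sy) / (n * sxx - sx * sx); /* slope     */
    double b = (sy - K * sx) / n;                         /* intercept */
    printf("y = %.3fx + %.3f (static sensitivity K = %.3f)\n", K, b, K);
    return 0;
}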

Dynamic Calibration This type of calibration uses an input variable that varies with time or space. The input will vary in magnitude and/or frequency content. A dynamic calibration determines the relationship between a dynamic input and the measurement system output. This topic is covered in Chapter 3.

Range and Resolution
The calibration range runs from the known minimum to the known maximum input values over which the measurement system is to be used. These values define the operating range, whose input span is ri = xmax - xmin. The output span, also known as the full-scale operating range (FSO), is ro = ymax - ymin. Resolution is the smallest increment in the measured value that can be discerned; it is quantified by the smallest scale increment of the output readout indicator.
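
For a digitized output, the smallest discernible increment is one converter count. As an illustration with assumed numbers (not from the text): a 12-bit ADC spanning a 0-to-3.3 V output gives a resolution of ro/2¹² = 3.3 V / 4096 ≈ 0.8 mV per count.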

Accuracy and Error
The exact value of a variable is called the true value; the value indicated by a measurement system is called the measured value. The accuracy of a measurement refers to the closeness of agreement between these two values. The error, e, is the difference between them: e = measured value - true value. You can tell a person is an engineer because at some point in a conversation they will ask, "What's the error?" If the person answering the question provides an answer, you can be certain they are an engineer too.

Random error
Random error is a measure of the random variation found during repeated measurements of a variable. The repeatability of a measurement system refers to its ability to indicate the same value on repeated measurements for a specific input value; this is also referred to as precision. The portion of the absolute error, |e|, that remains constant on repeated measurements is called the systematic error. This error is a bias that cannot be discerned from repeated measurements alone; however, the offset between the apparent average of the readings and the true value is a measure of the systematic error.

Linearity error
A linear system has an input-output relationship that can be expressed mathematically as yLinear(x) = mx + b. Actual systems' measured data y(x) are often close to a linear fit, so the linearity error is eL(x) = y(x) - yLinear(x). The linear equation yLinear is the result of a best linear fit to the calibration data.

Sequential Test & Hysteresis error
A sequential test applies an incremental change to the input over a desired range, incrementing and then decrementing the input in steps that are usually small compared to the input value. This test can determine the hysteresis error, eh, which results from the system retaining a portion of its output from the previous input: eh = (y)upscale - (y)downscale.
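
A short sketch in C of extracting the hysteresis error from a sequential test; the upscale/downscale readings and the FSO value are made up, and the maximum magnitude of eh is reported:

#include <stdio.h>
#include <math.h>

int main(void) {
    double y_up[]   = {0.0, 1.0, 2.1, 3.2, 4.1, 5.0}; /* upscale pass   */
    double y_down[] = {0.1, 1.3, 2.4, 3.4, 4.2, 5.0}; /* downscale pass */
    double ro = 5.0;        /* full-scale output span (FSO) */
    double eh_max = 0.0;
    for (int i = 0; i < 6; i++) {
        double eh = fabs(y_up[i] - y_down[i]); /* |eh| at this set point */
        if (eh > eh_max)
            eh_max = eh;
    }
    printf("Max hysteresis: %.2f (%.1f%% of FSO)\n",
           eh_max, 100.0 * eh_max / ro);
    return 0;
}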

Uncertainty There is an uncertainty in the error for a measurement because the true value is not known exactly and a reference value is used instead during the instrument’s calibration. Uncertainty refers to the estimate of the sum of all the errors present in the measurement system, its calibration, and the measurement technique. It is a property of the test results. After calibration, an error can be estimated to be bounded by a ± range of the indicated reading with a certain confidence. This quantifies the uncertainty.

Sensitivity error
This is an error, eK, in the slope of the calibration curve. It can be caused by changes in the ambient temperature; this "thermal" sensitivity error can be found by calibrating at different temperatures. Other sources can also change the slope, and estimating their error depends on identifying the source. (Sensitivity error graph shown with fixed zero intercept.)

Zero error
If the zero intercept (y = 0 when x is zero) drifts, meaning the variable b changes in the linear equation y = mx + b, this is called a zero error, ez. It results in a vertical shift of the calibration curve. This error is detected by repeated random tests.

Repeatability
Instrument repeatability, %eRmax, is the ability of a measurement system to indicate the same value upon retesting with the same input. Specific claims of repeatability are based on multiple calibration tests (replication), from which a standard deviation, Sx, is developed. The formula uses the full-scale output span, ro, to calculate the percentage: %eRmax = (2Sx / ro) × 100.
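
A minimal sketch in C of the repeatability calculation; the six repeated readings and the FSO value are illustrative:

#include <stdio.h>
#include <math.h>

int main(void) {
    double y[] = {5.02, 4.98, 5.01, 4.97, 5.03, 4.99}; /* repeated readings */
    int n = 6;
    double ro = 10.0;               /* full-scale output span (FSO) */
    double mean = 0.0, ss = 0.0;
    for (int i = 0; i < n; i++)
        mean += y[i];
    mean /= n;
    for (int i = 0; i < n; i++)
        ss += (y[i] - mean) * (y[i] - mean);
    double Sx = sqrt(ss / (n - 1)); /* sample standard deviation */
    printf("Sx = %.4f, %%eRmax = %.2f%% of FSO\n",
           Sx, 2.0 * Sx / ro * 100.0);
    return 0;
}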

Overall instrument error
An estimate of the overall instrument error, uc, is made by combining all known errors; the result is an uncertainty. The estimate is computed as the square root of the sum of the squares of all known errors. For m known errors: uc = [(e1)² + (e2)² + ... + (em)²]^(1/2).
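
The root-sum-squares combination in C, with three assumed element errors:

#include <stdio.h>
#include <math.h>

int main(void) {
    /* Illustrative element errors, e.g. linearity, hysteresis, zero. */
    double e[] = {0.10, 0.05, 0.02};
    int m = 3;
    double sum_sq = 0.0;
    for (int i = 0; i < m; i++)
        sum_sq += e[i] * e[i];
    double uc = sqrt(sum_sq);  /* uc = [(e1)^2 + ... + (em)^2]^(1/2) */
    printf("Overall instrument error uc = %.3f\n", uc);
    return 0;
}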

Standards
When a measurement system is calibrated, its indicated value is compared directly with a reference value. This reference value forms the basis of the comparison and is known as the standard. Standards are based on long-standing observations and on published values from standards organizations such as the International Organization for Standardization (ISO), the National Institute of Standards and Technology (NIST), the American Society for Testing and Materials (ASTM), and others. These organizations publish values for physical phenomena, such as conversion factors, as well as procedures for quality assurance (e.g., ISO 9000 and ISO 9001), among many others. Section 1.5 in the text covers numerous details about these standards, but a visit online to the original sources is encouraged. Note that calibration laboratories use a hierarchy of original measurements, secondary sources, transfer standards, and working standards.

Metrology
Metrology is the science of measurement. It deals with physical measurement standards (not documentary standards). Measurements play a key role in modern life, in industry as well as in trade and society in general, in assuring quality and safety, and there is a growing need in science and technology for increasingly accurate and more complex measurements. Seven base units form the basis of the International System of Units (SI):
- Meter (length)
- Kilogram (mass)
- Second (time)
- Ampere (electric current)
- Kelvin (temperature)
- Mole (amount of substance)
- Candela (luminous intensity)
Many derived units, such as the newton (force), the pascal (pressure), the watt (power), and the volt (electric potential), are formed by combining the base units. Signed in 1875, the Treaty (or Convention) of the Meter remains the basis of all international agreement on units of measurement. Countries are represented in the treaty by their National Measurement Institutes (NMIs), which have the responsibility for maintaining national measurement standards. In the United States, Congress gave this responsibility to NIST in 1901. http://gsi.nist.gov/global/index.cfm/L1-5/L2-47

Research (aka Homework)
Suppose you were to develop a product that has a power cord attached so the product can be turned on. Find out which organization is responsible for publishing the documentation for the cord. Find the standard number(s) that would apply to the power cord. What is the procedure to obtain a UL or EU certification listing?