Definition
Metrology is the name given to the science of pure measurement. Engineering metrology is restricted to measurements of length and angle. Measurement is defined as the process of numerical evaluation of a dimension, or the process of comparison with standard measuring instruments.
Need of Measurement
- To establish standards
- To achieve interchangeability
- To ensure customer satisfaction
- To validate the design
- To convert a physical parameter into a meaningful number
- To establish the true dimension
- To evaluate performance
Methods of Measurement
- Direct method
- Indirect method
- Comparative method
- Coincidence method
- Fundamental method
- Contact method
- Complementary method
- Deflection method
Direct method
The measurement is obtained directly from the instrument. Ex: vernier caliper, scales.
Indirect method
The desired value is obtained by measuring other quantities. Ex: Mass = Length × Breadth × Height × Density.
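As a small illustration of the indirect method, the Python sketch below computes mass from directly measured dimensions and a known density; all values are hypothetical.

```python
# Indirect measurement: the quantity of interest (mass) is computed from
# other, directly measured quantities. All values are hypothetical.
length_m = 0.20         # m, measured with a scale
breadth_m = 0.10        # m
height_m = 0.05         # m
density_kg_m3 = 7850.0  # kg/m^3, known density of the material (steel assumed)

volume_m3 = length_m * breadth_m * height_m
mass_kg = volume_m3 * density_kg_m3
print(f"Indirectly measured mass: {mass_kg:.2f} kg")
```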
Comparative method
The measured value is compared with another known value. Ex: comparators.
Coincidence method
The measurement is made by observing the coincidence of certain lines or signals.
Fundamental method
The quantity is measured directly in accordance with the definition of that quantity.
Contact method
The sensor or measuring tip touches the surface being measured.
Complementary method
The value of the quantity to be measured is combined with a known value of the same quantity. Ex: volume determination by liquid displacement.
Deflection method
The value to be measured is directly indicated by the deflection of a pointer. Ex: pressure measurement.
Common Elements of a Generalized Measuring System
- Primary sensing element
- Variable conversion element
- Variable manipulation element
- Data transmission element
- Data processing element
- Data presentation element
[Block diagram: the measured quantity (e.g., temperature) passes through the primary sensing, variable conversion, variable manipulation, data transmission, data processing, and data presentation elements before reaching the observer.]
Units and Standards
SI: Fundamental Units
- length: meter (m)
- mass: kilogram (kg)
- time: second (s)
- electric current: ampere (A)
- temperature: kelvin (K)
- amount of substance: mole (mol)
- luminous intensity: candela (cd)
SI: Derived Units
- area: square meter (m²)
- volume: cubic meter (m³)
- speed: meter per second (m/s)
- acceleration: meter per second squared (m/s²)
- weight, force: newton (N)
- pressure: pascal (Pa)
- energy, work: joule (J)
Supplementary Units
- plane angle: radian (rad)
- solid angle: steradian (sr)
Standards
- International standards
- Primary standards
- Secondary standards
- Working standards
Measuring Instruments
- Deflection and null type instruments
- Analog and digital instruments
- Active and passive instruments
- Automatic and manually operated instruments
- Contacting and non-contacting instruments
- Absolute and secondary instruments
- Intelligent instruments
DEFLECTION AND NULL TYPE
Deflection type: the quantity being measured generates a physical effect that drives the indicator. Null type: an equivalent opposing effect is applied to nullify the physical effect caused by the measured quantity.
ANALOG AND DIGITAL INSTRUMENTS
Analog: the physical variable of interest is presented as a continuous (stepless) variation. Digital: the physical variable is represented by discrete digital quantities.
ACTIVE AND PASSIVE INSTRUMENTS
Active instruments require some source of auxiliary power. Passive instruments meet their energy requirements entirely from the input signal.
Automatic and Manually Operated
Manually operated: requires the services of a human operator. Automated: does not require a human operator.
Contacting and Non-Contacting Instruments
Contacting instruments are in physical contact with the measuring medium. Non-contacting instruments measure the desired input without being in close contact with the measuring medium.
Absolute and Secondary Instruments
Absolute instruments give the value of the electrical quantity in terms of absolute quantities. In secondary instruments, the deflection can be read directly as the measured value. A galvanometer is a type of ammeter: an instrument for detecting and measuring electric current. It is an analog electromechanical transducer that produces a rotary deflection of a pointer in response to electric current flowing through its coil in a magnetic field.
Intelligent Instruments
Measuring instruments that incorporate microprocessors.
Characteristics of Measuring Instruments
- Sensitivity
- Readability
- Range of accuracy
- Precision
Definitions
Sensitivity: the ratio of the magnitude of the response (output signal) to the magnitude of the quantity being measured (input signal).
Readability: the closeness with which the scale of an analog instrument can be read.
Range of accuracy: the accuracy of a measuring system is the closeness of the instrument output to the true value of the measured quantity.
Precision: the ability of the instrument to reproduce a certain set of readings within a given accuracy.
Sensitivity
Sensitivity is the relationship between a change in the output reading and the corresponding change of the input; the relationship may be linear or non-linear. If the calibration curve is linear, the sensitivity of the instrument is the slope of the calibration curve; if it is not linear, the sensitivity varies with the input. Sensitivity is often known as the scale factor or instrument magnification, and an instrument with a large sensitivity (scale factor) will indicate a large movement of the indicator for a small input change.
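A minimal sketch of estimating sensitivity as the slope of a linear calibration curve, assuming hypothetical calibration data (inputs in mm, outputs in scale divisions):

```python
import numpy as np

# Hypothetical calibration data: known inputs and the instrument's readings.
inputs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])    # known input values, mm
outputs = np.array([0.1, 2.1, 4.0, 6.2, 8.1])   # readings, scale divisions

# For a linear calibration curve, sensitivity is the slope of the
# least-squares straight line fitted through the points.
slope, intercept = np.polyfit(inputs, outputs, 1)
print(f"Sensitivity (scale factor): {slope:.2f} divisions per mm")
```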
Readability
Readability is defined as the ease with which readings may be taken with an instrument. Readability difficulties often occur due to parallax errors when an observer is noting the position of a pointer on a calibrated scale.
Accuracy
Accuracy is the extent to which a measured value agrees with the true value. The difference between the measured value and the true value is known as the 'error of measurement'. Accuracy is the quality of conformity.
Precision
The precision of a measurement depends on the instrument used to make it: for example, the length of a block can be stated only as finely as the instrument's scale allows.
Accuracy vs. Precision
[Figure: target diagrams contrasting high accuracy with high precision, and high precision with low accuracy.]
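The distinction can be made concrete with repeated readings of a known reference. The sketch below (hypothetical data for a 10.00 mm gauge block) treats the mean offset from the true value as a measure of accuracy and the standard deviation of the readings as a measure of precision:

```python
import statistics

true_value = 10.00                                # mm, reference gauge block
readings = [10.11, 10.12, 10.10, 10.13, 10.11]    # mm, repeated measurements

mean = statistics.mean(readings)
spread = statistics.stdev(readings)

# Small spread but a large mean offset: high precision, low accuracy.
print(f"Mean error (accuracy):     {mean - true_value:+.3f} mm")
print(f"Std deviation (precision): {spread:.3f} mm")
```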
Uncertainty
The word uncertainty casts doubt about the exactness of the measurement result: True value = Estimated value ± Uncertainty.
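A short illustration of reporting a result with its uncertainty, using assumed values:

```python
estimated = 25.40    # mm, estimated value from the measurement
uncertainty = 0.02   # mm, uncertainty of the measurement

# True value = estimated value ± uncertainty
print(f"Length = {estimated:.2f} ± {uncertainty:.2f} mm")
print(f"True value lies between {estimated - uncertainty:.2f} "
      f"and {estimated + uncertainty:.2f} mm")
```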
Performance of Instruments
All instrumentation systems are characterized by their system characteristics or system response. Measuring instruments have two basic sets of characteristics:
- Static characteristics
- Dynamic characteristics
Static Characteristics
The characteristics of instruments used to measure quantities that vary only slowly with time, or are mostly constant, are called 'static characteristics'.
STATIC CHARACTERISTICS OF AN INSTRUMENT
- Accuracy
- Precision
- Sensitivity
- Resolution
- Threshold
- Drift
- Error
- Repeatability
- Reproducibility
- Dead zone
- Backlash
- True value
- Hysteresis
- Linearity
- Range or span
- Bias
- Tolerance
- Stability
Resolution
Resolution is defined as the smallest input increment that gives a small but definite numerical change in the output.
Threshold
The minimum value of input below which no output appears is known as the threshold of the instrument.
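The toy instrument model below illustrates both threshold and resolution; the numeric values are assumptions, not properties of any real instrument:

```python
THRESHOLD = 0.05    # inputs below this magnitude produce no output at all
RESOLUTION = 0.1    # smallest input increment that changes the output

def indicated(value: float) -> float:
    """Reading a hypothetical instrument would display for a given input."""
    if abs(value) < THRESHOLD:
        return 0.0                                  # below threshold: no output
    return round(value / RESOLUTION) * RESOLUTION   # quantised to the resolution

for x in [0.02, 0.07, 0.12, 0.16]:
    print(f"input {x:.2f} -> reading {indicated(x):.2f}")
```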
Drift
Drift (or zero drift) is a variation in the output of an instrument that is not caused by any change in the input; it is commonly caused by internal temperature changes and component instability. Sensitivity drift defines the amount by which the instrument's sensitivity varies as ambient conditions change.
Error: the deviation of the measured value from the true value.
Repeatability: the closeness of output values for the same input under the same operating conditions.
Reproducibility: the closeness of output values for the same input under the same operating conditions over a period of time.
Range
The 'range' is the total span of values, from minimum to maximum, that an instrument is capable of measuring.
Hysteresis
Hysteresis is the algebraic difference between the average errors at corresponding points of measurement when approached from opposite directions, i.e. increasing as opposed to decreasing values of the input. It is caused by energy storage and dissipation in the system.
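One common way to estimate hysteresis is to take readings at the same input points with the input increasing and then decreasing, and examine the differences; the sketch below uses hypothetical data and a simplified maximum-difference measure:

```python
inputs = [0, 1, 2, 3, 4]                 # applied input values
up   = [0.00, 0.98, 1.96, 2.95, 3.96]    # readings, input increasing
down = [0.06, 1.07, 2.08, 3.05, 3.96]    # readings, input decreasing

# Difference between the two approach directions at each input point.
differences = [d - u for u, d in zip(up, down)]
print(f"Maximum hysteresis: {max(differences):.2f} units")
```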
Zero stability
The ability of the instrument to return to a zero reading after the measured quantity has returned to zero.
Dead band
The range of different input values over which there is no change in the output value.
Linearity
The ability to reproduce the input characteristics symmetrically and linearly.
Backlash: lost motion or free play of mechanical elements.
True value: the error-free value of the measured variable.
Bias: the constant error.
Tolerance: the maximum allowable error in a measurement.
Dynamic Characteristics
The set of criteria defined for instruments that measure quantities which change rapidly with time is called 'dynamic characteristics':
- Steady-state periodic
- Transient
- Speed of response
- Measuring lag
- Fidelity
- Dynamic error
Steady-state periodic: the magnitude has a definite repeating time cycle.
Transient: the magnitude of the output does not have a definite repeating time cycle.
Speed of response: the rapidity with which the system responds to changes in the measured quantity.
Measuring lag
- Retardation type: the response begins immediately after the change in the measured quantity.
- Time delay type: the response begins after a dead time following the application of the input.
Fidelity: the degree to which a measurement system indicates changes in the measured quantity without error.
Dynamic error: the difference between the true value of a quantity changing with time and the value indicated by the measurement system.
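To make speed of response, measuring lag, and dynamic error concrete, the sketch below simulates a step input applied to an instrument modelled, purely as an assumption, as a simple first-order system with time constant TAU:

```python
import math

TAU = 0.5          # instrument time constant in seconds (assumed value)
true_value = 10.0  # step input applied at t = 0

for t in [0.0, 0.5, 1.0, 2.0, 3.0]:
    # First-order step response: the reading lags behind the true value,
    # and the gap between them is the dynamic error at that instant.
    reading = true_value * (1 - math.exp(-t / TAU))
    dynamic_error = true_value - reading
    print(f"t={t:.1f}s  reading={reading:5.2f}  dynamic error={dynamic_error:5.2f}")
```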
Errors in Instruments
Error = Measured value − True value (some texts use the opposite sign convention, Error = True value − Measured value).
Types of Errors
- Error of measurement
- Instrumental error
- Error of observation
- Based on nature of errors
- Based on control
Error of Measurement
- Systematic error: varies in a predictable way as conditions change.
- Random error: varies in an unpredictable manner.
- Parasitic error: caused by incorrect execution of the measurement.
Instrumental Error
- Error of a physical measure
- Error of a measuring mechanism
- Error of indication of a measuring instrument
- Error due to temperature
- Error due to friction
- Error due to inertia
Error of Observation
- Reading error
- Parallax error
- Interpolation error
Nature of Errors
- Systematic error
- Random error
Based on Control
- Controllable errors: calibration errors, environmental (ambient/atmospheric condition) errors, stylus pressure errors, avoidable errors
- Non-controllable errors
Correction
Correction is defined as a value which is added algebraically to the uncorrected result of the measurement to compensate for an assumed systematic error. Ex: vernier caliper, micrometer.
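A short numeric illustration of the error and correction conventions above, with hypothetical values (e.g., a caliper reading checked against a reference standard):

```python
true_value = 25.00       # mm, from a reference standard
measured_value = 25.08   # mm, uncorrected instrument reading

error = measured_value - true_value        # error of measurement
correction = true_value - measured_value   # added algebraically to the reading

corrected = measured_value + correction
print(f"Error: {error:+.2f} mm, correction: {correction:+.2f} mm")
print(f"Corrected result: {corrected:.2f} mm")
```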
Calibration
Calibration is the process of determining and adjusting an instrument's accuracy to make sure it is within the manufacturer's specifications.
Interchangeability
A part which can be substituted for a component manufactured to the same shape and dimensions is known as an interchangeable part. The operation of substituting such a part for similar manufactured components of the same shape and dimensions is known as interchangeability.