
Accuracy - Concepts

Accuracy of a measuring instrument: a qualitative indication of the ability of a measuring instrument to give responses close to the true value of the measurand (the parameter being measured). [VIM, 5.18] This accuracy is a design specification, and it is what is verified during calibration.

Accuracy of a measurement: a qualitative indication of how closely the result of a measurement agrees with the true value of the measurand. [VIM, 3.5] Because the true value is always unknown, the accuracy of a measurement is always an estimate. An accuracy statement by itself has no meaning other than as an indicator of quality; it has quantitative value only when accompanied by information about the uncertainty of the measuring system.

Definition of Accuracy
- ISO 5725-1: closeness of agreement between a test result and the accepted reference value

- ISO 3534-2: closeness of agreement between a test result or measurement result and the true value

- VIM 3: closeness of agreement between a measured quantity value and a true quantity value of a measurand

Terms related to Accuracy
- Precision
- Resolution
- Error
- Bias
- Tolerance
- Uncertainty

Precision
The degree to which an instrument will repeat the same measurement over a period of time under the same conditions.
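In practice, precision is often quantified as the standard deviation of repeat readings taken under the same conditions. A minimal sketch in Python, using made-up voltage readings purely for illustration:

    import statistics

    # Hypothetical repeat readings (volts) of the same quantity under identical conditions
    readings = [5.01, 5.02, 5.00, 5.03, 5.01, 5.02]

    mean_reading = statistics.mean(readings)
    precision = statistics.stdev(readings)  # sample standard deviation as a precision estimate

    print(f"Mean reading : {mean_reading:.3f} V")
    print(f"Precision (s): {precision:.3f} V")

The smaller the standard deviation, the more tightly the repeats cluster, i.e. the more precise the instrument.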

Resolution
The smallest change in a measured value that the instrument can detect. Resolution is also known as sensitivity. (Examples shown: one instrument with a resolution of 0.01 volt, another with a resolution of 1 volt.)
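As a rough illustration of how resolution limits what a reading can show, the sketch below quantizes a hypothetical "true" value to the two resolutions mentioned above (the value itself is made up):

    def quantize(value, resolution):
        # Round a value to the nearest step the display can resolve
        return round(value / resolution) * resolution

    true_value = 4.637  # volts, illustrative only
    print(f"0.01 V resolution: {quantize(true_value, 0.01):.2f} V")  # 4.64 V
    print(f"1 V resolution   : {quantize(true_value, 1.0):.0f} V")   # 5 V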

Error
The difference between a measured value and the accepted reference value in a single instance is called the error.
E = 5.0 - 4.6 = 0.4 volts

Bias
The difference between the mean of repeat measurement results and the accepted reference value or true value is called the bias of the measurement process.

Tolerance
The unwanted but acceptable deviation from a desired dimension. It is an estimate of the probable largest error in a measurement, and can be taken as half of the difference between the largest and smallest measurement.
T = 1/2 (largest measurement - smallest measurement)
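The three quantities above can be computed directly from repeat readings against a reference. A minimal sketch, using illustrative values consistent with the E = 5.0 - 4.6 example (the repeat readings themselves are made up):

    # Accepted reference value and illustrative repeat readings (volts)
    reference = 4.6
    readings = [5.0, 5.1, 4.9, 5.0, 5.2]

    error_first = readings[0] - reference              # error of a single reading: 5.0 - 4.6 = 0.4 V
    bias = sum(readings) / len(readings) - reference   # mean of repeats minus the reference value
    tolerance = 0.5 * (max(readings) - min(readings))  # T = 1/2 (largest - smallest)

    print(f"Error of first reading: {error_first:+.2f} V")
    print(f"Bias                  : {bias:+.2f} V")
    print(f"Tolerance estimate    : {tolerance:.2f} V")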

Uncertainty
Uncertainty of measurement [VIM, 3.9]: a parameter, associated with the result of a measurement, that characterizes the dispersion of the values that could reasonably be attributed to the measurand. Calibration determines the deviation of measured values from a standard with a known value, in order to assess and apply a systematic correction to those measurements. The uncertainty associated with every calibration result is also calculated. The calibration certificate compiles measurement results with their associated expanded uncertainties, regardless of the instrument specifications. (A minimal numeric sketch of an expanded uncertainty follows after this section.)

Uncertainty Sources
- Environment
- Reference Equipment
- Metrologist / Operator
- Measurement Procedure

Accuracy in measurements
Measurements with adequate accuracy improve the credibility of reported numerical results and their application in research and testing. But how do we ensure that they are adequate?
- By using equipment with stated accuracy
- By reproducing results with other standard methods
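As a rough sketch of how a Type A standard uncertainty and an expanded uncertainty (coverage factor k = 2, roughly 95 % coverage) can be computed from repeat readings. The values are illustrative, and a real uncertainty budget would also combine the other sources listed above:

    import math
    import statistics

    # Illustrative repeat readings at one calibration point
    readings = [99.98, 100.02, 100.00, 100.01, 99.99, 100.03]

    mean = statistics.mean(readings)
    s = statistics.stdev(readings)          # sample standard deviation of the repeats
    u_a = s / math.sqrt(len(readings))      # Type A standard uncertainty of the mean
    U = 2 * u_a                             # expanded uncertainty, coverage factor k = 2

    print(f"Result: {mean:.3f} ± {U:.3f} (k = 2)")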

Significance of Accuracy
By knowing the accuracy of a measurement, one is always aware of the possible deviation in the measurement results, which improves confidence in the results and in the performance of the equipment. Note: in short, accuracy is one of the contributing elements in uncertainty estimation. It is applicable to:
- Equipment
- Methods
Ideally a measuring device should be both accurate and precise, with all measurements close to and tightly clustered around the reference value.

How accuracy can be specified
- According to equipment literature / manuals
- According to the requirement
- As per reference / standard literature
- As per norms specified by standards (TAR / TUR)

Test Accuracy Ratio
The comparison between the accuracy of the Unit Under Test/Calibration (UUT/UUC) and the accuracy of the standard is known as the Test Accuracy Ratio (TAR). However, this ratio does not consider other potential sources of error in the calibration process.

Test Uncertainty Ratio
The comparison between the accuracy of the UUT and the estimated calibration uncertainty is known as the Test Uncertainty Ratio (TUR). Based on ANSI/NCSL Z540-1-1994, the collective uncertainty of the measurement standards shall not exceed 25% of the acceptable tolerance (e.g. manufacturer specifications). This 25% represents a TUR of 4:1.
TUR = Tolerance for the Measurand / Task-Specific Uncertainty
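A minimal sketch of this ratio and the 4:1 check, with illustrative numbers (the tolerance and uncertainty values below are assumptions, not taken from any particular instrument):

    def tur(tolerance, task_specific_uncertainty):
        # TUR = tolerance for the measurand / task-specific (calibration) uncertainty
        return tolerance / task_specific_uncertainty

    # Illustrative: UUT tolerance of ±1.0 unit, calibration uncertainty of ±0.2 unit
    ratio = tur(1.0, 0.2)
    verdict = "meets" if ratio >= 4 else "does not meet"
    print(f"TUR = {ratio:.0f}:1, which {verdict} the 4:1 guideline")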

Equipment Literature

Scenario
We want to extract plasma from blood using a centrifuge under the following conditions:
- Speed: 3500 rpm ± 10 rpm
- Temperature: 22 to 25 °C
- Time: 20 to 60 minutes
A centrifuge with the following specifications is available. Can it be used for this specific purpose?

Specifications from the manual
- Capacity: 1.6 L (4 x 400 mL)
- Control: Microprocessor
- Display: Digital
- Max. Speed: 15,200 rpm
- Speed Accuracy: 1% of reading
- Temperature: 25 °C ± 5 °C
- Timer: 9 hr., 99 min.
- Noise Level: <61 dBA

It cannot be used, because the speed accuracy is 1% of reading: 1% of 3500 rpm is 35 rpm, which is greater than the required ±10 rpm.
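The speed check above is simple arithmetic; a small sketch of the same comparison (the names and numbers mirror the scenario, nothing beyond it is assumed):

    # Requirement: 3500 rpm ± 10 rpm; specification: speed accuracy 1% of reading
    set_speed_rpm = 3500
    required_tolerance_rpm = 10
    accuracy_fraction = 0.01

    worst_case_deviation_rpm = accuracy_fraction * set_speed_rpm   # 35 rpm
    suitable = worst_case_deviation_rpm <= required_tolerance_rpm

    print(f"Worst-case speed deviation: ±{worst_case_deviation_rpm:.0f} rpm")
    print("Suitable" if suitable else "Not suitable: exceeds the ±10 rpm requirement")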

Then another scenario.

Scenario
We want to measure the temperature of a water bath. The work procedure requires that the incubation be done in the water bath at 37 °C ± 1 °C. The maximum set temperature for the bath is 100 °C.

Two thermometers are available; which one should be used?

Thermometer 1
- Temperature Range: 0 °C to 100 °C
- Resolution: 0.1 °C
- Accuracy: ±0.1 °C

Thermometer 2
- Temperature Range: 0 °C to 100 °C
- Resolution: 0.02 °C
- Accuracy: ±0.05 °C

Both Thermometer 1 and Thermometer 2 satisfy the requirement of 37 °C ± 1 °C (a quick check is sketched after the list below).

Expressing Accuracy
- As a tolerance limit
- As a % of nominal capacity
- As a % of measurement reading
- As a % of operating range
- As a combination of any of the above
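A minimal sketch of the quick check referenced above, comparing each thermometer's accuracy to the ±1 °C requirement using a simple ratio in the spirit of the 4:1 guideline (applying that threshold here is an assumption carried over from the TUR discussion):

    # Requirement: maintain 37 °C within ±1 °C
    required_tolerance_c = 1.0
    thermometer_accuracy_c = {"Thermometer 1": 0.1, "Thermometer 2": 0.05}

    for name, accuracy in thermometer_accuracy_c.items():
        ratio = required_tolerance_c / accuracy     # tolerance vs. thermometer accuracy
        ok = ratio >= 4                             # 4:1 guideline from the TUR section
        print(f"{name}: ±{accuracy} °C, ratio {ratio:.0f}:1 -> {'OK' if ok else 'insufficient'}")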
