Metquay Inc Consulting Team

Confused About Precision, Accuracy, And Resolution In Metrology? Here's How To Tell Them Apart

Updated: Mar 16, 2023

Introduction


Precision, accuracy, and resolution are three different concepts in calibration management that are often confused. The difference between them becomes clear once you understand what each term means.


Precision


In metrology, precision refers to how well your results agree with each other when a measurement is repeated, whether over time or by different people at different times (inter-observer variability). Precision is the closeness of agreement among multiple measurements. It's a measure of how reproducible your measurements are and can be broken down as:

  • Repeatability + Reproducibility = Precision

The precision of a measurement system is an indication of its ability to make repeatable measurements. It does not matter whether the results agree with some external standard; what matters is whether they agree with each other consistently. Repeatability describes the spread of results when the same person repeats the measurement with the same equipment under the same conditions (intra-observer variability), while reproducibility describes the spread when the operator, equipment, or conditions change slightly.


For example, if you weigh yourself on a scale and get one reading of 150 pounds on Monday morning and another reading of 152 pounds on Tuesday afternoon, this would be an example of poor precision, because there is not much agreement between your two readings.


To evaluate precision, we calculate the standard deviation of the repeated measurements.
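
Here is a minimal sketch of that calculation in Python; the readings are illustrative values, not real calibration data:

```python
import statistics

# Five repeated readings of the same weight (illustrative values, in pounds)
readings = [150.1, 150.3, 149.9, 150.2, 150.0]

mean = statistics.mean(readings)
std_dev = statistics.stdev(readings)  # sample standard deviation

print(f"Mean: {mean:.2f} lb")
print(f"Standard deviation (precision): {std_dev:.2f} lb")
```

The smaller the standard deviation, the more closely the repeated readings agree, and the more precise the measurement system is.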


Accuracy


Accuracy is the closeness of agreement between a measurement and the true value of the quantity being measured. In calibration management, accuracy is the first step in determining whether a measurement is good or bad, as it indicates how close a measurement is to its actual value. If you're measuring something with an inaccurate tool, your results will be inaccurate too.

Accuracy can be defined as "the degree to which the result of an observation corresponds with its true value". Accuracy depends on several factors, including:

  • The precision with which you take your measurements (i.e., how consistently they're repeated)

  • The accuracy of your measuring instrument (i.e., how well-calibrated it is).

To evaluate accuracy, the measured value must be compared to the accepted (true) value. We then calculate the percentage error:


% error = (|measured value - accepted value| / accepted value) * 100
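
As a quick sketch, here's that formula in Python; the measured and accepted values below are made-up examples:

```python
def percent_error(measured_value: float, accepted_value: float) -> float:
    """Percentage error of a measurement relative to the accepted (true) value."""
    return abs(measured_value - accepted_value) / accepted_value * 100

# Example: a reference value of 25.000 mm measured as 24.985 mm (illustrative values)
print(f"{percent_error(24.985, 25.000):.2f}%")  # -> 0.06%
```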


Resolution


Resolution is the smallest change in a quantity that can be detected by an instrument. It is closely tied to the instrument's least count, the smallest increment marked on its scale or shown on its display.


In any measurement, all digits except the last are called certain digits. The last digit is an estimate and can vary; this is called the uncertain digit. When making measurements, you should record all certain digits plus the first uncertain digit. Generally, the resolution of a device indicates the decimal place in which your uncertain digit goes.


The easiest way to determine the uncertain place is to identify the smallest increment on the device and divide it by two.


Let's take the example of measuring with a metric ruler:




The numbered markings on the ruler indicate centimeters, i.e., 1 cm, 2 cm, 3 cm, etc. There are ten divisions between each centimeter marking, which means the smallest increment is 1/10 of a centimeter, i.e., 0.1 cm, so the resolution is 0.1 / 2 = 0.05 cm.


Now let's look at our measurement. When I look at the ruler, I observe the reading as 3.4 cm, but since our resolution is 0.05 cm, we record the measurement as 3.40 cm (we put an uncertain digit in the hundredths place), which means our actual measurement could be 3.40 cm, 3.39 cm, or 3.41 cm.
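
Here is a small Python sketch of the same reasoning; the increment and reading are the ruler example above, and the way the decimal place is derived from the resolution is one simple convention, not the only one:

```python
import math

def reading_resolution(smallest_increment: float) -> float:
    """Resolution taken as half of the smallest increment on the device."""
    return smallest_increment / 2

# Metric ruler: ten divisions per centimeter -> smallest increment of 0.1 cm
increment = 0.1
resolution = reading_resolution(increment)  # 0.05 cm
observed = 3.4                              # reading taken from the ruler, in cm

# The uncertain digit sits in the decimal place implied by the resolution
decimals = max(0, -math.floor(math.log10(resolution)))  # 2 -> hundredths place

print(f"Resolution: {resolution} cm")
print(f"Recorded measurement: {observed:.{decimals}f} cm")  # -> 3.40 cm
```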


Conclusion


Precision is the closeness of agreement among multiple measurements. Accuracy is the closeness of agreement between a measurement and the true value of a quantity being measured. Resolution is the smallest change in a quantity that can be detected by an instrument, i.e., how finely one can measure something.
