By Dr Scott Rudge
Editor's Note: Interested reader Ray Nims points out that I forgot to square the terms in my propagation of error calculation below. I apologize for the misinformation. The corrected blog follows:
In calibration, there is a lot of focus on using the right standard. Standards must be NIST traceable, have current calibration certifications, and have been cared for appropriately. See for example ANSI Z540.1 or ISO 17025. Less attention is paid to the appropriate accuracy of the standard. In this short blog, we will discuss the basis for instrument range, instrument tolerance and standard tolerance.
The tolerance of a process for variation in a certain
parameter should be set in process development.
Ideally, this is done as part of process characterization, where the
effect of parameter variation is measured.
Some examples of operating ranges set in process development are
temperature ± 2°C, pH ± 0.2 units and conductivity ± 10 mS/cm. Ranges this tight put some pressure on
calibration to be especially accurate. After
all, if your instrument is reporting a measurement right at the limit of the
acceptable range, it’s probably very important that the instrument not be
inaccurate by very much, if at all.
The National Conference of Standards Laboratories (now known
as NCSL International) recommends that instruments be calibrated with an uncertainty
of no more than 25% of the acceptable control range. This means that the tolerance (or
uncertainty) for the instruments measuring temperature, pH and conductivity
cited above would be ± 0.5°C, ± 0.05 and ± 2.5 mS/cm, respectively. These are tight tolerances, but are needed to
ensure that the process is really within ± 125% of the target range.
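As a quick sketch of that arithmetic, here is a minimal Python example using the operating ranges above (the variable names are only illustrative):

```python
# NCSL guideline: instrument calibration tolerance <= 25% of the operating range
operating_ranges = {"temperature (°C)": 2.0, "pH (units)": 0.2, "conductivity (mS/cm)": 10.0}

for parameter, half_range in operating_ranges.items():
    instrument_tolerance = 0.25 * half_range  # 25% of the acceptable control range
    print(f"{parameter}: operating range ±{half_range}, instrument tolerance ±{instrument_tolerance}")
```

Running this reproduces the tolerances cited above: ±0.5°C, ±0.05 pH units and ±2.5 mS/cm.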
But wait, there’s more!
You can only be sure that the measuring instrument is within tolerance
if you know the uncertainty of the standard used to calibrate it. Even standards are not necessarily 100.00%
accurate, are they? Standards used for
calibration also have a known uncertainty, and again NCSL International
recommends at least a 4:1 ratio of standard to instrument uncertainty. So the standard used for the instruments
above should have a tolerance of no more than ± 0.125°C, ± 0.0125 and ± 0.625
mS/cm, or 6.25% of the operating range. This uncertainty also carries through
to the uncertainty of the measurement.
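Continuing the same illustrative Python sketch, with the 4:1 ratio applied on top of the 25% rule:

```python
# 4:1 test accuracy ratio: standard uncertainty <= 1/4 of the instrument tolerance,
# which works out to 6.25% of the operating range
operating_ranges = {"temperature (°C)": 2.0, "pH (units)": 0.2, "conductivity (mS/cm)": 10.0}

for parameter, half_range in operating_ranges.items():
    instrument_tolerance = 0.25 * half_range        # 25% of the operating range
    standard_tolerance = instrument_tolerance / 4   # 4:1 ratio
    print(f"{parameter}: standard tolerance ±{standard_tolerance}")
```

This gives ±0.125°C, ±0.0125 pH units and ±0.625 mS/cm, as stated above.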
Taken absolutely, the widest possible range is ± 131% of the
actual or target range. However, it
should not be assumed that uncertainty randomly falls to the extremes of the allowable
ranges. It is more common to perform a
propagation of errors calculation. Here,
the squares of the errors are added, and the square root of the sum is calculated,
as shown below:
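Using the instrument tolerance (25% of the range) and the standard tolerance (6.25% of the range) derived above, the combined uncertainty is √(25² + 6.25²) = √(625 + 39.1) ≈ 25.8% of the range, for a total of about 100% + 25.8% ≈ 126%.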
This gives a more likely range of ±126%.
There is a bottom up approach sometimes taken in determining
calibration tolerances. In this
approach, the capability of the instrument is used to determine the calibration
tolerance. In other words, if the thermometer
is claimed to have an accuracy of ± 0.1°C by the manufacturer, then it should
be calibrated with a temperature standard with an uncertainty of no more than ±
0.025°C. This approach has proven
increasingly difficult, as modern technology has increased the capabilities of
field measurement instruments. However,
the approach is valid, and some leeway exists (up to a tolerance ratio of
1:1).
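The same ratio logic can be written as a small helper. The following Python function is purely illustrative (it is not part of any standard); it reports the test accuracy ratio for a proposed standard and flags whether it meets the recommended 4:1 ratio or only the 1:1 leeway:

```python
def check_ratio(instrument_accuracy: float, standard_uncertainty: float) -> str:
    """Report the test accuracy ratio for a proposed calibration standard."""
    ratio = instrument_accuracy / standard_uncertainty
    if ratio >= 4:
        return f"{ratio:.1f}:1 - meets the recommended 4:1 ratio"
    if ratio >= 1:
        return f"{ratio:.1f}:1 - below 4:1, acceptable only with justification"
    return f"{ratio:.1f}:1 - standard is less accurate than the instrument"

# Thermometer example from the text: manufacturer-claimed accuracy of ±0.1°C
print(check_ratio(0.1, 0.025))  # 4.0:1 - meets the recommended 4:1 ratio
print(check_ratio(0.1, 0.1))    # 1.0:1 - below 4:1, acceptable only with justification
```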
Of course, higher tolerance ratios are permitted. But these general guidelines should help you
design your calibration program so that you know you are making quality
measurements in your process.