
How do you determine the accuracy of a measuring device?

Feb. 04, 2024


Your approach is broadly correct.

If you are only interested in the accuracy of your system, you probably want to use something like the maximum error. Your accuracy is then ±(maximum error), with the assumption that the real errors are uniformly distributed within this range (a uniform distribution will often be an overestimate, but it is a simple option when no better information is available).
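A minimal sketch of that calculation in Python, assuming you have paired readings of a reference (true) value and the instrument's measured value; the array contents below are purely illustrative.

```python
import numpy as np

# Illustrative paired readings: reference (true) values and the
# instrument's measured values for the same points.
true_values = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
measured = np.array([10.3, 20.1, 29.6, 40.4, 50.2])

errors = measured - true_values
max_error = np.max(np.abs(errors))

# Quote accuracy as +/- max error, treating the errors as uniformly
# distributed within that band when no better information is available.
print(f"Accuracy: +/- {max_error:.2f} (same units as the measurement)")
```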

However, this approach will often produce large errors due to systematic effects, which can easily be corrected by fitting a curve (normally linear) through a plot of measured versus true values.

This corrects for the bias in your instrument, and you can then calculate the uncertainty from the standard deviation of the residuals. The total (expanded) uncertainty is normally quoted as a multiple of $\sigma$; the choice of multiple is fairly arbitrary, so you should state the k value (coverage factor) and the associated coverage probability. You should also state what distribution you are assuming, as this affects which multiple gives a specific coverage: for a Gaussian, 95 % coverage corresponds to k ≈ 2, while for a uniform distribution it corresponds to k ≈ 1.65.
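A minimal sketch of that two-step procedure (fit out the systematic bias, then quote an expanded uncertainty from the residuals), assuming the same kind of paired calibration data as above; the data, the linear model, and the choice of k = 2 (roughly 95 % coverage for a Gaussian) are illustrative.

```python
import numpy as np

# Illustrative calibration data: reference (true) values and instrument readings.
true_values = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
measured = np.array([10.4, 20.5, 30.9, 41.2, 51.3])

# Fit a straight line measured = a * true + b to model the systematic bias.
a, b = np.polyfit(true_values, measured, deg=1)

# Invert the fit to get bias-corrected estimates of the true values.
corrected = (measured - b) / a

# Residuals after the correction characterise the remaining random scatter.
residuals = corrected - true_values
sigma = np.std(residuals, ddof=2)  # ddof=2: two parameters were fitted

# Expanded uncertainty U = k * sigma; k ~ 2 gives ~95 % coverage for a Gaussian.
k = 2.0
print(f"Fit: measured = {a:.4f} * true + {b:.4f}")
print(f"Standard uncertainty: {sigma:.3f}, expanded (k = {k}): {k * sigma:.3f}")
```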

In addition to accuracy, precision is essential: it ensures each measurement is consistent from one reading to the next, so there are no inconsistencies in performance or in mixture results. Precision is characterised by the standard deviation, which describes how much repeated measurements differ from one another. A high standard deviation indicates low precision; a low standard deviation indicates high precision.
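As an illustration, assuming a set of repeated readings of the same reference quantity (the values below are made up), the sample standard deviation quantifies the precision:

```python
import numpy as np

# Illustrative repeated readings of the same reference quantity.
repeated = np.array([25.1, 25.3, 24.9, 25.2, 25.0, 25.1])

# Sample standard deviation (ddof=1): spread of the repeated readings.
precision = np.std(repeated, ddof=1)
print(f"Standard deviation of repeats: {precision:.3f} (lower = more precise)")
```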

