
What is the required accuracy of measuring equipment?

Feb. 04, 2024

There are several terms often used when referring to characteristics of measuring instruments: resolution, accuracy, and precision. These terms describe how sensitive instruments are to measured quantities, how bias affects the measurements, and how repeatable an instrument is when measuring.

Resolution

Resolution is the degree to which an instrument can sense variation in the quantity being measured. It is the smallest incremental change in the measured quantity that produces a detectable change in the instrument’s output, over any specified portion of its measuring range. Figure 7.4.1 shows that there is a discrete change in output only for a large enough change in the input over the measuring range.

Accuracy

Accuracy is the condition or quality of conforming exactly to a standard. The accuracy of an instrument is the extent to which the average of many measurements made by the instrument agrees with the true value or standard being measured. The difference between the average and the true value is the error, or inaccuracy. A lack of accuracy is sometimes referred to as bias. When this bias is caused by the measuring instrument itself, the instrument is said to be out of calibration.

A measuring instrument’s accuracy must be considered over the whole range of the measuring instrument. This is often expressed as linearity. Linearity is the maximum deviation of the actual measurements from a defined theoretical straight-line characteristic. It is expressed as a percentage: the deviation between the theoretical output and the measured output over the total theoretical output. The ratio can be expressed as follows:

linearity ratio (%) = ((θ − E) / θT) × 100

where: θ is the theoretical output

E is the measured output

θT is the total theoretical output

Often the linearity of an instrument is expressed in terms of nonlinearity (1 − linearity ratio). Nonlinearity can likewise be expressed as a percentage of the deviation between the theoretical and measured outputs over the total theoretical output.

Figure 7.4.2 illustrates graphically the concept of linearity.
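As a rough illustration of the ratio above, the short Python sketch below (hypothetical values, not taken from the figures) evaluates the deviation between theoretical and measured outputs as a percentage of the total theoretical output and keeps the worst case over the range.

```python
# Minimal sketch of the linearity ratio described above (hypothetical data).
# theta   : theoretical output at a point in the range
# e       : measured output at the same point
# theta_t : total theoretical output over the whole range

def deviation_percent(theta, e, theta_t):
    return abs(theta - e) / theta_t * 100.0

theoretical = [0.0, 2.5, 5.0, 7.5, 10.0]   # hypothetical theoretical outputs
measured    = [0.0, 2.6, 5.1, 7.4, 10.0]   # hypothetical measured outputs
theta_total = 10.0                         # total theoretical output

# Linearity is defined by the maximum deviation from the theoretical line,
# so evaluate the ratio at each point and keep the worst case.
worst = max(deviation_percent(t, m, theta_total)
            for t, m in zip(theoretical, measured))
print(f"maximum deviation: {worst:.1f}% of total theoretical output")
```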

Precision

Precision (also known as repeatability) is the variation in readings obtained when repeating exactly the same measurement. The precision of an instrument is the ability to repeat a series of measurements on the same piece and obtain the same results for each measured value. The variation in measured values can be expressed in terms of a standard deviation of the measuring error. The smaller the standard deviation, the more precise the instrument.

Accuracy Versus Precision 

Confusion often exists between the terms accuracy and precision because they are frequently used interchangeably, yet they are two different concepts. The accuracy of an instrument can be improved by recalibrating to reduce its error, but recalibration generally does not improve the instrument’s precision. The difference between the two terms is clarified further in the following examples.

Figure 7.4.3 represents a set of 28 measurements made with the same instrument on the same part, and shows good accuracy with little precision. The accuracy is represented by the small difference (error) between the true value of 0.110 and the average of the measurements of 0.111, which is 0.001. The precision in this case is poor because of the wide distribution of measurements (ranging from 0.107 to 0.115), as shown by the bar graph (each box represents a measurement). This variation can be expressed in terms of a large standard deviation of the measurement error.

Figure 7.4.4 shows 28 measurements taken with a different instrument on the same part, as in Figure 7.4.3. It shows that there is precision, or good repeatability, but that the accuracy is poor. The precision can be seen in the diagram by noting that the distribution of the measurements (ranging from 0.114 to 0.116) is closely grouped around the average (0.115) of the measurements. The standard deviation of the measurements is small in this case. The large error between the true value (0.110) and the average (0.115) of the measurements is 0.005 and represents poor accuracy.

Figure 7.4.5 shows 28 measurements taken with a different instrument on the same part, as in the two previous examples. It shows that the precision, or repeatability, is good, as is the accuracy. Figure 7.4.5 shows that the true value (0.110) and the average value of the measurements (0.110) are the same, indicating that the accuracy is very good. It can also be noted that the variation of the measurements is quite small (ranging from 0.109 to 0.111), which indicates precision or good repeatability.
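The quantities behind these three figures reduce to two numbers: the error (bias) between the average and the true value, and the standard deviation of the readings. The Python sketch below uses made-up readings, not the actual 28 measurements, to show how both would be computed.

```python
import statistics

# Hypothetical repeated measurements of the same part, in inches
# (not the data behind Figures 7.4.3 - 7.4.5).
readings   = [0.109, 0.111, 0.110, 0.112, 0.108, 0.110, 0.111, 0.109]
true_value = 0.110

average = statistics.mean(readings)
bias    = average - true_value        # accuracy: error between average and true value
spread  = statistics.stdev(readings)  # precision: standard deviation of the readings

print(f"average = {average:.4f}  error (bias) = {bias:+.4f}  std dev = {spread:.4f}")
```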

Stability

Stability refers to the difference in the average of at least two sets of measurements obtained with the same measuring device on the same parts taken at different times. See Figure 7.4.6.

PRECISION METROLOGY

 

 

 

Introduction

 

Metrology is the scientific study of measurement. One cannot embark on the pursuit of precision manufacturing without an equally passionate journey into the challenges (and perils!) of precision metrology. This document is intended to provide a brief introduction to and overview of this complex subject.

 

 

 

Precision Metrology is Hard Work!

 

The sooner you accept the wise words of Israelle Widjaja, that “properly measuring things is hard,” the sooner you’ll begin to understand how to make accurate and precise measurements.

 

 

 

Rule of Ten

 

The Rule of Ten (or Rule of One to Ten) states that the discrimination (resolution) of the measuring instrument should divide the tolerance of the characteristic to be measured into ten parts. In other words, the gage or measuring instrument should be at least 10 times as accurate as the characteristic to be measured. Many believe that this applies only to the instruments used to calibrate a gage or measuring instrument, when in reality it applies to the choice of instrument for any precision measuring activity. The whole idea here is to choose an instrument that is capable of detecting the amount of variation present in a given characteristic (i.e., a part feature).

 

To achieve reliable measurements, the instrument needs to be accurate enough to accept all good parts and reject all bad ones; in other words, the gage should neither reject good parts nor accept bad ones. The real problem arises when an instrument that is only accurate enough to measure in thousandths is used to accept parts, and the customer then uses gages that discriminate to ten-thousandths and rejects the parts for being 0.0008″ over or under the specification limit.

 

Practically speaking, this means that reliably measuring a part feature toleranced at ±0.0005″ requires a measuring tool with a resolution and an accuracy of 0.0001″.
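The arithmetic behind that statement is simple enough to write down. The sketch below is a hypothetical Python helper, not an established formula from any standard, that turns a symmetric tolerance into the coarsest resolution the Rule of Ten would allow.

```python
# Rule of Ten sketch: the instrument's resolution should divide the total
# tolerance band into ten parts.  Hypothetical helper; values in inches.
def max_allowed_resolution(plus_minus_tolerance, ratio=10):
    total_band = 2 * plus_minus_tolerance   # e.g. +/-0.0005" -> 0.0010" band
    return total_band / ratio

print(f"{max_allowed_resolution(0.0005):.4f} in")   # 0.0001" resolution needed
```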

 

 

 

Accuracy, Precision, and Reproducibility

 

Accuracy refers to how close a measurement is to a true (actual) value or a value accepted as being true.

 

Precision is a measure of the spread of repeated readings (i.e., repeatability), and is independent of accuracy.

 

Reproducibility is the degree to which a measurement can be reproduced or replicated by someone else working independently.

 

 

Got Calibration?

 

A measuring instrument is useless if not calibrated regularly against a reliably calibrated gage.

 

 

 

Constant Force

 

A measuring instrument that offers no constant-contact-force method of measurement can never have the same level of repeatability or reproducibility as one that does. In addition, a measuring instrument that does provide constant contact force only works properly if its clutch or ratchet is rotated at a consistent velocity, so technique still matters.

 

 

 

NTP

 

Proper measurements should always be conducted as close to NTP (normal temperature and pressure) as possible: 68 °F and 1 atm (14.7 psia).

 

 

Be Careful!


 

Whenever possible, measure in an environment that will not damage the part or measuring instrument if either is dropped.

 

Never touch precision-ground surfaces (i.e., gage blocks, gage pins, calibration rings, precision measuring surfaces, etc.) with your bare hands, as doing so will cause them to corrode rapidly, ruining their accuracy. Always wear gloves, remove any anti-corrosion protectant with WD-40 and a new blue shop towel, and reapply anti-corrosion protectant (LPS) after use.

 

 

Never force any measuring instrument. If a caliper or micrometer won’t move freely, investigate why; most have a locking screw or cam, so check that it is not tightened before you damage the instrument.

 

 

Cleanliness is Key

 

Clean the contact jaws or tips with alcohol and a piece of tissue paper or a blue shop towel before use.

 

 

Got Zero?

 

Always remember to double-check the zero of the measuring instrument before use. This seems fundamental, but it’s surprisingly easy to overlook when paying attention to so many other things. This means you will need calibration gages or standards for instruments that are not self-zeroing (like a 0–1″ micrometer).

 

  

 

Thermal Growth

 

Understand that metals have a typical coefficient of linear expansion of about 0.000010 in/(in·°F); therefore, holding on to a measuring instrument and/or a part long enough to warm it roughly 30 °F toward body temperature will cause a 4″ nominal part to change length by about 0.0012″ from temperature change alone (0.000010 in/(in·°F) × 4 in × 30 °F ≈ 0.0012 in)!
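That estimate is just the linear-expansion relation ΔL = α × L × ΔT. The Python sketch below repeats the arithmetic with the same assumed values.

```python
# Thermal growth sketch: delta_L = alpha * length * delta_T
alpha   = 0.000010   # typical coefficient of linear expansion, in/(in*degF)
length  = 4.0        # nominal part length, inches
delta_t = 30.0       # assumed warm-up from room temperature toward body temperature, degF

delta_l = alpha * length * delta_t
print(f"change in length: {delta_l:.4f} in")   # ~0.0012 in
```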

 

For this reason you should always (well, whenever practically possible) use an indicator stand to hold a precision measuring instrument and protect it from thermal growth caused by body heat. In addition, you should always allow adequate time for the part(s) being measured to reach NTP.

 

 

Multiple Measurements

 

Always take at least three measurements to be “carelessly certain” of the ballpark value. The spread between these measurements should be consistent with the repeatability you are seeking from your measurements.
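One way to put a number on that spread is sketched below; the readings and the repeatability target are hypothetical.

```python
# Sketch: take several readings and compare their spread with the
# repeatability you are after (hypothetical values, inches).
readings = [0.2503, 0.2504, 0.2503]
target_repeatability = 0.0005

spread = max(readings) - min(readings)
print(f"spread = {spread:.4f} in -> {'OK' if spread <= target_repeatability else 'remeasure'}")
```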

 

 

Gage Blocks and Gage Pins

 

Become proficient with gage blocks and gage pins, as these are typically manufactured to ±0.000100″ or ±0.000050″ (depending on their grade rating), and are good for moderate precision calibrations.

 

When using them, always wear gloves, work over a safe surface in case you accidentally drop one (never over the open box!), and coat them with rust inhibitor (LPS) when finished.

 

 

 

LEFT: Instructions on how to use gage blocks (video). RIGHT: Use and care of gage blocks (link).

 

 

LEFT: Applications of gage pins (video). RIGHT: Example of a gage pin set.

 

 

LEFT and RIGHT: Using gage blocks to calibrate a micrometer and bore gage.

 

 

Abbé and Parallax Errors

 

Research Abbé error and parallax error to understand why calipers are not regarded very highly in metrology circles.

 

The Abbé principle states: “Only when the datum line of the measuring system and that of the measured workpiece lie on the same line is a measurement most accurate.” As the drawing shows, when there is a distance (h) between the measuring faces and the reading axis, there will be a measuring error (ε = b − a = h tan θ). Therefore, measuring force and tool distortion must be taken into account during such a measurement. Think about what happens when the jaws of a dial caliper are zeroed by bringing their flat surfaces into contact, and a measurement is then made without the jaws in flat contact against the artifact.
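To get a feel for the magnitudes involved, the error relation ε = h tan θ can be evaluated directly. The Python sketch below uses hypothetical values for the offset and the angular error, not figures taken from the drawing.

```python
import math

# Abbe error sketch: epsilon = h * tan(theta)
h_offset = 1.5      # hypothetical offset between measuring axis and scale, inches
theta_deg = 0.05    # hypothetical angular error of the jaws, degrees

abbe_error = h_offset * math.tan(math.radians(theta_deg))
print(f"Abbe error: {abbe_error:.5f} in")   # ~0.0013 in
```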

 

 

 

LEFT: Proper method of calibration using a length standard; RIGHT: Additional Abbé error introduced because of location of applied measurement force.

 

 

Parallax error is a perceived shift in an object’s position as it is viewed from different angles, and it is inherent in virtually every analog measurement. 

 

 

Parallax error when reading a linear scale, as on a caliper (left) and when reading a vernier dial, as on a micrometer (right).

 

 

Indicators

 

Since I already have a document on indicators, I will simply include the link here.

 
