Accuracy of Measurement Equipment

Automotive inspection, TS 16949, IATF 16949

Q: I work in an incoming quality assurance department. In a recent audit, the auditor claimed that high-precision machines, such as coordinate measuring machines (CMMs) and touchless measurement systems, should have higher Gage Repeatability and Reproducibility (GR&R) values than less precise equipment such as hand-held calipers and gages. If that is the case, does Measurement System Analysis (MSA) address this by recommending values for each type of measuring equipment in general? If not, should we stick to the general MSA rules regardless of the equipment’s precision?

A: When you noted “higher GR&R values,” that in itself can be a bit confusing, because the GR&R value is the percentage of observed variation caused by repeatability and reproducibility error. The higher the number, the more measurement variation is present and the worse the measurement method is.
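To put that in plain terms (using “sigma” here simply as shorthand for an estimated standard deviation), the figure usually reported against the process spread is:

    %GR&R = 100 x (sigma of repeatability and reproducibility) / (sigma of total variation)

so a larger percentage means that a larger share of what you observe is measurement error rather than real part-to-part difference.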

As far as I know, MSA doesn’t give specific guidance on recommended values for particular types of measuring equipment. I am also not sure it is valid to say that a CMM is consistently more accurate than other equipment, such as calipers. Although the instrument may theoretically be more accurate, how you stage the part to be measured will also affect the amount of variability, as will the feature being measured. Consequently, even with the theoretically more accurate CMM, you may see 20 percent GR&R, mainly due to the holding fixture or the feature being measured. I’m sure you get the point here.

The MSA manual does discuss the major inputs to consider when deciding how much variation is acceptable. It strongly recommends looking at each application individually to verify what is required and how the measurement is going to be used.

Another thing to consider is whether you are calculating the GR&R against total variation or against the specified tolerance. The tolerance-based approach is more commonly used than the total-variation approach, but that may depend on the type of industry.
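For reference, the tolerance-based version simply changes the denominator. Using the 6-sigma spread in the current MSA manual (older editions used 5.15), it works out to roughly:

    %GR&R (to tolerance) = 100 x (6 x sigma of repeatability and reproducibility) / (USL - LSL)

Whichever denominator you report should match how the gage is actually used: accepting parts against a tolerance, or monitoring process variation.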

One thing I would like to mention: if you have three people take 10 measurements each and then feed the data into one of the common software programs, it will not matter whether those 10 measurements were taken with a dial caliper or with a CMM. The instrument’s “accuracy” should not be the deciding factor; the tolerance base should be.
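If it helps to see what those software programs are doing, below is a minimal sketch in Python of the AIAG average-and-range calculation, assuming a typical 10-part, three-appraiser, three-trial layout. The constants K1, K2 and K3 are the published values for that layout only; the function name gage_rr, the simulated readings, and the 0.5 mm tolerance are made up purely for illustration.

    import numpy as np

    def gage_rr(data, tolerance=None):
        # data: readings with shape (appraisers, parts, trials)
        k, n, r = data.shape

        # Repeatability (EV): average of the within-cell ranges, scaled by K1
        cell_ranges = data.max(axis=2) - data.min(axis=2)
        K1 = 0.5908                              # AIAG constant for 3 trials
        EV = cell_ranges.mean() * K1

        # Reproducibility (AV): based on the spread of the appraiser averages
        x_diff = np.ptp(data.mean(axis=(1, 2)))  # range of appraiser means
        K2 = 0.5231                              # AIAG constant for 3 appraisers
        AV = np.sqrt(max((x_diff * K2) ** 2 - EV ** 2 / (n * r), 0.0))

        GRR = np.sqrt(EV ** 2 + AV ** 2)

        # Part variation (PV) and total variation (TV)
        Rp = np.ptp(data.mean(axis=(0, 2)))      # range of part means
        K3 = 0.3146                              # AIAG constant for 10 parts
        PV = Rp * K3
        TV = np.sqrt(GRR ** 2 + PV ** 2)

        pct_of_total = 100 * GRR / TV
        pct_of_tolerance = 100 * GRR / (tolerance / 6) if tolerance else None
        return pct_of_total, pct_of_tolerance

    # Simulated study: 3 appraisers x 10 parts x 3 trials, 0.5 mm tolerance
    rng = np.random.default_rng(0)
    parts = rng.normal(10.0, 0.05, size=10)
    readings = parts[None, :, None] + rng.normal(0, 0.01, size=(3, 10, 3))
    print(gage_rr(readings, tolerance=0.5))

The point of the sketch is simply that the arithmetic is identical whether the readings came from a caliper or a CMM; only the numbers, and therefore the resulting percentages, change.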

Also, ISO standards do not dictate GR&R values. If you do what your quality management system says you do, most auditors will not push such an issue. While some auditors may offer “opinions” and suggestions, such items are rarely cause for nonconformance findings.

I hope this helps answer your question.

Bud Salsbury
ASQ Senior Member, CQT, CQI

For more on this topic, please visit ASQ’s website.