
Measurement Accuracy: What You Need to Know

By Michael C. Anderson, Contributing Lead Editor

A look at the evolving meaning of measurement accuracy, and how published accuracy standards can help you evaluate metrology devices—but only to a point.

The video-based Starrett AV350 CNC vision system features 12:1 zoom optics.

It’s an old challenge: You’re a manufacturer whose customer needs you to assure that the part you’ve contracted to make for them will be held to specified tolerances. So, what’s the best method for making sure the part is within spec? The question is not only how to measure the part—a CMM touch probe, or a laser scanner?—but also how to evaluate measurement quality. To find the most accurate way to measure your part, you need to understand the uncertainty inherent in any given measurement system.

Fortunately, there are national and international organizations that develop standards to help codify measurement uncertainty levels. The American Society of Mechanical Engineers (ASME), New York, and the International Organization for Standardization (ISO), Geneva, Switzerland, for example, offer standards to communicate accuracy levels for various metrology methods. They are an important resource—but, as we’ll see, they alone can’t tell you what method is best for a particular task.

So, what’s a manufacturer to do?

Accuracy and Print Tolerance

If you’re a novice, first make sure you understand some basic terms. For example, Tim Cucchi, product manager, precision hand tools at the L.S. Starrett Co., Athol, Mass., warned not to confuse accuracy with resolution.

“In industrial instrumentation, accuracy is the measurement tolerance of the instrument. It defines the limits of the errors made when the instrument is used in normal operating conditions. Resolution is simply how fine the measuring instrument is set to read out—whether to tenths, hundredths, thousandths or whatever.”

The distinction matters. You’d trust a hardware store yardstick to measure and cut a fence post, but not to check a precision aerospace or medical component—and that would hold even if the yardstick had hash marks spaced 1 µm apart. Its resolution wouldn’t reflect its accuracy.
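Cucchi’s distinction can be made concrete with a toy simulation. In this minimal sketch (all numbers hypothetical, not from Starrett), an instrument reads out to three decimal places, yet every reading is about 2 mm wrong:

```python
import random

# Hypothetical yardstick: displays readings to 0.001 mm (fine resolution)
# but carries a ~2 mm systematic error (poor accuracy).

TRUE_LENGTH_MM = 914.400  # the part's actual length (assumed for illustration)

def biased_yardstick(true_mm: float) -> float:
    """Return a reading with micron-level resolution but a 2 mm bias."""
    bias_mm = 2.0                    # systematic error the display never reveals
    noise_mm = random.gauss(0, 0.5)  # random measurement noise
    return round(true_mm + bias_mm + noise_mm, 3)

readings = [biased_yardstick(TRUE_LENGTH_MM) for _ in range(5)]
print(readings)  # e.g., [916.512, 916.043, ...] -- every digit looks "precise"
print(f"Mean error: {sum(readings) / len(readings) - TRUE_LENGTH_MM:+.3f} mm")
```

Every reading displays micron-level resolution, but each is roughly 2 mm from the truth: the readout’s fineness says nothing about its correctness.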

A student manipulates a CMM controller to move a touch probe at the Center for Precision Metrology at The University of North Carolina at Charlotte.

Print tolerance refers to the amount of dimension deviation allowable on a part as defined by the customer’s blueprints or specifications. Cucchi pointed out that print tolerances are not concerned with what metrology method one uses to meet them but only with the requirements of the part. It’s up to the manufacturer to find a reliable method to check the part for accuracy.

Calibrating Uncertainty

“Measurement professionals are aware that there is always error in measurement,” noted Gene Hancz, CMM product specialist, Mitutoyo America Corp., Aurora, Illinois. “It is therefore critical to define what ‘sufficiently good’ measurement quality means.”

It’s a definition that engineers have articulated in different ways over time, Hancz said. The question has been, what level of accuracy is needed for the measurement to be trusted?

“In 1950, a U.S. Military Standard, MIL-STD-120 Gage Inspection, was released, which stated that when parts were being measured the accuracy tolerances of the measuring equipment should not exceed 10 percent of the tolerances of the parts being checked,” Hancz said. So if a part’s print tolerance is, say, one centimeter, then the measurement system needs to be accurate to a tenth of that, or a millimeter. “This rule is often called the 10:1 rule, or the Gagemaker’s Rule,” he added.

So how do you ensure that the system used to measure that part is accurate to the necessary one tenth of the print tolerance—in this case, 1 mm? You calibrate it against an even finer reference: MIL-STD-120 also stated that the accuracy of the measurement standards used to calibrate the measuring equipment shouldn’t exceed 20 percent of that equipment’s tolerances, a 5:1 ratio that works out to 0.2 mm in this example, Hancz said.
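For those who prefer to see the arithmetic spelled out, the two rules reduce to a pair of simple ratios. This sketch (illustrative only; the function names are ours, not from the standard) applies them to the 1 cm example above:

```python
# The MIL-STD-120 ratios Hancz describes, applied to a 1 cm print tolerance.

def required_gage_accuracy(part_tolerance_mm: float) -> float:
    """10:1 Gagemaker's Rule: gage error budget <= 10% of the part tolerance."""
    return part_tolerance_mm / 10

def required_calibration_accuracy(gage_accuracy_mm: float) -> float:
    """5:1 rule: calibration-standard error <= 20% of the gage's tolerance."""
    return gage_accuracy_mm / 5

part_tol = 10.0                                     # print tolerance: 1 cm = 10 mm
gage_acc = required_gage_accuracy(part_tol)         # -> 1.0 mm
cal_acc = required_calibration_accuracy(gage_acc)   # -> 0.2 mm
print(f"Gage accurate to {gage_acc} mm; its calibration standard to {cal_acc} mm")
```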

The Hexagon RS-SQUARED 3D white light scanner uses an articulating, seven-axis Absolute Arm to position large square “tiles” of 3D scan data. Up to four data tiles are captured every second.

“Both of these rules have transformed over the years into what is often called the TAR, or test accuracy ratio, and the past requirements of 10:1 or 5:1 are now typically stated as a 4:1 requirement, or 25 percent of tolerance.

“The evaluation of measurement uncertainty stormed into commercial calibration practice in the late 1990s,” Hancz continued. “As more and more calibration laboratories started calculating and documenting uncertainty, both in scopes of accreditation and in calibration certificates, the practice of using TAR calculations began to be replaced with the test uncertainty ratio, TUR.”

What’s the difference? “Measurement uncertainty includes all sources of variation, not just the specified accuracy of the measuring equipment,” Hancz said. TUR is calculated by dividing the plus/minus tolerance being checked by the plus/minus measurement uncertainty.
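The difference shows up quickly in a calculation. In the sketch below (all values hypothetical), the same gage looks comfortable when judged by TAR but fails a 4:1 requirement when judged by TUR, because the uncertainty figure folds in temperature, fixturing, operator influence and other variation sources beyond the gage’s specified accuracy:

```python
# Contrasting TAR with TUR, using made-up numbers for illustration.

def tar(part_tol: float, gage_accuracy: float) -> float:
    """Test accuracy ratio: (± tolerance) / (± specified gage accuracy)."""
    return part_tol / gage_accuracy

def tur(part_tol: float, expanded_uncertainty: float) -> float:
    """Test uncertainty ratio: (± tolerance) / (± measurement uncertainty)."""
    return part_tol / expanded_uncertainty

part_tol = 0.050      # ± part tolerance being checked, mm
gage_acc = 0.005      # gage's specified accuracy, mm
uncertainty = 0.015   # expanded uncertainty with all variation sources, mm

print(f"TAR = {tar(part_tol, gage_acc):.1f}:1")     # 10.0:1 -- looks comfortable
print(f"TUR = {tur(part_tol, uncertainty):.1f}:1")  # 3.3:1 -- misses a 4:1 requirement
```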

Manufacturers sometimes have it relatively easy: a customer will not only give them a print tolerance but also tell them that, per an internal or published standard, the measuring equipment must meet a certain accuracy specification. The supplier’s job is simply to make sure its measurement system meets the requirement.

But other times the supplier has little or no guidance from the customer and must decide on their own how much accuracy is necessary. David Wick, manager-product management, Zeiss Industrial Quality Solutions, Maple Grove, Minn., said that “whether it’s four times, five times, or 10 times, what you pick is influenced by the degree of confidence that you need in the measurement.”

For example, say you’re measuring the most critical tolerance on an automotive engine block and this tolerance drives the performance of the engine. “In that case, you’d better be very sure you’re measuring it as well as you can afford to measure it,” said Wick.

In other words, don’t skimp—even if it means using a slower and costlier measurement system than you’d like. You can’t afford not to.

On the other hand, Wick pointed out, metrology is never a one-size-fits-all process. You may well be able to use a faster, less costly method for parts with less critical tolerances.

“You might need high tolerance in a jet engine turbine, making sure the blades have the correct twist and airflow, but less for the sheet metal panels on the plane’s wing where it generally doesn’t matter if that’s off by a micron.”

Standardized Tests

Once a manufacturer understands the tolerance levels it needs to meet, its quest is to find a measurement system that delivers the accuracy needed. The good news is almost every reputable metrology equipment maker will ensure that its equipment conforms to accuracy requirements specified by either ASME in its B89 standards or ISO in its 10360 standards.

There are other measurement standards out there, including CMMA, VDI/VDE 2617, and JIS, but they aren’t as widely used as the ASME and ISO standards. And of the two, the international ISO standard is the more widely used. In fact, ASME has been taking steps to bring its B89 series in line with the 10360 series. For example, the description of its B89.4.10360.2-2008 standard for evaluating CMM linear measurements notes that “it was created to harmonize the B89.4.1 standard with ISO 10360.2 by incorporating the entire 10360.2 document into it.”

Both the ASME B89 and the ISO 10360 are a series of standards for the accuracy testing and performance verification of various coordinate measuring systems. When CMM makers document that their systems meet these standards, customers can directly compare the accuracies of each system.

The standards are decided upon with input from metrology experts in government, academia and industry, most certainly including the metrology OEMs themselves. Standards are continually under review and are updated as technology and use cases change. And as new methods of measurement come in, new subcategories are created to guide users’ expectations, noted Zeiss’s Wick.

Scan display on a Zeiss METROTOM 1500 CT scanner used to check die-cast and injection-molded parts at TCG Unitech in Austria.

“We make a wide range of measurement equipment, and each is made to conform to a specific 10360 subcategory,” he said. So, for example, on the company’s traditional CMM, length measurement and repeatability range conform to 10360-2:2009; scanning error conforms to 10360-4:2000; multi-stylus form, dimension, and location probing conform to 10360-5:2010; and its multi-sensor CMMs, optical comparators and structured light systems conform to other subcategories.

Even the relatively recent use of computed tomography and X-ray technology for industrial metrology is covered, Wick said. “There are only a few companies that can make metrology-grade measurements on CT machines. Zeiss is one of them. And again, we use the same ISO 10360 standards to express the measurement uncertainty of the results that you get from a CT machine.”

Standard Limits

Standards such as ISO 10360 would seem to be a Rosetta Stone for manufacturers seeking to add or upgrade metrology capabilities as they consider the wide range of equipment available to them. They simply need to limit their selection to systems that conform to the standards, and from there consider price, measurement speed and so on—right?

Not so fast.

Edward Morse is deputy director of The Center for Precision Metrology at The University of North Carolina at Charlotte (UNCC). He’s also the co-chair of the PrecisionPath Consortium and a longstanding member of the U.S.-based Coordinate Metrology Society. In addition, he is chair of the ASME Standards Committee (B89) on Dimensional Metrology.

“A standard allows metrology OEMs to specify their instruments in a common way,” Morse said. “The user can select an instrument that’s appropriate for their needs.” So, one can check a number of, say, single-probe CMMs and choose one based at least partially on how well it conforms to the standard.

“What gets tricky is when you’re trying to evaluate the accuracy of different types of instruments to perform a particular task,” he said. In other words, the standards are more useful for apples-to-apples comparisons but are problematic for apples-to-bananas.

“On the one hand, imagine you have a CMM that takes one point per second—or, if it’s scanning, maybe many points per second—but nothing on the order of the hundreds of thousands of points that an optical system might obtain. How do you fairly compare these instruments?” Morse asked rhetorically.

The Starrett HDV300 is a horizontal digital video comparator that combines features of a horizontal optical comparator with a vision metrology system.

A manufacturer could spend an awful lot of time collecting millions of points on the surface of a part with an optical system, for example, but still not be able to measure down inside some of the holes—a trivial task for a touch probe, he noted. “And that kind of difference is not addressed by the standards.”

A related issue: The standards are valuable for how they express and set a value on measurement uncertainty, but the tests they prescribe are narrow and specific. A given CMM must be able to measure a given gage block to a certain level of accuracy in order to comply.

“In the real world, manufacturers are doing more than measuring gage blocks,” Morse said. “A classic example with optical systems is that some don’t measure shiny parts well. They work great during the tests, which have them measuring nice matte surfaces, but then you go to measure a part and the system can’t even see it because there’s too much reflectivity. Conversely, tactile CMMs are not well suited for soft or delicate parts.”

The upshot is that “The measurement system’s compliance to the standard is useful, but only for how well it performs the specific test that is described by the standard, be it 10360 or some other test,” he said. “And that may not be directly related to how well it can measure your parts.”

Another issue to keep in mind is how long it takes for a standard for a newer measurement technology to be developed and released, said Joel Martin, laser trackers and optical scanners product manager, Hexagon Manufacturing Intelligence, North Kingstown, R.I.

“For example, the ISO standard for laser trackers, 10360-10, is just now being ratified—some 30-plus years after the introduction of the technology,” he said. The reason it has taken so long is that the standard must incorporate the use cases of the separate developers of the technology, according to Martin.

“Hexagon, with its more than 30 years of experience developing and testing laser trackers, wasn’t ready to support the initial draft of the ASME B89.4.19-2006 standard, which preceded 10360-10, because it didn’t reflect what we’d determined in our own labs that a laser tracker must be able to do,” he said. Other makers of these systems also had their own perspectives on what the standard should be.

“It takes time to get an agreed-upon set of features into the standards,” said Martin. “That’s why the CMM standard is as detailed as it is today—it took 50 years to build the standard to be where every manufacturer looks at it and says, ‘Yeah, we’re good with this.’”

Finally, standards such as ISO 10360 and ASME B89 codify only one thing: accuracy. They aren’t designed to tell you anything useful about a given measuring system beyond that. If a manufacturer is interested in knowing about a system’s speed, flexibility, traceability, readiness for Industry 4.0, or how it contributes to minimizing the total cost of manufacturing, the standards are silent.

Discoverable Metrology

All of this might sound a bit grim for the manufacturer trying to understand the modern world of metrology. There is good news, however: metrology systems have never been more powerful and, at the same time, easier to understand and use—even for the novice.

Part of this is a reflection of metrology’s migration from a separate quality-control lab to the point of production, according to Mark Arenal, general manager of Starrett’s Metrology Division.

“Before, when a part would come off of, say, a machine tool, an operator would run it into the QC lab and say, ‘I need a first article inspection.’ The QC lab specialists would say, ‘Put it on that shelf and check back in a couple of days—we’re backed up.’ And the production process would stall. To minimize the length of that process, some inspection equipment is now right out on the shop floor,” he said.

But the metrology equipment needs to be optimized for use in that environment. That means not only making it more robust and dust-proof, but also usable by workers on the floor rather than only by experienced, full-time metrologists. “At Starrett, we use the term, ‘walk-up metrology,’” he said.

“Measuring instrument manufacturers are being challenged to make our systems easier to use,” he said. “We’re making user interface software that is discoverable—easy to navigate the way people already use their phones and tablets. It needs to be touchscreen-enabled and icon-driven, with pop-up help screens. Somebody who’s not experienced in the metrology world should be able to quickly learn how to make simple, accurate measurements.”

But, Arenal said, functionality doesn’t have to be sacrificed for ease of use. “The systems still have real power behind them, so that if a user needs to run a completely automated inspection program on a part that has a hundred features, they can do that too.”

Collaborate with Experts

There’s more good news. There are lots of resources for metrology novices. They include—as UNCC’s Morse pointed out—those offered by the Coordinate Metrology Society, which include online video training, a certification program, an annual conference, and an online library of technical papers. And, of course, many community colleges, trade schools and universities have programs that cover metrology to varying degrees.

But when manufacturers need to find the optimal measurement method and system for a specific new task—and quickly—they’ll need to talk to experts. They’ll find that reputable metrology equipment makers will help them find the best solution, and it’s in those makers’ interest to do so, even if that solution doesn’t mean a sale.

“We build a collaborative consulting relationship with our clients,” said Hexagon’s Martin. “We ask, ‘What does the widget you’re making look like? What are your production processes? What is your throughput requirement?’ Working with them, we figure out optimal inspection solutions.”

Metrology equipment developers that offer a wide range of measurement methods and solutions have less motivation to push a potential customer toward a method that isn’t ideal just because it’s in inventory. But Martin thinks that reputable makers of metrology equipment that focus on even a single method will still do right by an inquiring potential customer. It isn’t in anyone’s long-term interest to sell the wrong system to a customer, he said.

“In our world, if someone makes the measurement equivalent of a hammer and you show them a screw, they’ll explain that ‘while we could sell you something that could pound the screw into the board, what you really need is a screwdriver. Go talk to these other guys,’” Martin stated.
