Ensuring Measurement Accuracy Using Calipers

By Aaron Burnett, Quality and Manufacturing Engineer
Blade light test (All images provided by Aaron Burnett)

Measurements have been used to define and attempt to control the stuff we make throughout recorded history.

Noah measured using his arm (the cubit), and the Greeks and Romans used their feet to conquer and measure the world. Obviously these measurements were flawed, because no two arms or feet are exactly the same length.

I’m going to skip over Pierre Vernier, who invented the precision scale that bears his name. In 1850, Joseph Brown was trying to manufacture precision clocks. He decided to make a device that used the vernier scale to measure the size of clock parts. He may have based his device on a special-use caliper made by a French artillery factory in 1840. He called his invention the “Pocket Vernier Caliper.”

Three years later, Brown took a partner, Lucian Sharpe, to handle the business, which was renamed J. R. Brown & Sharpe. Machinists loved the pocket vernier caliper—which is really just a precise scale. Its beefy beam and micro adjuster made this a really great measurement tool.

The vernier caliper was in heavy use until around 1960, when the dial caliper was invented by the Germans. This caliper works with a rack gear on the beam and a pinion gear under the dial. It was the first multiple-measurement caliper: in addition to outside measurements, it also measures inside, depth, and height. Today’s digital caliper substitutes a magnetic scale on the beam. Everything worked well as long as the people who made manufacturing decisions were proficient with the caliper.

Calipers Today

Sadly, today there is confusion about this gage and about measurement uncertainty in general. If you believe calibration solves the problem, you are one of the confused. Here are some rules to help clear up that confusion:

Rule 1: Buy the best you can afford. Good calipers give much, much better measurements than cheap ones.

Rule 2: Clean and lubricate frequently. If the calipers don’t move freely, you can’t be accurate. In dirty environments, clean and lube them after each use, before they are put away.

Rule 3: Practice consistent measurement. This is called “feel” and can take months or even years to perfect.

Here are the ways to measure with calipers, along with tips to make life easier.

Depth tip

Outside Measurement

There are two very different areas for outside measurement: the tips and the anvils (the wide, flat faces).

Always clean the anvils and close the caliper before you measure anything. Why? You are checking that you read zero with your normal feel. You will notice that Brown’s micro adjuster is gone from modern calipers, which does not make the measurement any more accurate!

Don’t use the tip measurement if you don’t have to. Why? That area allows more misalignment and is therefore less accurate. Make sure the calipers are square to the part. By measuring close to the scale you also reduce Abbe (sine) error.
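To see why measuring near the scale matters, here is a rough illustration of Abbe (sine) error. The offset and tilt values are invented for the example; the point is only that the error grows with the distance from the scale.

    import math

    # Invented numbers: the jaw tips sit about 40 mm from the beam's scale,
    # and a worn or loose slide lets the jaw tilt by 0.05 degrees.
    abbe_offset_mm = 40.0   # distance from the scale to the point of measurement
    tilt_deg = 0.05         # angular play of the slide

    # Abbe error grows with the offset: error = offset * sin(tilt)
    error_mm = abbe_offset_mm * math.sin(math.radians(tilt_deg))
    print(f"Abbe error at the tips: {error_mm:.4f} mm ({error_mm / 25.4:.5f} in)")
    # About 0.035 mm (roughly 0.0014 in). Measuring close to the scale shrinks
    # the offset, and with it the error.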

Before you use the tip measurement, clean and close the caliper, then hold it up to a strong light. If you can see light between the tips, don’t use the caliper. Remove burrs or dirt. If the problem persists, the caliper will need to be replaced.

Get an object of known size that is similar to the object you are trying to measure. Every time you make a measurement, check this object before you make the actual measurement. A pin of known size or a gage block is a good choice.

When you measure the known object, don’t look at the reading until the measurement feels right. By doing this you will avoid changing your hand pressure to get the measurement you want. Keep practicing until your repeatability is no more than one quarter of the tolerance you are trying to hold. If you can’t accomplish this, find a more accurate way to measure.
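As a rough illustration of that one-quarter guideline, here is a minimal sketch; the practice readings and the tolerance are made up.

    # Practice readings on the known object, in inches (invented values)
    readings = [0.5140, 0.5145, 0.5138, 0.5142, 0.5141]
    tolerance_band = 0.004   # total tolerance, e.g. +/-0.002 in

    # Repeatability here is simply the spread of the practice readings
    repeatability = max(readings) - min(readings)
    if repeatability <= tolerance_band / 4:
        print(f"OK: spread {repeatability:.4f} in is within 1/4 of the tolerance")
    else:
        print(f"Not good enough: spread {repeatability:.4f} in -- find a better method")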

Inside Measurement

Before you use the inside measurement, clean and close the caliper, then hold it up to a strong light. If you can see light between the inside tips, don’t use the caliper. Remove burrs or dirt. If the problem persists, the caliper will need to be replaced.

Don’t try to use the caliper for small hole sizes. Gage pins are more accurate and also give other information such as hole taper and roundness.

Get a ring gage of a similar size to check the dimension. Check your feel with the ring gage until you are confident in your measurement.

Depth Measurement

Before you use the depth measurement, clean and close the caliper to verify zero, then hold it on a known flat surface and push the tip down. If you get a non-zero number, that offset becomes your new zero; adjust the dial or re-zero the digital reading.

Height Measurement

Before you use the height measurement, clean and close the caliper to verify zero, then hold it on a flat surface and push the end down. If you get a non-zero number, that offset becomes your new zero; adjust the dial or re-zero the digital reading.
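If the dial or digital display cannot simply be re-zeroed, the offset can be subtracted from each depth or height reading instead. A minimal sketch, with invented numbers:

    # Zero check on a known flat surface gave 0.0020 in instead of zero (invented value)
    zero_offset = 0.0020

    def corrected(raw_reading: float) -> float:
        """Subtract the zero-check offset from a raw depth or height reading."""
        return raw_reading - zero_offset

    print(corrected(0.2520))   # a raw 0.2520 in reading corresponds to a true 0.2500 in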

Expected Caliper Measurement Accuracy

The biggest mistake you can make is applying broad-brush accuracy to any gage. It’s especially dangerous with the caliper because it is so versatile. Let me demonstrate this using a humble clip. Let’s say the design requires that the clip width be 0.514" (13.056 mm) ±0.002" (±0.051 mm). This is twice the gage maker’s stated accuracy for the caliper of ±0.001" (0.025 mm).

Measuring the same clip multiple times, I get values ranging from 0.516" (13.106 mm) to 0.512" (13.005 mm), so the range of readings on this clip is already using the entire tolerance. Clearly this measurement needs to be improved.
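To make the arithmetic explicit, here is a small sketch using the numbers above.

    nominal = 0.514              # clip width, inches
    tol = 0.002                  # +/- tolerance
    readings = [0.516, 0.512]    # high and low readings observed on the same clip

    spread = max(readings) - min(readings)   # 0.004 in
    tolerance_band = 2 * tol                 # 0.004 in total
    print(f"reading spread {spread:.3f} in vs. tolerance band {tolerance_band:.3f} in")
    # The spread alone equals the entire tolerance band, leaving nothing for
    # actual part variation, so the measurement must be improved.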

Joseph Brown's caliper


Improving Caliper Measurement Accuracy

User training is a good place to start. Sometimes the problem is obvious to experienced people.

Try laying the caliper on a flat surface to make the measurement.

You can change the caliper to a fixed go/no-go gage; a sketch of the resulting logic follows the steps below. To do this, follow these steps:

Set the caliper just below the high end of the tolerance and lock the slide.

Lay the caliper on a flat surface.

Freehand clip-width measurement

Put the clip between the anvils and find a place where it sticks; if you can’t find a sticky place, the clip is too big.

Set the caliper just above the low end of the tolerance and lock the slide.

Lay the caliper on a flat surface.

Put the clip between the anvils and find a place where it sticks; if you can’t find a sticky place, the clip is too small.
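The decision logic behind this conversion can be sketched as follows. It is only an illustration of the rules in the steps above; the function name and limits (taken from the clip example) are mine, not part of any standard.

    upper_limit = 0.516   # high end of the clip tolerance, inches
    lower_limit = 0.512   # low end of the clip tolerance, inches

    def attribute_check(sticks_at_upper_setting: bool, sticks_at_lower_setting: bool) -> str:
        """Go/no-go decision from the two locked caliper settings."""
        if not sticks_at_upper_setting:
            return "REJECT: clip is too big (no sticky place just below the upper limit)"
        if not sticks_at_lower_setting:
            return "REJECT: clip is too small (no sticky place just above the lower limit)"
        return "ACCEPT: clip passes the attribute check"

    print(attribute_check(sticks_at_upper_setting=True, sticks_at_lower_setting=True))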

You will often need quality assurance buyoff to use this method, because it is a change from variable to attribute gaging. Also, this makes machine adjustments difficult since the true value is not known.

Ask about the tolerance. It is common for designers to use tolerances they don’t understand. In this case the ±0.002" tolerance seems excessive for this item. It’s always worth asking, but get the off-spec approval in writing, including date and initials.

Look into alternate gages. Micrometers have more accuracy but can cause other problems. For instance, you can’t lay them flat and measure this clip because the barrel is too big. Low-force calipers might help if the clip width changes under pressure.

Digital versions often have data wires or Bluetooth that allow push-button measurement. These can give greater accuracy, but make sure your quality folks validate digital input, especially if it involves software!

Get expert help. Contact someone who is good at uncertainty budgets and can recommend reasonable gages that can be used during manufacture. I suggest HN Metrology. NIST has a spreadsheet template with instructions.
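For a flavor of what an uncertainty budget does, here is a minimal sketch. The component values are purely illustrative; the standard uncertainties are combined by root-sum-square and multiplied by a coverage factor of 2, the usual approach in GUM-style budgets.

    import math

    # Standard uncertainty components in inches (illustrative values only)
    components = {
        "caliper indication (maker's spec)": 0.0005,
        "operator repeatability":            0.0007,
        "temperature / part variation":      0.0003,
    }

    combined = math.sqrt(sum(u**2 for u in components.values()))   # root-sum-square
    expanded = 2 * combined                                        # coverage factor k = 2
    print(f"combined standard uncertainty: {combined:.4f} in")
    print(f"expanded uncertainty (k=2):    {expanded:.4f} in")
    # Compare the expanded uncertainty with the tolerance; a 4:1 ratio is a common target.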
