Accuracy Of Dial Calipers

randyc
A post on another forum alluded to the inaccuracy of calipers. This was a raging argument six or eight months ago on PM and I put in my two cents there as follows:

"I read negative comments constantly regarding dial calipers and certainly some question exists as to repeatability and accuracy in a dirty environment. But they surely have their place and vernier calipers are reliable and tolerant of some uncleanliness -

The opinion that dial calipers can be trusted only within some arbitrary number (usually much greater than .001) makes me wonder. Just for grins, I went out to the shop and grabbed a $15 set from you know where that I use for WOODWORKING - measuring wood thickness to correct the planer depth setting.

I wiped the jaws clean and then measured a series of gage blocks twice, recording the error as closely as I could interpolate from the dial (yeah I know, this is sort of a visual crap shoot). The first number is the gage block dimension, followed by the caliper measurement:

.0500 : .0503
.1110 : .1110
.1430 : .1433
.2500 : .2502
.4500 : .4500
.6500 : .6502
.9000 : .9002
2.000 : .2001
3.000 : 3.002
4.000 : 4.000

Apparently I could trust the accuracy/repeatability of these particular dial calipers - in a clean environment at a reasonable temperature - to within .0003, following good practice like wiping the jaws and checking zero before making a measurement.

Metrology texts suggest that a measurement instrument should have an accuracy/repeatability ten times better than the accuracy requirement of the part being measured. This is certainly no problem using micrometers when tolerances are on the order of .001. However, it is not unusual to make measurements on the order of .0003 using "tenths" micrometers. This certainly doesn't satisfy the suggested ten-times accuracy. Perhaps an "acceptable" scale of accuracy might be three times the tolerances being measured?

If so, then the above calipers would be acceptable to measure within .001, again assuming good practice. And remember, these are cheap $15 imports, not Mitutoyos. And regarding vernier calipers, why would anyone not trust these instruments to be just as accurate as a vernier height gage?"
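A minimal sketch of the ratio reasoning above, in Python, using the figures already quoted (the function name is only illustrative):

```python
# Accuracy-ratio sketch: how many times better is the instrument than the
# tolerance it checks?  10:1 is the textbook ideal; the post above suggests
# 3:1 may be acceptable in practice.

def accuracy_ratio(part_tolerance_in, instrument_accuracy_in):
    """Ratio of the part tolerance to the instrument's trusted accuracy."""
    return part_tolerance_in / instrument_accuracy_in

# A .001" tolerance checked with a caliper trusted to .0003" (the result above):
print(round(accuracy_ratio(0.001, 0.0003), 1))   # 3.3 -> meets 3:1, not 10:1

# A .0003" fit checked with a "tenths" micrometer reading to .0001":
print(round(accuracy_ratio(0.0003, 0.0001), 1))  # 3.0 -> the same situation
```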


FYI: the consensus of the majority of machinists was pretty much in agreement with the above although there were a few that disagreed violently, LOL.

edit to add: I wouldn't trust dial calipers to be accurate when measuring large dimensions in cold temperatures. I don't know the temperature range over which an imported pair of calipers are accurate but I'm OK with using mine from 60 deg F to 80 deg F.
 
Caliper use is always being viciously debated on the internet.

I see two issues with this quote:
- Repeatability and accuracy are two different things. The fellow didn't state he actually tested repeatability.
- It takes an experienced hand to keep the same amount of pressure on the jaws.
 
I believe that calipers are only as good as the person using them. I worked with an old-timer that very rarely used a mic and did some of the best work. And then there were some that couldn't measure anything with calipers.
 
The "10 times" metrology rule assumes that you are making a part that has to work with one being made by a guy in California and that only thing your instruments and his have in common is the NIST traceability of their calibration.
 
Caliper use is always being viciously debated on the internet.

I see two issues with this quote:
- Repeatability and accuracy are two different things. The fellow didn't state he actually tested repeatability.
- It takes an experienced hand to keep the same amount of pressure on the jaws.

Howdy, the fellow was me and as I said, I made two measurements on each gauge block. Granted, that's not a lot of measurements to confirm repeatability but what the heck, these were $15 calipers and the two measurements agreed in every case.

I don't think pressure on the jaws is at all critical. Using the same $15 calipers and a one inch gauge block, I just tried the following experiment. From barely snugging the caliper jaws (on the gauge block) to exerting a considerable amount of pressure, I noted a variation of about .0005 on the dial.

Not trying to argue with you, but you might try the same experiment yourself - it only takes a few seconds and you might be surprised at how good your calipers are :)
 
The "10 times" metrology rule assumes that you are making a part that has to work with one being made by a guy in California and that only thing your instruments and his have in common is the NIST traceability of their calibration.

Yep - nobody could dispute that the "rule" would help guarantee interchangeability; the point of having measuring tools capable of ten times the accuracy of the part to be measured is to reduce measurement uncertainty.

The "go to" guy for this kind of discussion is Gordon B. Clarke on "Practical Machinist". He's spent his entire career dealing with the nit-picky topics of metrology and is a wealth of information.
 
......2.000 : .2001
3.000 : 3.002......
Are your 2 and 3 inch measurements meant to be .001" and .002" high? If so, I would question why they are so far off nominal when the remainder of the measurements are within .0003". If they are indeed only .0001" and .0002" high, you would have an average error of +.00015" and a standard deviation of .00011". If I remember correctly, three standard deviations gives you a confidence level of 99%, so you're pretty darn sure your reading is within +.00045/-.00015 of the actual value. Not bad for an instrument that is normally only read to the nearest .001".
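For reference, a minimal sketch of that arithmetic in Python, taking the 2" and 3" readings as 2.0001 and 3.0002 (the way the typos are resolved later in the thread):

```python
import statistics

# Gage block sizes and dial caliper readings from the table above; the listed
# ".2001" and "3.002" entries are treated as typos for 2.0001 and 3.0002.
nominal = [0.0500, 0.1110, 0.1430, 0.2500, 0.4500, 0.6500, 0.9000, 2.0000, 3.0000, 4.0000]
reading = [0.0503, 0.1110, 0.1433, 0.2502, 0.4500, 0.6502, 0.9002, 2.0001, 3.0002, 4.0000]

errors = [r - n for r, n in zip(reading, nominal)]
mean = statistics.mean(errors)      # about +0.00015"
sigma = statistics.pstdev(errors)   # about  0.00011"

print(f"average error   : {mean:+.5f} in")
print(f"std deviation   : {sigma:.5f} in")
print(f"mean +/- 3 sigma: {mean - 3*sigma:+.5f} to {mean + 3*sigma:+.5f} in")  # ~99.7% band
```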

Also, I had heard that in calibration of instruments, the calibrating instrument or standard had to have at least tenfold better accuracy than the expected accuracy of the instrument being calibrated; i.e., if I were calibrating a micrometer which read to ten-thousandths, my gage blocks had to be accurate to within 10 ppm in order to maintain the traceability chain and to certify the micrometer.

I was not aware that that requirement held for measurements.
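One way to read the 10 ppm figure is sketched below, assuming a one-inch micrometer range (the range is my assumption, not stated above):

```python
# A "tenths" micrometer resolves 0.0001" over an assumed 0-1" range; a 10:1 rule
# then asks the gage blocks used to calibrate it to be ten times better than that.
mic_resolution_in = 0.0001
mic_range_in = 1.0                              # assumed range
block_tolerance_in = mic_resolution_in / 10     # 0.00001"
print(f"{block_tolerance_in / mic_range_in * 1e6:.0f} ppm")   # 10 ppm, as quoted
```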
 
......The "go to" guy for this kind of discussion is Gordon B. Clarke on "Practical Machinist". He's spent his entire career dealing with the nit-picky topics of metrology and is a wealth of information.

Gordon has a wealth of information on most things, but he's nowhere near the "go to" person on the subject. I respect Gordon for what he has done and support him 100%. He has a nice screw measuring system for measuring pitch diameter but, like many, he can't write instructions for most of us to understand and use.

BTW: He was a member for a short period before being banned several years back.
 
......Not bad for an instrument that is normally only read to the nearest .001".

A +1 on what you said. Yes, this is more like it. And accuracy is not repeatability. Precision is repeatability… Good luck, Dave.
 
......Are your 2 and 3 inch measurements meant to be .001" and .002" high?

The 3.000 inch measurement is a typo, one of the zeros took a lunch break. The 2.000 inch measurement - ditto, the decimal point became liberal and migrated to the left. Although it would be natural to question the measurements, I am positive that those measurements are good ones WITHIN the ability of my eyes to interpolate the divisions on the dial (which, as I wrote, is sort of a crap shoot).

I strongly suggest that others make the same measurements to confirm (or not) the surprising results, given the presumed low quality of the tool I was using!

Metrology is not a strong suit for me - at least not mechanical metrology (I'm much better at electronics metrology, specifically RF and microwave instrumentation). On reflection, although three standard deviations would seem to guarantee a much greater accuracy than either claimed or assumed, I would be uncomfortable accepting measurements of that precision, as I'm sure most others would be!

Similarly, while calibration standards almost always require an order of magnitude greater precision than the device to be measured, when making measurements down to the tenths (as I noted in my .0003 example) I personally would prefer something with that same magnitude of precision to reduce uncertainty when making bearing fits and the like.

Without spending a bundle, however, I'm stuck with tenths micrometers and a set of gauge blocks to compare them against. A sometimes useful technique in the absence of gauge blocks is to use three instruments to make the measurement, discarding the worst measurement of the three. This is simple enough for small dimensions, where micrometers are cheap, but gets pricey for large dimensions, which are the ones that I'm most doubtful about!
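A minimal sketch of that three-instrument cross-check, assuming "worst" means the reading that disagrees most with the other two (the function name and the example readings are illustrative):

```python
# Take one reading from each of three instruments, drop the reading that
# disagrees most with the other two, and average what is left.

def cross_check(r1, r2, r3):
    readings = [r1, r2, r3]

    def disagreement(i):
        # distance of reading i from the average of the other two readings
        others = [r for j, r in enumerate(readings) if j != i]
        return abs(readings[i] - sum(others) / 2)

    worst = max(range(3), key=disagreement)
    kept = [r for i, r in enumerate(readings) if i != worst]
    return sum(kept) / 2

# Three micrometers reading the same nominal 2.7500" dimension:
print(cross_check(2.7501, 2.7500, 2.7506))   # drops 2.7506, returns ~2.75005
```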

A condition that I mentioned but didn't stress highly enough, perhaps, is the temperature environment. Maybe my instrumentation (micrometers) is not as stable as it could be, but I have to recalibrate if shop temperatures change more than 15 or 20 degrees F. Example: a micrometer calibrated to a stainless steel standard measures an aluminum workpiece at 1.7500 inches in diameter when the shop temperature is 60 degrees F. When the same workpiece reaches a temperature of 80 degrees, it measures 1.7502 inches.

Might be trivial, depending on the application -
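A rough sketch of the arithmetic behind that example, assuming handbook expansion coefficients of roughly 12.8 µin/in/°F for aluminum and 6.4 µin/in/°F for the micrometer's steel (the coefficients are assumptions, not figures from the post):

```python
# Both the aluminum part and the steel micrometer grow as the shop warms; the
# reading changes only by the difference between the two expansions.
ALU_CTE   = 12.8e-6   # in/in/degF, aluminum workpiece (assumed handbook value)
STEEL_CTE = 6.4e-6    # in/in/degF, steel micrometer/standard (assumed handbook value)

length_in = 1.7500    # diameter measured at 60 degF
delta_t   = 80 - 60   # shop warms from 60 to 80 degF

part_growth = ALU_CTE * length_in * delta_t
mic_growth  = STEEL_CTE * length_in * delta_t
apparent_change = part_growth - mic_growth

print(f"apparent change: {apparent_change:+.4f} in")   # about +0.0002, i.e. 1.7502
```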
 