Accurately measuring and boring holes (28mm)

Thanks for all the input! I've played around quite a bit and I'm getting ever so slightly better at this...
Things I've noticed:
-My digital micrometer is very difficult to get precise measurements with. Measuring the same standard, it will sometimes show 25.000 directly, and sometimes start at 25.050-ish and require a lot of wiggling around to get back down to 25.000. Maybe dust/grime, but I try to keep it as clean as possible. I always use the torque thing to make sure it's not over-tightened. If I'm very careful, however, this is the most accurate for outside dimensions right now.
-My two analog inside micrometers are both... not great. Even using their torque function you can tell they're over-torquing. On top of that, they're both out of calibration: properly seating my digital micrometer against my 25mm standard, locking it, and then measuring the locked micrometer with the two inside micrometers gives me -0.08 and +0.06 on the two analog mics. The smaller of them also has a big flex problem, and the second has way too much wiggle room in the jaws (they can rotate several degrees to the side).
-The bore gauge is insanely difficult to use despite being easy in theory. I've carefully set it and rechecked it several times against the digital micrometer, and it stays "calibrated". HOWEVER, when I try to measure something I've bored, it's both very sensitive to the surface finish and also tends to give faulty readings. I have an endmill that is confirmed to be 24.91mm, but I absolutely cannot replicate that measurement with the bore gauge.
-The bearings were actually 27.9mm and not 28mm outer diameter, so no wonder ~27.95 resulted in a slide fit. I had just assumed they'd be 28.00.

I've bored maybe 15-20 different sizes of holes and kept trying all the different measuring tools, but so far none of them feels reliable to me when I'm looking for 0.01 mm accuracy.

Also, just to clarify: I can hit the same 0.01mm dimension with my machine; it's measuring what the dimension actually is that is the problem.
I.e. I can hit 24.95 on my digital micrometer all day long, but it often feels like the 24.95 is a faulty measurement despite measuring the same way with the same force, temperature, etc.

Next time I need to experiment more with the telescopic gauges and see if that's more repeatable for me.

Also, the bearings are for a sanding belt roller: ~51 mm diameter and <35 m/s surface speed, so around 7,500-12,000 RPM with a fairly light load. I don't expect to see much more than ~50-100 N.
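For anyone who wants to sanity-check those numbers, here's a quick Python sketch. The 51 mm diameter and 35 m/s limit are the figures above; the rest is just the standard surface-speed relation, nothing specific to my setup:

```python
import math

def rpm_from_surface_speed(diameter_mm: float, surface_speed_m_s: float) -> float:
    """RPM needed to reach a given surface speed on a given diameter."""
    circumference_m = math.pi * diameter_mm / 1000.0   # metres travelled per revolution
    return surface_speed_m_s / circumference_m * 60.0  # revolutions per minute

# 51 mm roller at the 35 m/s belt-speed limit
print(round(rpm_from_surface_speed(51, 35)))  # ~13100 RPM, so 7500-12000 RPM stays under 35 m/s
```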
 
That's the style we had at work. I think the largest was up to 8". They were ungodly expensive way back then and haven't come down in price since.
I was lucky enough to score this set of SPI 3-Point Digital Mics last year at a fantastic price (I'm not saying how much!):

[Attached image: SPI 3-Point Hole Mics, 0.8"-2.0"]
0.8" — 2.0", including standards & extensions (I actually wish they were manual so I wouldn't have to worry about batteries). I've checked them all and actually used them a couple of times. Much nicer than the internal mic's:

[Attached image: 0.2"-1.2" and 1"-2" inside mics]
 
The bearings were actually 27.9mm and not 28mm outer diameter, so no wonder ~27.95 resulted in a slide fit. I had just assumed they'd be 28.00.
The tolerance for those bearings is +0 / -0.009mm on the outer ring, so if your measurement of 27.9mm is accurate, they are out of tolerance. Are they a name-brand bearing, or a no-name bearing you got cheap?
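As a quick sketch of how far out that would be (the +0 / -0.009 mm band is the figure above, and the 27.9 mm reading is the measurement in question):

```python
# Nominal 28 mm OD with the +0 / -0.009 mm outer-ring tolerance quoted above
NOMINAL_OD_MM = 28.000
TOL_UPPER_MM = 0.000
TOL_LOWER_MM = -0.009

def od_in_tolerance(measured_mm: float) -> bool:
    """True if a measured outer diameter falls inside the tolerance band."""
    return NOMINAL_OD_MM + TOL_LOWER_MM <= measured_mm <= NOMINAL_OD_MM + TOL_UPPER_MM

print(od_in_tolerance(27.900))  # False: 0.091 mm under nominal, far outside the 27.991-28.000 band
print(od_in_tolerance(27.995))  # True: inside the band
```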
 
The two things I've seen mentioned above that will give you the best results are:

1. Give the bore time to cool down between cuts. Temperature is crucial because of thermal expansion.

2. Telescoping gauges will eliminate micrometer inaccuracy, because you use the same micrometer to measure both the bearing and the gauge set to the bore. The bore measurement then becomes relative to the bearing itself, and the fit can be defined off of that.

 
The universally accepted standard temperature for measuring machined parts for accuracy is 68°F or 20°C. Commercial shops are generally temperature controlled to this standard, and parts are allowed to cool to room temperature before being inspected.
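To put a rough number on why that reference temperature matters, here's a sketch of how much a steel part grows when it's measured warm; the expansion coefficient for steel is an assumed typical value, not something from this thread:

```python
# Approximate linear expansion coefficient for carbon steel (assumed typical value)
ALPHA_STEEL_PER_C = 11.7e-6

def diameter_change_mm(diameter_mm: float, delta_t_c: float) -> float:
    """Approximate diameter change for a temperature change away from the 20 C reference."""
    return ALPHA_STEEL_PER_C * diameter_mm * delta_t_c

# A 28 mm steel bore measured 10 C above the 20 C reference reads oversize by roughly:
print(round(diameter_change_mm(28.0, 10.0), 4))  # ~0.0033 mm, a third of a 0.01 mm target
```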
 
Telescoping gauges will eliminate micrometer inaccuracy, because you use the same micrometer to measure both the bearing and the gauge set to the bore.
Don’t think he has room for a telescoping gage in this situation.
 
Perhaps I used the wrong term: snap gauges.

28mm is larger than an inch, and my telescoping/snap gauge set starts smaller than an inch, so I think that size would be covered. Perhaps the storage of the set is in question.

 
Measure the bearing using an external micrometer. Reduce the micrometer setting by 0.01 mm or 0.0005" for a snug/tight/press fit. Now you know the target "diameter". Use the micrometer to check the bore gauge.
Finish turning with 3 equal passes using the same feed and speed. Check the diameter after each pass.
If you have a loose fit, use Loctite to secure the bearing.
If the fit is too tight, and you did not take the part out of the chuck, you can do a spring pass at a 0.02 mm (0.001") reduced diameter. This is a rubbing pass that can be repeated until the bearing fits. If you did take the part out, sandpaper can do the job.
If you want to chamfer the edge, do it before the last turning pass or the chamfer will create a burr. Basically, chamfering is not needed if your bearing has a chamfered edge!
Measuring inner diameters takes some practice. Turning inner diameters can be a challenge because the long and often slender turning tool will flex a lot. Let the tool stick out as little as possible.
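A rough sketch of that target-diameter and equal-passes idea, using the 27.9 mm bearing measurement from earlier in the thread; the function names and the starting bore size are just illustrative:

```python
def press_fit_target(bearing_od_mm: float, interference_mm: float = 0.01) -> float:
    """Target bore diameter for a snug/press fit: bearing OD minus the interference."""
    return bearing_od_mm - interference_mm

def finishing_passes(current_dia_mm: float, target_dia_mm: float, passes: int = 3):
    """Split the remaining material into equal passes (same feed and speed for each)."""
    step = (target_dia_mm - current_dia_mm) / passes
    return [round(current_dia_mm + step * (i + 1), 3) for i in range(passes)]

target = round(press_fit_target(27.900), 3)  # 27.89 mm for the 27.9 mm bearing measured earlier
print(target)                                # 27.89
print(finishing_passes(27.800, target))      # [27.83, 27.86, 27.89]
```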
 
I always use the torque thing to make sure it's not over-tightened. If I'm very careful, however, this is the most accurate for outside dimensions right now.
Opinions may vary, but I feel like the torque thing (ratchet knob or stop is the name of it) on my mics doesn't slip until they're much tighter than I like. I only close the mic until it comes off of the surface being measured with light resistance. The ratchet knob tightens them so much that they're hard to remove from the workpiece. High-quality bullets (not ammunition; just the bare bullet) are manufactured to very high standards. Not having pin gauges, I measured a bullet over and over until I learned how much pressure was required to consistently measure 0.308". It was pretty easy to develop the feel.
 
Opinions may vary, but I feel like the torque thing (ratchet knob or stop is the name of it) on my mics doesn't slip until they're much tighter than I like.
Do you think it would be a good idea to clean/repair the "torque thing" in case that might return the subject micrometer to fit-for-purpose condition?
 