Why would an electronic outside micrometer not return to zero?

I think precision measurement is an area where you tend to get what you pay for. I was excited to find a 50-millionths digital dial indicator for $35. When I got it, it worked great. Then the weather turned colder. I discovered that if the shop temperature was below about 60 degrees F, the display just showed continuously changing random numbers. :) I'm just waiting for spring, I guess.

This may explain it, at least with regard to the tool behaving differently at temperatures much below 20°C: 99% of my measurements are taken in the shop, and it is probably about 50° to 60°F out there.
 
Erik, your post is really confusing. What do you mean by the underlined statement?

Perhaps more to the point, have you put the mic in a stand and used a gauge block to see if it reads what it should read? Even a Grade B block will be good enough to tell you if the mic is accurate or not. I cannot imagine that an entire set of mics would be this inaccurate.

If you want a good digital mic, buy a Mitutoyo mic. I have a Quantumike that is accurate to half a tenth, and I have confirmed that with accurate gauge blocks. My Mitutoyo 500-752-20 calipers are dead on within their calibration range. For digital tools, Mitutoyo is probably the best out there.

I used my Shars Grade B gage block set to investigate the situation, but I do not yet own a micrometer stand. The gage block set may have its own issues, but I want to take a more extensive look at it with metrology equipment that I trust before I comment further.

Another way of explaining the underlined statement: when I bought the micrometers, I had hoped for a tool that would give consistent, repeatable measurements down to the nearest tenth. Having since realized that this is an economy micrometer, I now just hope that it will one day give consistent, repeatable measurements down to the nearest thousandth.
 
I tried 4 different cheap digital micrometers and settled on a particular brand, Shahe, after testing all 4.
I import micrometers.

The cheapest ones, 3 of the 4, have problems with frame bend and with repeatability.
Errors tend to be 2-3 microns with no clear cause; in other words, they are sloppy and poorly built.

The very good Shahe micrometers I import and sell cost only $35 more.
The basic poor ones might cost $30 for a 1" model, and the excellent one is $65 (EU price, including 22% VAT).
An excellent Mitutoyo, Federal, or Mahr runs about $150.

The micrometers I sell and use repeat to about 1 micron and are accurate to about 1 micron, by blind testing with gage pins.
Equal to the best from Mitutoyo/Mahr/Federal.
 
The micrometers I sell and use repeat to about 1 micron and are accurate to about 1 micron, by blind testing with gage pins.
Equal to the best from Mitutoyo/Mahr/Federal.

For how long, I wonder?
 
You machinists who bought digital electronic micrometers, may I ask why you chose them over analog? Do they save you some time?
 
A modern micrometer should be able to provide consistent readings to within a couple of tenths. It doesn't take much debris on the measuring faces to create inconsistent zeroing. In a metrology class, we were taught to clean the anvil and spindle faces by closing them on a piece of paper and pulling the paper out, which wipes the faces clean.

On my mikes with ratchets, I always ratchet more than two clicks, maybe a dozen or more. The purpose of the ratchet is to limit the operator-applied torque, and, properly set up, it should do that. Using only two clicks should be fine for seating the spindle on the anvil, but as soon as you put an object between the faces, there is a possibility of canting the micrometer and getting a too-high reading. Rotating the ratchet as the micrometer settles in tends to give me more consistent readings.

For a 1" piece of steel to grow in length by 1 mil due to thermal expansion, the temperature increase would have to be 86ºC or 155ºF. (Thermal expansion coefficient of steel is 10.8 - 12.5 microinches/inch/ºC). Obviously, you don't want to make precision measurements on a workpiece that's smoking hot from heavy machining but for most work, slight variations due to temperature differences can be ignored.

Aluminum's CTE is about double that of steel and some plastics have 4 to 5 times that of steel so you should use a little more care when measuring those materials.
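
A quick sketch of that arithmetic (Python; the 11.6 microinches/inch/°C coefficient is my assumed mid-range value, picked from the 10.8 - 12.5 range above):

# Thermal expansion: delta_L = L * alpha * delta_T
ALPHA_STEEL = 11.6e-6  # in/in/degC, assumed mid-range value for steel

def delta_t_for_growth(length_in, growth_in, alpha=ALPHA_STEEL):
    """Temperature rise (degC) needed for a part to grow by growth_in."""
    return growth_in / (length_in * alpha)

dt_c = delta_t_for_growth(1.0, 0.001)  # 1" of steel growing by 1 mil
print(f"{dt_c:.0f} degC rise = {dt_c * 9 / 5:.0f} degF rise")  # 86 degC = 155 degF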
 
So, to make the change as large as possible within reason: if you measure 4.0000” OD on a steel round bar in a shop at 0°C, what would it measure in an inspection lab at 20°C?

12.5 microinches/inch/°C × 20°C = 250 microinches per inch = .000250” of expansion per inch going from 0°C to 20°C, so the OD has increased by 4 × .00025” = .0010” in this example.

Here we have a machinist trying to nail a 4.0000” OD with no tolerance specified. He does his work in an extremely cold shop, hopes that his work is good enough, and when the work gets to the inspection lab it measures 4.0010”.
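
The same correction as a minimal sketch (Python; 12.5 microinches/inch/°C is the upper-end coefficient used in the arithmetic above):

ALPHA_STEEL = 12.5e-6  # in/in/degC, upper end of the quoted range

def size_at_temp(size_in, t_from_c, t_to_c, alpha=ALPHA_STEEL):
    """Predicted steel dimension at t_to_c, given a measurement at t_from_c."""
    return size_in * (1 + alpha * (t_to_c - t_from_c))

# 4.0000" OD turned in a 0 degC shop, inspected in a 20 degC lab:
print(f'{size_at_temp(4.0000, 0, 20):.4f}"')  # -> 4.0010"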
 
The biggest plus in my book for digital calipers and micrometers over their analog counterparts is the ability to set a zero at any point. Second would be the ability to switch from mm to inches and back. Another plus is ease of reading.

That said, my go-to micrometer is a B&S 1" mechanical digital micrometer. My 1" - 6" micrometer set is analog, so anything over 1" gets measured with an analog micrometer. For calipers, I usually use digital calipers for the reasons stated. I have a Starrett 24" vernier caliper that I use for measurements over 12", and a couple of German vernier calipers, the latter seldom used.
 
So, to make the change as large as possible within reason: if you measure 4.0000” OD on a steel round bar in a shop at 0°C, what would it measure in an inspection lab at 20°C?

12.5 microinches/inch/°C × 20°C = 250 microinches per inch = .000250” per inch, so the OD has increased by 4 × .00025” = .0010” in this example.

Correct. If you were fitting a bearing, you would probably be concerned. Metrology labs operate in temperature-controlled environments and let the objects equilibrate before measuring because they are usually dealing with accuracies in the microinch range. Calibration equipment and standards should be 4 - 10x more accurate than the objects being calibrated.
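
That accuracy-ratio rule as a minimal sketch (Python; the 4:1 and 10:1 ratios are the figures quoted above, applied to a hypothetical ±0.0001" micrometer spec of my choosing):

def standard_tolerance(unit_tolerance, ratio=4):
    """Worst allowable error of the calibration standard for a given
    test accuracy ratio; 4:1 is the common floor, 10:1 is better."""
    return unit_tolerance / ratio

# Checking a mic spec'd to +/- 0.0001" calls for standards good to:
print(f'{standard_tolerance(0.0001, 4):.6f}"')   # 4:1  -> 0.000025"
print(f'{standard_tolerance(0.0001, 10):.6f}"')  # 10:1 -> 0.000010"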
 