Lead Error

Then you are correct; Wikipedia is out of date. I thought that was odd, because we have been laying down some really small magnetic strips on hard drives for years. I have no idea how they work. If it's magnetic, why doesn't a magnet damage it? More research is needed. If you will tell me the manufacturer of your scales, I will see that Wikipedia is updated; they depend on public input. Thanks, David

A magnet will damage the magnetic scales; it will not damage the capacitive type. I have both Renishaw and Ditron (a Chinese Renishaw knockoff). The Ditron is available in 0.5 micron resolution, and I think the Renishaw is available in 0.1 or 0.2 micron, but I can't remember.

As I recall, the magnetic tape has a magnetic stripe every 2 mm, and the read head has two Hall sensors. Those output sin/cos waves to the internal electronics, which converts them to a quadrature pulse stream. Overall they work very well and are almost bulletproof.
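For anyone curious how sin/cos Hall outputs become a position reading, here is a rough sketch (not any particular manufacturer's implementation). The 2 mm pole pitch is taken from the post above; the comparator-thresholding and atan2-interpolation approaches shown are my own illustrative assumptions about how such read heads typically work.

```python
import math

POLE_PITCH_MM = 2.0  # magnetic stripe spacing from the post above

def hall_to_quadrature(position_mm, pitch_mm=POLE_PITCH_MM):
    """Simulate the two Hall-sensor outputs at a scale position, then
    threshold them into coarse A/B quadrature bits (4 counts per pitch)."""
    phase = 2 * math.pi * position_mm / pitch_mm
    a = math.sin(phase) > 0   # comparator on the sine channel
    b = math.cos(phase) > 0   # comparator on the cosine channel
    return a, b

def interpolated_position(sin_v, cos_v, pitch_mm=POLE_PITCH_MM):
    """Finer resolution: interpolate within one pole pitch via atan2,
    which is how sub-micron counts can come from a 2 mm stripe spacing."""
    phase = math.atan2(sin_v, cos_v) % (2 * math.pi)
    return phase / (2 * math.pi) * pitch_mm
```

The coarse quadrature alone would only give 0.5 mm steps from a 2 mm pitch; it is the interpolation of the analog sin/cos pair that lets the electronics report 0.5 micron (or finer) resolution.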
 
It appears that most rolled ball screws in the "affordable" category have a lead error in the .002"-.003" per ft range. Am I correct in assuming this error is negated when using a DRO?
That 'error' might not be a shape fault; steel expands by about 6 parts per million per degree Fahrenheit, which works out to roughly 0.002" per foot for a 30 degree F temperature change. In an unheated shop, that's just a given. If the workpiece and the screw that guides the cut are both at the same temperature, that 'error' vanishes. This is part of the reason a good micrometer has a plastic grip; you wouldn't want body temperature to change the tool's accuracy.

A DRO with glass scales has slightly lower thermal expansion than steel (about 2.2 ppm/F for borosilicate "Pyrex").
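A quick sanity check of the numbers quoted above (6 ppm/°F for steel, 2.2 ppm/°F for borosilicate glass), using the standard linear expansion relation ΔL = L·α·ΔT:

```python
def thermal_growth_in(length_in, delta_t_f, coeff_ppm_per_f):
    """Linear thermal growth: length * alpha * delta-T,
    with alpha given in parts per million per degree F."""
    return length_in * coeff_ppm_per_f * 1e-6 * delta_t_f

# 12" of steel over a 30 F swing -- the 0.002"/ft figure from the post
steel = thermal_growth_in(12.0, 30.0, 6.0)   # 0.00216"
# same length of borosilicate glass scale, for comparison
pyrex = thermal_growth_in(12.0, 30.0, 2.2)   # 0.00079"
```

So the steel screw grows roughly 0.002" per foot over that swing, while the glass scale grows only about a third as much, which is why the DRO reading drifts less with shop temperature than the leadscrew does.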
 
OK, a guy has three 0-1" micrometers, a Starrett, a Shars, and a Harbor Freight. He measures a piece of 1" drill rod with all three mics, very carefully, until he is satisfied with the readings he is getting from each:

Starrett = 1.0000"
Shars = .9996"
Harbor Freight = 1.0002"

Which one is correct?


Probably, none of them!
But as Bob said:

There will always be some associated error. The objective is to make any measurement error in our metrology tools less than the tolerable errors in our work.
 
Which one is correct, and how did you determine it? It appears that you are assuming that the DRO is correct and that the dials are wrong, which is not a wise idea.

The dial indicator agrees with a micrometer when measuring gauge blocks on a surface plate. So, I will take the DI reading over the reading from the dials.

I do not have a DRO (yet).
 
Not really; I think none of them are correct, as Bob said.
Not quite what I said...
The reality is that we have no proven idea whether any of them are even close. Having three mics that agree as closely as those do would probably give us confidence, in our home shops, that we were in the ballpark for hitting 1.000" with a .001" tolerance. But in reality, if the mics have not been properly calibrated recently against a KNOWN accurate standard, using proper metrology techniques in a climate-controlled test facility, then we are really just guessing and hoping.

As always, it depends on what we are making. The vast majority of the work most of us do does not need tolerances closer than we can read with a 6" scale, and most of our stuff only needs to fit its mating parts, not hit any specific number. Interchangeability with parts made halfway around the world is where most of the calibrating and testing of measuring tools becomes important. We often care more about a nice sliding or press fit than we do about the actual sizes of the parts.
 
Micrometer standards are good for checking/calibrating dials, DROs and Travadials. They are rarely used, so likely to be the most accurate standard available to us mere mortals. The longer, the better.
 
Exactly! What Bob said this time.
 