Lead Error

According to Wikipedia, magnetic scales are limited to .0005 resolution with current technology. Given the state of integrated circuit manufacturing, I can see how etched lines on glass or some other medium of less than .00001" could come, if not already here.

Wikipedia may be a bit out of date. Magnetic scales are available to 0.5 micron (about 0.00002'') resolution, I have 1 micron (about 0.00004'') scales on my machines.
 
OK, a guy has three 0-1" micrometers, a Starrett, a Shars, and a Harbor Freight. He measures a piece of 1" drill rod with all three mics, very carefully, until he is satisfied with the readings he is getting from each:

Starrett = 1.0000"
Shars = .9996"
Harbor Freight = 1.0002"

Which one is correct?
 
Bob, I think he was comparing his machine's dial to a dial indicator. No DRO here.
A dial indicator is not by any means highly accurate for measuring distance. Also, any variation in the angle with which it approaches the work causes cosine error.
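To get a feel for how much cosine error matters, here is a minimal sketch in Python; the angles and the 0.0100" reading are just illustrative numbers, not from any actual setup.

```python
import math

def true_displacement(indicated, angle_deg):
    """Correct a dial-indicator reading for cosine error.

    When the indicator axis is tilted angle_deg away from the direction
    of travel, the indicator reads high; the actual movement is the
    reading times cos(angle).
    """
    return indicated * math.cos(math.radians(angle_deg))

# Example: a 0.0100" indicated movement with the indicator tilted off axis
reading = 0.0100
for angle in (2, 5, 10):
    actual = true_displacement(reading, angle)
    print(f"{angle:>2} deg off axis: reads {reading:.4f}\", actual {actual:.4f}\", "
          f"error {reading - actual:.5f}\"")
```

At a couple of degrees the error is down in the millionths; at ten degrees off axis it is already about .00015" on a ten-thou move.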
 
Guys, I am playing devil's advocate in this thread. Doing things accurately, or even knowing whether we have the means to do so, and to what tolerance, is something all machinists should understand and consider when they are measuring things. Sometimes it matters, often it does not. We also need some "real" idea of the tolerances we are likely working within for everything we measure. Nothing is perfect, and every measurement has multiple ways it can go wrong. Now deal with it, while also understanding it...
 
Wikipedia may be a bit out of date. Magnetic scales are available to 0.5 micron (about 0.00002'') resolution, I have 1 micron (about 0.00004'') scales on my machines.
Then you are correct. Wikipedia is out of date. I thought that was odd, because we have been laying down some really small magnetic strips on hard drives for years. I have no idea how they work. If it's magnetic, why doesn't a magnet damage it? More research is needed. If you will tell me the manufacturer of your scales, I will see that Wikipedia is updated. They depend on public input. Thanks, David
 
Bob, I get what you are saying: basically you need a known standard to compare to. As an example, a gauge block or such that is certified to be a given size, so that there is a known standard to base all measurements from. Is that correct?
 
Bob, I get what you are saying: basically you need a known standard to compare to. As an example, a gauge block or such that is certified to be a given size, so that there is a known standard to base all measurements from. Is that correct?
That would be the hard way. How do you check your gauge blocks? Probably with your most trusted micrometer. Or maybe the other way around. So cut some stock to a given size as verified by that gauge-block-proven micrometer, cut off another few thousandths by the DRO, and check again. If it's within the specs of your DRO, then you're golden. If not, you may need a new micrometer or gauge block. HA!
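Just to put numbers on that cut-and-check idea, here is a rough sketch; every value in it (the sizes and the .0002" DRO spec) is made up purely for illustration.

```python
# Hypothetical numbers for the cut-and-check test described above:
# cut to a verified size, remove a few thousandths per the DRO, re-measure.
start_size   = 0.5000   # inches, verified against the gauge block / micrometer
dro_move     = 0.0050   # inches, amount removed according to the DRO
measured_end = 0.4951   # inches, what the micrometer reads afterwards

dro_spec = 0.0002       # inches, claimed DRO accuracy over this travel (assumed)

observed_removal = start_size - measured_end
error = observed_removal - dro_move
print(f"DRO said {dro_move:.4f}\", part changed {observed_removal:.4f}\", "
      f"disagreement {error:+.4f}\"")
print("within spec" if abs(error) <= dro_spec
      else "outside spec - suspect the DRO, the mic, or the technique")
```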
 
Bob, I get what you are saying: basically you need a known standard to compare to. As an example, a gauge block or such that is certified to be a given size, so that there is a known standard to base all measurements from. Is that correct?
Exactly, though every standard also has a tolerance range. The reality is that we never "know for sure" that we have perfection, and on some level, we never will. We now use wavelengths of light as standards, but cannot measure them accurately and repeatably. But we can know that we are within a certain tolerance of accuracy, or rather that we have equipment that is certified within a certain tolerance, and work our way from there. Take no measurement, no angle, no weight, nothing as perfect. Practicality is another thing altogether, and if we want to get something done, we do what we need to do to get there, as best as we can and based on our needs.
 
Thanks for the reply, Bob. Your comments prompted me to do some research. NOW I realize that my original question was so obvious it should not have been asked. Your statement about DROs needing to be calibrated against a known standard got my attention. Like, what have I got myself into? I have a DRO for my mill that has not been installed, so I pulled out the manual, and the only calibration, other than setting the number of digits to display, was "Direction": choose Left or Right, no explanation. So off to Wikipedia I go. Here's what I learned. Glass scales have etched lines every .0002", and the reader has two sensors. The order in which the sensors pass the etched marks determines the direction. The scale could be mounted upside down and backwards, so you have to tell the controller which sensor is first. That is why the only calibration in my manual is "Direction". Thank you very much for forcing me to do the work. Now I feel I can install my DRO.
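For anyone curious what that two-sensor "Direction" business looks like in logic, here is a minimal sketch of standard quadrature counting in Python; it is a generic illustration, not the firmware of any particular DRO.

```python
# Minimal sketch of quadrature decoding: two sensors (A and B) are offset so
# their pulses arrive in a different order depending on travel direction.
# Valid state changes follow the standard 4-state quadrature sequence.

# Each (previous_state, current_state) pair maps to +1 or -1 count.
TRANSITIONS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def count_pulses(samples, reverse_direction=False):
    """Accumulate position counts from a stream of (A, B) sensor samples.

    reverse_direction plays the role of the DRO's "Direction" setting:
    it swaps which sensor is treated as leading.
    """
    position = 0
    prev = None
    for a, b in samples:
        state = (b << 1) | a if reverse_direction else (a << 1) | b
        if prev is not None:
            position += TRANSITIONS.get((prev, state), 0)
        prev = state
    return position

# One cycle with A leading B reads as +4 counts; flip Direction and it reads -4.
cycle = [(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]
print(count_pulses(cycle))                          # 4
print(count_pulses(cycle, reverse_direction=True))  # -4
```

That sign flip is all the "Left or Right" setting does: it tells the display which sensor order counts as positive travel.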

According to Wikipedia, magnetic scales are limited to .0005 resolution with current technology. Given the state of integrated circuit manufacturing, I can see how etched lines on glass or some other medium of less than .00001" could come, if not already here.
Glass scales are actually ruled in microns, commonly either 5 micron or 1 micron. 5 microns = .00019685" and 1 micron = .00003937". When in inch mode, the metric value is converted to inches to the nearest .0002" for the 5 micron scales. If you watch carefully, you will notice a discontinuity due to rounding error in the reading as you move away from your reference point. The error is at most .0001" and is not cumulative so for all intents and purposes, it can be ignored. However, if you want to subcontract for NASA, you may want to work in metric measurement.
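Here is a quick sketch of that rounding discontinuity on a 5 micron scale displayed to the nearest .0002"; the count range is arbitrary, picked to land where the display repeats a value.

```python
# Sketch of the rounding discontinuity described above: a 5 micron scale
# counts in exact metric steps, but an inch display rounds each position
# to the nearest 0.0002".
MICRON_PER_INCH = 25400.0   # exact: 1 inch = 25.4 mm

def inch_display(counts, step_um=5.0, display_res=0.0002):
    """Convert a raw scale count to the rounded inch readout."""
    true_inches = counts * step_um / MICRON_PER_INCH
    shown = round(true_inches / display_res) * display_res
    return true_inches, shown

for counts in range(29, 34):
    true_inches, shown = inch_display(counts)
    print(f"{counts:2d} counts: true {true_inches:.6f}\"  display {shown:.4f}\"  "
          f"error {shown - true_inches:+.6f}\"")
```

Around counts 31 and 32 the display shows the same .0062" twice even though the scale moved a full 5 micron step; that is the discontinuity, and the error never exceeds .0001" either side of it.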

You should be able to calibrate your scales through the setup menu. On my Grizzly scales, it is called "linear comp."
 
Exactly, though every standard also has a tolerance range. The reality is that we never "know for sure" that we have perfection, and on some level, we never will. We now use wavelengths of light as standards, but cannot measure them accurately and repeatably. But we can know that we are within a certain tolerance of accuracy, or rather that we have equipment that is certified within a certain tolerance, and work our way from there. Take no measurement, no angle, no weight, nothing as perfect. Practicality is another thing altogether, and if we want to get something done, we do what we need to do to get there, as best as we can and based on our needs.
Actually, we can measure the wavelength of light very accurately and with repeatability. That is why the definition of the standard meter from 1960 to 1983 was based on the wavelength of Krypton 86, with a relative uncertainty of .01 ppm. It was replaced in 1983 by the current definition, based on the distance light travels in a vacuum in 1/299,792,458 of a second, with a relative uncertainty of .001 ppm.
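To put those uncertainties in shop terms, here is a quick conversion using the ppm figures quoted above:

```python
# Relative uncertainty (ppm) converted to absolute error over one meter
# (in nanometers) and over a one-inch measurement (in inches).
for label, ppm in (("Kr-86 definition (1960)", 0.01),
                   ("light/second definition (1983)", 0.001)):
    error_per_meter_nm = ppm * 1e-6 * 1e9   # meters -> nanometers
    error_per_inch_in  = ppm * 1e-6         # inches of error per inch measured
    print(f"{label}: {error_per_meter_nm:.0f} nm per meter, "
          f"{error_per_inch_in:.0e} in per inch")
```

Either way the length standard itself contributes errors thousands of times smaller than anything a micrometer or a 1 micron scale can resolve.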

However, you make a very good point and one worth remembering. Nothing can be measured with absolute certainty. There will always be some associated error. The objective is to make any measurement error in our metrology tools less than the tolerable errors in our work.

Many years ago, over fifty, I was working on an optics class experiment measuring the wavelength of light using a Fabry-Perot interferometer. It is capable of measuring a distance to one tenth of a wavelength, enough to warrant a Nobel prize in 1907. For visible light, with a wavelength of around 500 nm, that would amount to an uncertainty of 50 nm, or .05 microns. Not that far from the 1 micron precision of glass scales.
 