A ten-thousandth is smaller than (less than) a thousandth.
In machining, we often refer to a ten-thousandth as a "tenth." So to say you want to remove 0.0005", you would say "I will remove 5 'tenths'."
a thousandth = 1/1000
a ten thousandth = 1/10,000
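The two fractions above can be sanity-checked in a couple of lines. This is just an illustrative sketch; the variable names are my own, and values are in inches.

```python
# Values in inches.
thousandth = 1 / 1000        # 0.001"
ten_thousandth = 1 / 10000   # 0.0001", a "tenth" in shop talk

# A ten-thousandth really is the smaller unit.
print(ten_thousandth < thousandth)  # True

# "Remove 5 tenths" means removing 0.0005".
print(5 * ten_thousandth)
```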
Your formula 0.0010 - 0.001 = 0 is an entirely different animal and has to do with significant figures (accuracy).
To be clear: 0.001 = 0.0010 mathematically only.
0.001 =/= 0.0010 empirically; they are not the same thing.
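Python's `decimal` module happens to capture this distinction nicely: the two values compare equal as numbers, but the stored precision (the exponent) differs. A quick sketch:

```python
from decimal import Decimal

a = Decimal("0.001")
b = Decimal("0.0010")

# Mathematically equal...
print(a == b)  # True

# ...but they record different precision: the trailing zero is kept.
print(a.as_tuple().exponent)  # -3 (three decimal places)
print(b.as_tuple().exponent)  # -4 (four decimal places)
```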
In any field that uses sig figs (chemistry, engineering, physics, etc.), the last digit in a measured number is considered uncertain.
So if you have a micrometer that reads to 0.001", you can only assume the digits down through the hundredths place are accurate. In other words, if the reading is 0.017", you can only assume that the actual dimension is 0.01" plus some error in the last digit.
Good thousandths indicators are accurate to 0.0010". The last digit (the ten-thousandths place) is considered inaccurate, so the indicator will only accurately measure to the thousandths place.
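Putting that into practice: if an instrument displays more digits than it can guarantee, you should only keep the digits you can trust. A minimal sketch, assuming a hypothetical reading of 0.0173" from an indicator trustworthy only to 0.001":

```python
# Hypothetical reading: the display shows ten-thousandths,
# but the instrument is only trustworthy to 0.001".
reading = 0.0173

# Keep only the trustworthy digits: round to the thousandths place.
trusted = round(reading, 3)
print(trusted)  # 0.017
```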