Originally Posted by rickt300

It is a good thing you don't have to measure parts to that close a tolerance!

Well in reality 1 millionth of a meter comes out as .0000393, a micron measured by the inch comes out as .0000254



Actually, I have personally measured to better than ±1 micron, and I have seen measurements performed to a higher degree of precision than that. You, on the other hand, have already said that you have no idea how you would measure to those sorts of levels of precision, which rather makes a nonsense of your assertion:

Originally Posted by rickt300
.001mm measures .000025 of an inch, or 25 millionths of an inch, also called a micron. To get a surface finish near a 25th of a 10 thousandth, or .000025 of an inch we would have to send the part out to grind as the surface finish on the part after machining in a lathe would be too rough to measure a micron. In fact I have no idea how you would measure a micron in a machine shop. That said I have never seen the micron symbol on a CNC lathe set to the metric measure. This illustrates the out of sync dimensions of metric. The micron being too small, the millimeter being too large so you have to break it down, the centimeter a semi useful measurement and the meter virtually never used. So yes a micron is smaller than .0001. That said if the machine can measure precisely enough for microns it can also measure .00001, however how a shop would measure the finished part to that tolerance I have no idea.


Your assertion that 0.001 mm is not close to .00004" but rather is .0000393" is a good example of false precision, or spurious precision, and it isn't the first such example in your reasoning. Each of 0.001 mm and .00004" has one significant figure, so comparing them is apples to apples; comparing 0.001 mm to .0000393" is not, because the inch figure claims three significant figures of precision that the millimetre figure never expressed. Another way of putting it: if the best you can measure to is a hundred thousandth of an inch, it is spurious to report to a ten millionth, and nonsense to argue that 7 ten millionths makes any sort of difference.
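
If anyone wants to check the arithmetic rather than argue about it, here is a minimal Python sketch. The helper names mm_to_inch and round_sig are mine, made up for illustration; the only hard fact baked into it is that the inch has been defined as exactly 25.4 mm since 1959:

    from decimal import Decimal

    MM_PER_INCH = Decimal("25.4")  # exact by international definition (1959)

    def mm_to_inch(mm: str) -> Decimal:
        """Convert a millimetre value to inches."""
        return Decimal(mm) / MM_PER_INCH

    def round_sig(value: Decimal, figures: int) -> Decimal:
        """Round a value to the given number of significant figures."""
        if value == 0:
            return value
        # adjusted() gives the power of ten of the leading digit,
        # so the quantum below is the place value of the last kept digit
        quantum = Decimal(1).scaleb(value.adjusted() - figures + 1)
        return value.quantize(quantum)

    one_micron = mm_to_inch("0.001")
    print(one_micron)                # 0.0000393700787... (full context precision)
    print(round_sig(one_micron, 1))  # 0.00004 -- one significant figure
    print(round_sig(one_micron, 3))  # 0.0000394

Rounded to the one significant figure that 0.001 mm actually carries, the conversion comes out at exactly .00004", which is the whole point: .0000393" only looks more precise.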