Hmm....
I would say it like this:
Take a look at the tachometer in your vehicle (this assumes you have both a vehicle and a tachometer; if you don't, imagine one). The scale on a tachometer usually reads in whole numbers from 1 to about 7, and at the bottom it says x1000. There may be tick marks between the 1 and the 2; often there are three, dividing the RPM into 250-rev chunks. Now, if the needle is between 1 1/2 and 1 3/4, one could assume that your RPM is somewhere from 1500 to 1750. Not very precise, but if your RPM really is within that range, quite accurate.
Now imagine that there are 7 tick marks between the 1 and the 2, dividing the RPM into 125-rev chunks. If the needle is just below the 1 5/8 mark, one could assume that the RPM is just under 1625. If that's true, your tachometer is now both more precise and more accurate. However, if your needle is just below the 1 5/8 mark but your actual RPM is right at 1630, the gauge may be more precise, but it is less accurate.
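If a few lines of code help, here's a minimal sketch of the same idea (Python; the 1630 true RPM and the 10-rev needle error are made-up numbers, not anything from a real gauge). Finer ticks give you a narrower reading, but a narrower reading is easier to miss with:

```python
def read_gauge(needle_rpm, tick_size):
    """Return the interval a reader infers from the needle:
    the two tick marks bracketing where the needle points."""
    low = (needle_rpm // tick_size) * tick_size
    return low, low + tick_size

true_rpm = 1630     # what the engine is actually doing (assumed)
needle_rpm = 1620   # hypothetical gauge error: needle reads 10 rev low

for tick in (250, 125):
    lo, hi = read_gauge(needle_rpm, tick)
    covers = lo <= true_rpm <= hi
    print(f"{tick}-rev ticks: reading {lo}-{hi} RPM, covers {true_rpm}? {covers}")

# 250-rev ticks: reading 1500-1750 RPM, covers 1630? True
# 125-rev ticks: reading 1500-1625 RPM, covers 1630? False
```

The coarse scale is vague but covers the truth; the fine scale is tight but, thanks to the needle error, wrong.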
Generally speaking, as precision increases, accuracy decreases, but this is somewhat subjective. The key is to choose the proper scale for what you are measuring. Don't measure the distance to Cuba in 16ths of an inch. Miles work better, and rounding to the nearest 10 is probably best: 90 miles to Cuba is better than 92 mi 3456' 2 7/16". Likewise, when measuring something small, e.g. monitor size, a smaller scale makes more sense; 15.4" makes much more sense than 1' when the 'actual' size is 15.40125". Then, of course, there is always the nominal size; you don't call a 2x4 a 1 1/2" x 3 1/2", because that's plain dumb. So, the short answer to your question may be, "There is none."
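Same point as quick arithmetic (Python again; the fractional figure is just the one I made up above, not a surveyed distance): convert the over-precise number to miles, then round to the scale the question actually needs.

```python
FEET_PER_MILE = 5280
INCHES_PER_FOOT = 12

# "92 mi 3456' 2 7/16"" expressed in miles
miles = 92 + (3456 + (2 + 7/16) / INCHES_PER_FOOT) / FEET_PER_MILE

print(f"over-precise: {miles:.6f} mi")             # 92.654584 mi
print(f"useful:       {round(miles, -1):.0f} mi")  # 90 mi
```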
v/r
Gooser
Why do today
that which may not need to be done tomorrow
--me