Klaas Robers wrote:
The current calculator is far too precise. Who needs all those decimals?
True: when you consider that most electronic devices use 10% resistors, and often capacitors of 20% tolerance or worse, it all takes on a sense of perspective. Discrete transistor gain... we won't even go there! A lot of variation is ironed out by negative feedback, but that's also subject to component tolerances. Precision, should you need it, is usually the realm of crystals and number-crunching chips. I believe the entire Apollo program was calculated to fewer than eight decimal places, and in large parts fewer than four.
There are of course many places where accuracy is required. If you set up a chat/schedule on 144 MHz, there's little chance of success if you're 25 kHz off frequency, or about 0.017% out.
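Just to show the sum, here's a quick back-of-the-envelope in Python; the numbers are only the ones quoted above:

    # Fractional error of a 25 kHz offset on a 144 MHz schedule
    offset_hz = 25e3
    carrier_hz = 144e6
    print(f"{offset_hz / carrier_hz * 100:.4f} %")  # prints 0.0174 %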
There are times and places where you may need precision, but often you don't; in a large percentage of cases you're more interested in the trend than the absolute numbers.
For example, I've often used the Maxim DS18B20 direct-to-digital temperature sensor. It looks for all the world like a BC547, but it reports its temperature with a resolution of one-sixteenth of a degree Celsius, with an accuracy of only half a degree between -10 and +85 °C, and worse outside those limits. So you can see the trend even if you're not absolutely sure of the actual measurement.
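For what it's worth, here's the sort of minimal sketch I'd use to read one, assuming a Raspberry Pi with the standard w1-gpio/w1-therm kernel modules loaded (that's the usual Linux 1-Wire interface; your device ID will differ):

    import glob

    def read_ds18b20_celsius():
        # DS18B20 devices appear under the 1-Wire family code "28-".
        device = glob.glob("/sys/bus/w1/devices/28-*/w1_slave")[0]
        with open(device) as f:
            lines = f.read().splitlines()
        # The second line ends in "t=NNNNN": temperature in milli-degrees C,
        # reported in 1/16-degree steps but only good to +/-0.5 degrees.
        return int(lines[1].split("t=")[1]) / 1000.0

    print(f"{read_ds18b20_celsius():.1f} C")

Plenty of resolution to watch a trend; just don't read too much into the last digit.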
In other words, how much accuracy is actually required?
Steve A.
Some years ago I met an American guy in my local pub. He was confounded by degrees Celsius in weather forecasting. I told him a simple rough rule of thumb: double the Celsius, then add 30. It's near enough to know if it's going to be freezing, cold, cool, nice, warm, hot or ***** hot. An example where absolute accuracy isn't really required. If you can multiply Celsius by 1.8 and then add 32 in your head after a few beers, you're better than I am!
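If you're curious how far the pub rule drifts, a few lines of Python will show it (nothing clever, just both formulas side by side):

    # Exact conversion versus the "double it and add 30" approximation
    for c in range(-10, 41, 10):
        exact = c * 1.8 + 32
        rough = c * 2 + 30
        print(f"{c:4d} C: exact {exact:5.1f} F, rough {rough:3d} F, error {rough - exact:+.1f} F")

Spot on at 10 C and only a handful of degrees out across normal weather, which is all you need to pick between cold, nice and hot.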