The number of digits used to perform a given computation. The concepts of accuracy and precision are closely related and often confused. While the accuracy of a number x is given by the number of significant decimal (or other) digits to the right of the decimal point in x, the precision of x is the total number of significant decimal (or other) digits. For example, the number 3.14159 has an accuracy of 5 and a precision of 6.
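The distinction above can be sketched with two small helper functions (hypothetical names, assuming the number is given as a plain decimal string with no exponent): accuracy counts digits to the right of the decimal point, precision counts total significant digits.

```python
def accuracy(s: str) -> int:
    """Digits to the right of the decimal point in the string s."""
    return len(s.split(".")[1]) if "." in s else 0

def precision(s: str) -> int:
    """Total significant digits: drop sign, decimal point, leading zeros."""
    digits = s.replace("-", "").replace(".", "").lstrip("0")
    return len(digits)

print(accuracy("3.14159"))   # 5
print(precision("3.14159"))  # 6
```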

In many programming languages, numerical computations are done with some fixed precision.
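A minimal illustration in Python: the built-in float has fixed precision (an IEEE 754 double, roughly 15–17 significant decimal digits), while the standard decimal module lets the working precision be chosen, an instance of arbitrary (user-set) precision.

```python
from decimal import Decimal, getcontext

# Fixed precision: 0.1 and 0.2 are not exactly representable as
# binary doubles, so their sum differs slightly from 0.3.
print(0.1 + 0.2)  # not exactly 0.3

# User-set precision: compute 1/7 to 50 significant digits.
getcontext().prec = 50
print(Decimal(1) / Decimal(7))
```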

See also

Absolute Error, Accuracy, Arbitrary Precision, Error Propagation, Fixed Precision, Relative Error, Significance Arithmetic, Significant Digits

Cite this as:

Weisstein, Eric W. "Precision." From MathWorld--A Wolfram Web Resource.
