The number of digits used to perform a given computation. The concepts of accuracy and precision are both closely related and often confused. While the accuracy of a number x is given by the number of significant decimal (or other) digits to the right of the decimal point in x, the precision of x is the total number of significant decimal (or other) digits.
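The distinction can be sketched with a small example (the helper functions below are illustrative, not standard library functions; they treat the number as a decimal string and ignore subtleties such as the significance of trailing zeros):

```python
def accuracy(s: str) -> int:
    """Number of digits to the right of the decimal point."""
    return len(s.split(".")[1]) if "." in s else 0

def precision(s: str) -> int:
    """Total number of significant digits (leading zeros dropped)."""
    digits = s.replace("-", "").replace(".", "").lstrip("0")
    return len(digits)

# For 3.14159: accuracy is 5 (digits after the point),
# precision is 6 (total significant digits).
print(accuracy("3.14159"))   # 5
print(precision("3.14159"))  # 6
```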
In many programming languages, numerical computations are performed with some fixed precision.
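For instance, Python's built-in float is an IEEE 754 double-precision number, carrying a fixed 53 bits of mantissa (roughly 15 to 17 significant decimal digits); computations that need more digits silently lose them:

```python
import sys

# Fixed precision of Python's float type (IEEE 754 double):
print(sys.float_info.mant_dig)  # 53 bits of mantissa
print(sys.float_info.dig)       # 15 guaranteed decimal digits

# Digits beyond that precision are lost, so decimal arithmetic
# that looks exact is not:
print(0.1 + 0.2 == 0.3)         # False
print(f"{0.1 + 0.2:.17f}")      # 0.30000000000000004
```

Languages and libraries that need more digits offer arbitrary-precision types instead, such as Python's decimal.Decimal, where the working precision is a configurable parameter rather than a fixed hardware limit.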