If you are asking how many digits should have been published with regard to accuracy, then I think most would assume there is a reasonable chance that the last digit on the right is off by no more than 1 or 2, such that, for example, 71.026 has at least a 50% chance of really lying somewhere between 71.024 and 71.028, inclusive. Ideally one would expect substantially better than that, but in reality the data you see may be much more misleading, if it is anything like many of the official statistics from US government statistical and related agencies.

Before deciding how many digits you can reasonably 'trust' in data you are given, or data you are to report, you should find out what you can about sampling and nonsampling error, and about variance and bias. Even if you have a confidence interval for a number, the actual accuracy is liable to be worse than the interval suggests. (Note that for a census there is no sampling error, but there may be a great deal of measurement error.)

So, if you had an estimate of 71.026 and a standard error of 0.673, then you know too many digits were reported (perhaps 71 would have been reasonable to report). If you wanted to know whether such a number had changed from a previously reported 71.389 to 71.026, the answer would be that you cannot say with any reasonable certainty, and that in fact those numbers appear to be virtually identical.
I assume that you may have similar challenges with pharmaceutical statistics.
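To make that arithmetic concrete, here is a minimal sketch in Python. It assumes roughly normal sampling error and, for the comparison, that both reported values have a standard error of about 0.673 and are independent; those assumptions are mine, not part of the question.

```python
import math

estimate, se = 71.026, 0.673
previous = 71.389

# Approximate 95% confidence interval for the current estimate.
lower, upper = estimate - 1.96 * se, estimate + 1.96 * se
print(f"95% CI: ({lower:.1f}, {upper:.1f})")   # roughly (69.7, 72.3)

# The uncertainty already swamps the third decimal place, so reporting
# 71.026 overstates the precision; 71 (or at most 71.0) is defensible.

# Comparing the two reported values, assuming both have SE ~0.673 and
# are independent, the standard error of the difference is sqrt(2)*SE.
diff = previous - estimate
se_diff = math.sqrt(2) * se
print(f"difference = {diff:.3f}, z = {diff / se_diff:.2f}")  # z is about 0.4
```

With a z of roughly 0.4, the apparent change of 0.363 is well within ordinary sampling noise, which is why the two numbers should be treated as virtually identical.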
Significant figures (or significant digits) are used to express, in an approximate way, the
precision or uncertainty associated with a reported numerical result. In a sense, this is the most
general way to express “how well” a number is known. The correct use of significant figures is
important in today’s world, where spreadsheets, handheld calculators, and instrumental digital
readouts are capable of generating numbers to almost any degree of apparent precision, which may be
very different from the actual precision associated with a measurement. A few simple
rules will allow us to express results with the correct number of significant figures or digits. The
aim of these rules is to ensure that the final result should never contain any more significant
figures than the least precise data used to calculate it. This makes intuitive as well as scientific
sense: a result is only as good as the data that is used to calculate it (or more popularly, “garbage
in, garbage out”).
Definitions and Rules for Significant Figures
• All non-zero digits are significant.
• The most significant digit in a reported result is the left-most non-zero digit: 359.741
(3 is the most significant digit).
• If there is a decimal point, the least significant digit in a reported result is the rightmost
digit (whether zero or not): 359.741 (1 is the least significant digit). If there is
no decimal point present, the right-most non-zero digit is the least significant digit.
• The number of digits between and including the most and least significant digit is the
number of significant digits in the result: 359.741 (there are six significant digits).
The following table gives examples of these definitions:
      Number        Sig. Digits
A     1.2345 g      5
B     12.3456 g     6
C     012.3 mg      3
D     12.3 mg       3
E     12.30 mg      4
F     12.030 mg     5
G     99.97 %       4
H     100.02 %      5
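As an illustration only (not part of the original guidance), the rules above can be applied mechanically. The helper below is a hypothetical sketch that works on the number as the string actually reported, since trailing zeros are only meaningful in that form; it reproduces the table entries.

```python
def count_sig_figs(reported: str) -> int:
    """Count significant digits in a number exactly as it was reported.

    The string form matters: "12.30" has four significant digits, while
    the float 12.30 cannot be distinguished from 12.3.
    """
    digits = reported.strip().lstrip("+-")
    has_decimal_point = "." in digits
    digits = digits.replace(".", "")

    # Most significant digit: left-most non-zero digit.
    digits = digits.lstrip("0")
    # Least significant digit: right-most digit if a decimal point is
    # present, otherwise the right-most non-zero digit.
    if not has_decimal_point:
        digits = digits.rstrip("0")
    return len(digits)

# The table entries above (units dropped, since they do not affect the count):
for value, expected in [("1.2345", 5), ("12.3456", 6), ("012.3", 3),
                        ("12.3", 3), ("12.30", 4), ("12.030", 5),
                        ("99.97", 4), ("100.02", 5)]:
    assert count_sig_figs(value) == expected, value
```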
Significant Figures in Calculated Results
Most analytical results in ORA laboratories are obtained by arithmetic combinations of numbers:
addition, subtraction, multiplication, and division. The proper number of digits used to express
the result can be easily obtained in all cases by remembering the principle stated above:
numerical results are reported with a precision near that of the least precise numerical
measurement used to generate the number.
Addition and Subtraction
The general guideline when adding and subtracting numbers is that the answer should have the same
number of decimal places as the component with the fewest decimal places:
21.1
2.037
6.13
________
29.267 = 29.3, since the component 21.1 has the fewest decimal places (one)
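A minimal sketch of this rule, using the worked sum above (the helper is my own, purely illustrative, and takes the numbers as reported strings so the decimal places can be counted):

```python
def add_with_least_decimal_places(*reported: str) -> float:
    """Add numbers given as reported strings, rounding the sum to the
    fewest decimal places among the components (illustrative only)."""
    def decimal_places(s: str) -> int:
        return len(s.split(".")[1]) if "." in s else 0

    fewest = min(decimal_places(s) for s in reported)
    return round(sum(float(s) for s in reported), fewest)

print(add_with_least_decimal_places("21.1", "2.037", "6.13"))  # 29.3
```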
Multiplication and Division
The general guideline is that the answer has the same number of significant figures as the
number with the fewest significant figures:
(56 × 0.003462 × 43.22) / 1.684

A calculator yields an answer of 4.975740998, which is reported as 5.0, since one of the
measurements (56) has only two significant figures.
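A corresponding sketch for the multiplication/division rule (again purely illustrative; the rounding helper is mine, not from the guidance):

```python
from math import floor, log10

def round_to_sig_figs(value: float, n: int) -> float:
    """Round a value to n significant figures (illustrative sketch only)."""
    if value == 0:
        return 0.0
    return round(value, n - 1 - floor(log10(abs(value))))

# Significant figures in the inputs: 56 has 2; 0.003462, 43.22, and 1.684 have 4.
result = 56 * 0.003462 * 43.22 / 1.684
print(result)                        # about 4.975740998
print(round_to_sig_figs(result, 2))  # 5.0
```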