An assay uses 3 repeats to obtain a mean, which is reported as the value for a batch (X) of analyte (it is a biologic, so the assay is complex and time-consuming). An SD for this assay on batch X has been estimated from 100 repeats of the assay over time. All the literature references for allowable total error (TE) use 2 x SD for a 95% CI (ignore bias for now). The argument presented to me is that the 100 assays give us lots of confidence that we can hit the target, i.e. that 2 x SE (where SE = SD/sqrt(100)) is more appropriate for describing the variation of the assay than 2 x SD.
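To make the two calculations concrete, here is a minimal sketch in Python, assuming a normal error model and hypothetical numbers (a true batch value of 100 and an assay SD of 5; neither comes from the actual data). It simulates 100 assay results, each already the mean of 3 repeats, and prints the two disputed quantities side by side:

```python
import numpy as np

# Hypothetical setup: each reported batch value is the mean of 3 repeats,
# and the assay-level SD is estimated from 100 such reported values.
rng = np.random.default_rng(0)
true_value = 100.0   # hypothetical true value for batch X
assay_sd = 5.0       # hypothetical SD of a single reported assay result

# Simulate 100 assay results (each is already a 3-repeat mean).
results = rng.normal(true_value, assay_sd, size=100)
sd_hat = results.std(ddof=1)      # estimated assay SD
se_hat = sd_hat / np.sqrt(100)    # standard error of the 100-assay mean

print(f"2 x SD = {2 * sd_hat:.2f}  # expected spread of ONE new assay result")
print(f"2 x SE = {2 * se_hat:.2f}  # uncertainty of the 100-assay MEAN only")
```

As the sketch shows, the two intervals answer different questions: 2 x SD describes where a single future reported result is likely to fall, while 2 x SE only describes how precisely the long-run mean has been pinned down by the 100 assays.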

Advice and arguments either way, please, before I lose my sanity.

Thanks.
