Just about every lab has embraced the modern molecular testing age. But when you buy that new bit of equipment, do you stop to consider the implications for test sensitivity and specificity? Every time the manufacturer "updates" the software, do you re-standardise the test? Do you check for failure over time of the optical system behind that melt curve? (Lamps decay over time.) Do you re-calibrate the test EVERY time you open a new batch of chemicals? Do you cross-check the sensitivity and specificity of all of the machines in your lab? Some manufacturers' machines are better at detection than others. Do you routinely test for lab contamination of reagents (especially problematic in university labs, I found)?
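That last cross-check doesn't need anything fancy: run the same characterised panel of known positives and negatives through each instrument (or each new reagent batch) and re-derive sensitivity and specificity before you trust the results. A minimal sketch of the arithmetic is below; the panel, the machine names and the 95% acceptance threshold are illustrative, not a validated SOP.

```python
def sens_spec(calls, truth):
    """calls, truth: lists of booleans (True = positive call / true positive sample)."""
    tp = sum(c and t for c, t in zip(calls, truth))
    tn = sum((not c) and (not t) for c, t in zip(calls, truth))
    fp = sum(c and (not t) for c, t in zip(calls, truth))
    fn = sum((not c) and t for c, t in zip(calls, truth))
    sensitivity = tp / (tp + fn) if (tp + fn) else float("nan")
    specificity = tn / (tn + fp) if (tn + fp) else float("nan")
    return sensitivity, specificity

# Hypothetical 10-member panel: 6 known positives followed by 4 known negatives.
truth = [True] * 6 + [False] * 4
machine_a = [True, True, True, True, True, False, False, False, False, False]
machine_b = [True, True, True, False, False, False, True, False, False, False]

for name, calls in [("machine A", machine_a), ("machine B", machine_b)]:
    sens, spec = sens_spec(calls, truth)
    flag = "OK" if sens >= 0.95 and spec >= 0.95 else "INVESTIGATE"
    print(f"{name}: sensitivity={sens:.2f} specificity={spec:.2f} -> {flag}")
```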

I used to do lab audits; see https://www.cochrane.org/news/how-accurate-are-routine-laboratory-tests-diagnosis-covid-19 for an example of how this plays out. Of the 67 tests evaluated, only three had both sensitivity and specificity over 50%. That's "heads or tails" territory.
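To put "heads or tails" in numbers: calling samples positive on a coin flip gives an expected sensitivity and specificity of 0.5 each, so Youden's J (sensitivity + specificity − 1) is zero, i.e. the test carries no diagnostic information. A quick illustration, with made-up figures rather than numbers from the review:

```python
# Youden's J = sensitivity + specificity - 1.
# J = 0 is what a coin flip achieves; a usable assay should sit well above that.
def youden_j(sensitivity, specificity):
    return sensitivity + specificity - 1.0

print(youden_j(0.50, 0.50))  # coin flip: 0.0, no diagnostic information
print(youden_j(0.45, 0.48))  # both below 50%: negative J, worse than guessing
print(youden_j(0.98, 0.99))  # a properly validated assay: close to 1.0
```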

If you want to lift your game, read Bustin SA, Benes V, Garson JA, et al. The MIQE guidelines: minimum information for publication of quantitative real-time PCR experiments. Clinical Chemistry. 2009;55(4):611–622, and Smith M. Validating real-time polymerase chain reaction (PCR) assays. Encyclopedia of Virology. 2021:35–44.
