After validating the bioanalytical method, when we measure plasma samples by LC-MS/MS we sometimes observe variation in the retention times of both the compound and the internal standard. What variation is acceptable?
Retention time variation is typically monitored in your system suitability check, and the criteria for it can be deduced from your validation experiments, I'd say. Those data will indicate what variation is normal (acceptable) for your chromatographic method and when you should start looking into the causes of retention time variation (i.e., when the system suitability criteria are not met); see the sketch below for one way to derive such a window.
What variation is normal/acceptable also depends on the type of LC (HPLC/UPLC, for example) you're running and the length of your run. What type are you running (isocratic or gradient?) and what kind of variation are you observing?
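If it helps, here is a minimal sketch (Python, with made-up retention times) of how one might derive a system suitability Rt window from replicate validation injections. The mean ± 3 SD convention is an assumption for illustration, not a regulatory requirement; substitute whatever criterion your validation supports.

```python
import statistics

# Hypothetical retention times (min) from replicate validation injections;
# substitute your own validation data.
validation_rts = [4.02, 4.05, 4.03, 4.06, 4.04, 4.03]

mean_rt = statistics.mean(validation_rts)
sd_rt = statistics.stdev(validation_rts)

# One common convention (an assumption here, not a regulation): flag any
# run whose Rt falls outside mean +/- 3 standard deviations.
low, high = mean_rt - 3 * sd_rt, mean_rt + 3 * sd_rt
print(f"Acceptable Rt window: {low:.3f}-{high:.3f} min")

observed_rt = 4.11  # Rt observed in a study sample run
if not (low <= observed_rt <= high):
    print("Rt outside system suitability window - investigate before proceeding.")
```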
Hi, I agree with Eef. Rt might vary if the chromatographic conditions are not exactly identical. I think that if the Rts of the standard(s) and IS are identical to the Rts of the target compound(s) and IS in samples, your system should be OK. By applying SRM or MRM, you can easily confirm the identity and purity of your target compound(s) and IS in samples.
As mentioned, Rt variation should be part of your system suitability.
Also, separate from but related to this topic are the settings used for quantification in your calibration table. Rts are rarely exactly the same, even under "perfect" conditions. The normal variance should be established early in your method development process and should also be reflected in the calibration table settings for each standard. 2.5% is a common value used by regulatory agencies, but you really need to choose a value that is appropriate to your specific method (and using a percentage of Rt is discouraged, because it behaves poorly when some runs are very short and others very long).

The calibration table should have a place where you enter the acceptable identification tolerance for Rt variation during the run (this is needed for correct peak identification). For each standard, a percentage of the Rt is sometimes used, or better still, a start and stop time (peak window) is selected. This should be part of an SOP that you have in-house.
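To see the two identification approaches side by side, here is a minimal sketch (Python; the function names and numbers are illustrative, not taken from any CDS software):

```python
def within_percent_window(observed_rt, expected_rt, tolerance_pct=2.5):
    """Identify a peak if its Rt is within +/- tolerance_pct of the expected Rt."""
    return abs(observed_rt - expected_rt) <= expected_rt * tolerance_pct / 100

def within_time_window(observed_rt, start, stop):
    """Identify a peak if its Rt falls inside an explicit start/stop window (min)."""
    return start <= observed_rt <= stop

expected_rt = 4.04  # expected Rt of the standard (min)
observed_rt = 4.12  # Rt found in the current injection

print(within_percent_window(observed_rt, expected_rt))   # True: 2.5% of 4.04 ~ 0.10 min
print(within_time_window(observed_rt, 3.95, 4.15))       # True: inside the peak window
```

The explicit start/stop window sidesteps the short-run/long-run problem with percentages, since each peak gets a window sized to what is appropriate for that method.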
It depends on what kind of experiment you are performing. If it is a qualitative or specific identification test without an internal standard, the Rt is of great importance and the acceptance criteria should be strict. If it is a quantitative experiment with an IS, I think an Rt that is consistent throughout a single run is OK.
Small variations in Rt are normal, but you should rule out retention time variation that results from a problem with the method or the sample prep. Here is a link to an article on retention time variation that covers common causes as well as some troubleshooting suggestions.
"HPLC Retention Time Drift, Change, Variability or Poor Reproducibility. Common Reasons for it"
In LC, the difference between the retention time of the unknown analyte and that of the reference standard should not exceed 5%. The limit is 2.5% when a relative retention time, calculated relative to an internal standard, is used.
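Expressed as arithmetic, that criterion looks like the following (a minimal Python sketch; the function name and example values are my own, not quoted from any guideline):

```python
def rt_acceptable(rt_sample, rt_reference, use_rrt=False,
                  rt_is_sample=None, rt_is_reference=None):
    """Check the 5% absolute Rt / 2.5% relative Rt (RRT) criteria.

    RRT = analyte Rt / internal standard Rt, compared between the sample
    injection and the reference standard injection.
    """
    if use_rrt:
        rrt_sample = rt_sample / rt_is_sample
        rrt_reference = rt_reference / rt_is_reference
        return abs(rrt_sample - rrt_reference) / rrt_reference <= 0.025
    return abs(rt_sample - rt_reference) / rt_reference <= 0.05

# Analyte at 4.08 min in the sample vs 4.00 min in the reference (2% shift)
print(rt_acceptable(4.08, 4.00))  # True: within the 5% absolute limit

# With an IS that drifted the same way (3.05 vs 3.00 min), the RRT barely
# moves, so the 2.5% relative criterion is comfortably met.
print(rt_acceptable(4.08, 4.00, use_rrt=True,
                    rt_is_sample=3.05, rt_is_reference=3.00))  # True
```

Note how the IS normalization compensates for drift that affects analyte and internal standard alike, which is why the tighter 2.5% limit is workable for relative retention times.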