Hello community,
I'm currently working on a fluorescence-based assay and am experiencing recovery issues in my samples at low analyte levels. I noticed large residual deviations from the lowest calibration data point. That's why I want to optimize my calibration curve and force it through the lowest calibration level to minimize this problem and get more realistic values. I'm aware that this is far from good scientific practice, but it's better than calculating negative concentration values for samples near the lowest standard point, which completely distorts my results.
Of course, an additional low-level calibration curve would have been the better option... but I can't repeat these past experiments, as I have a lot more to do during my thesis, and I know this approach gives me enough evidence to proceed with my future experiments.
My options for calibration are MS Excel, GraphPad and R (I'm not familiar with coding yet). I can think of a tedious workaround in MS Excel: transform the data so the lowest calibration point becomes the origin, then fix the intercept at zero using MS Excel's built-in option. But I'd be happier with a more elegant way to do this.
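In case it helps anyone discussing this, the Excel workaround above can be written out explicitly: shift the data so the lowest standard sits at the origin, then fit a straight line with no intercept, which has a closed-form least-squares slope. Below is a minimal Python sketch of that idea (the calibration numbers are made up for illustration, not from the assay):

```python
import numpy as np

# Hypothetical calibration data: concentration vs. fluorescence signal
conc = np.array([1.0, 2.0, 5.0, 10.0, 20.0])
signal = np.array([0.8, 2.1, 5.2, 9.8, 19.5])

# Anchor point: the lowest calibration standard
x0, y0 = conc[0], signal[0]

# Shift so the anchor is at the origin, then fit y = m*x through the origin.
# The least-squares slope with no intercept is sum(dx*dy) / sum(dx^2).
dx, dy = conc - x0, signal - y0
slope = np.sum(dx * dy) / np.sum(dx * dx)

def signal_to_conc(s):
    """Invert the forced-through-point line to estimate concentration."""
    return x0 + (s - y0) / slope

# The anchor maps back to itself exactly, so any signal at or above the
# lowest standard's signal yields a concentration >= the lowest standard.
print(signal_to_conc(y0))
```

The same fit in R would be `lm(I(signal - y0) ~ 0 + I(conc - x0))`, using the `0 +` formula syntax to suppress the intercept.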
Thank you!