04 November 2017

Hi All,

I am working on the development of an indirect ELISA in which a certain antigen is coated onto the plate in order to detect serum antibodies against that antigen. I am currently optimizing the antigen coating concentration and the detection antibody concentration using a checkerboard titration, and I would like to select these concentrations such that the signal plateaus at about 2.0 OD.

So far I have performed two experiments in which I titrated both the coating antigen and the detection antibody, and I also included titrations of a highly positive and a negative serum. The problem is that at low dilutions of my positive serum (1/10, 1/50, 1/250) the OD signal is higher than the reader can measure (>3; signal overflow), even at a 1/80,000 dilution of my detection antibody and a coating concentration of 0.125 µg/ml. As a result, I am not able to reach a plateau within the measuring range of the instrument.

I am wondering what I could do to reduce the OD signal so that it falls within the dynamic range of the reader. I could try diluting the coating and/or the detection antibody even further, but wouldn't I then risk losing sensitivity with serum samples that are less positive than the high-titer serum I am currently using for optimization? Or might it help to switch to a less sensitive TMB substrate, or to shorten the TMB incubation time?
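(As an illustrative aside, not something described in the post itself: one common way to judge where the plateau of a serum dilution series sits is to fit the curve to a four-parameter logistic (4PL) model and read off the estimated upper asymptote. Below is a minimal Python sketch of such a fit using scipy; the dilution factors and OD values are hypothetical placeholders, and overflowed wells are simply capped at 3.0 here, which will bias the estimated plateau downward.)

```python
# Minimal sketch: fit a 4PL curve to a serum dilution series to estimate
# the upper plateau. All numbers below are hypothetical placeholders.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(log_dilution, bottom, top, ec50, hill):
    """4PL model: OD as a function of log10(serum dilution factor)."""
    return bottom + (top - bottom) / (1.0 + 10.0 ** ((ec50 - log_dilution) * hill))

# Hypothetical five-fold dilution series (1/10, 1/50, 1/250, ...) and ODs;
# 3.0 marks wells where the reader overflowed.
dilutions = np.array([10, 50, 250, 1250, 6250, 31250, 156250])
od = np.array([3.0, 3.0, 2.9, 2.4, 1.5, 0.6, 0.2])

x = np.log10(dilutions)
# Initial guesses: background, plateau, mid-point (log dilution), slope.
p0 = [0.1, 3.0, float(np.median(x)), -1.0]
params, _ = curve_fit(four_pl, x, od, p0=p0, maxfev=10000)
bottom, top, ec50, hill = params
print(f"Estimated plateau (top): {top:.2f} OD, mid-point at ~1/{10**ec50:.0f}")
```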

Any advice is much appreciated. Thanks!
