I'm quantifying proteins on western blots, using a chemiluminescent ECL reagent and a CCD-based imager (Carestream) to visualize the bands. I can use exposure times ranging from seconds to minutes. Shorter exposures give a greater dynamic range between "low" and "high" signal bands, but much more noise in the densitometric plots (on both the peaks and the background, measured in ImageJ). Longer exposures give very clean-looking bands with essentially noise-free densitometry, but a lower dynamic range. Looking at the histograms, the longer exposures don't appear to be over-saturated: no pixels sit at the maximum intensity.
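In case it helps frame the question, here is a minimal sketch (in Python rather than ImageJ) of the two checks I'm describing: whether any pixels are clipped at the detector maximum, and a rough band signal-to-noise estimate per exposure. It assumes the imager exports 16-bit TIFFs; the file names, band/background coordinates, and bit depth are illustrative assumptions, not my actual data.

```python
import numpy as np
import tifffile  # pip install tifffile

def saturation_fraction(image, bit_depth=16):
    """Fraction of pixels sitting at the maximum gray value (clipping check)."""
    max_val = 2 ** bit_depth - 1
    return np.mean(image == max_val)

def band_snr(image, band_box, bg_box):
    """Crude SNR: (band mean - background mean) / background standard deviation."""
    r0, r1, c0, c1 = band_box
    b0, b1, d0, d1 = bg_box
    band = image[r0:r1, c0:c1].astype(float)
    bg = image[b0:b1, d0:d1].astype(float)
    return (band.mean() - bg.mean()) / bg.std()

# Hypothetical exposures exported from the imager
for path in ["blot_10s.tif", "blot_60s.tif", "blot_300s.tif"]:
    img = tifffile.imread(path)
    sat = saturation_fraction(img)
    snr = band_snr(img, band_box=(100, 140, 200, 260), bg_box=(100, 140, 300, 360))
    print(f"{path}: {sat:.4%} pixels saturated, band SNR ~ {snr:.1f}")
```

The idea would be to pick the longest exposure that still shows zero (or near-zero) saturated pixels while maximizing the band SNR, but I'd like to know whether that reasoning is sound.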

Is there a way to determine the best exposure time? Am I on the right track by looking at the histograms, etc.?

Any suggestions would be greatly appreciated.
