Our experience tells us that the intensity of immunoreactive bands on luminograms after chemiluminescent ECL Advance detection does not always correlate with the protein load.
In my experience, "western blot" and "quantification" should rarely be used in the same sentence. Far too many assumptions have to be made when using this method, the most striking of which is that your experimental conditions do not affect the avidity of the antibody for the target protein. For example, if I treat a cell with compound A, will this result in modification(s) of the protein of interest that alter the epitope, or access to the epitope, that the antibody recognizes?
Even if we assume all else is equal, the reliability of any relative quantification of bands on a western blot depends on other factors, such as the abundance of the protein and the percentage difference between each band. I find that only very "dramatic" changes (≥2-fold differences by densitometry) are reliable, though it's not unheard of to detect a 1.5-fold difference consistently.
The problem is that the conventional approach of ECL with photographic film for detecting the emitted light saturates rather easily (once you reach maximum "blackness" on the film, you can no longer quantify your samples). LI-COR sells a machine (the Odyssey) that uses a scanner to detect IRDye-labeled antibodies by fluorescence. Because you can limit the exposure, you are unlikely to run into overexposure issues. Of course, this technique does not remedy non-specific binding of antibodies or other antibody-related artefacts.
Like anything, you probably need a good normalizing control. If you normalize to a housekeeping protein and don't overexpose (the band should still be translucent, not solid black), then I think relative quantification is fine. But you can't compare absolute values between two different proteins, only the change relative to the control.
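To make that concrete, here is a minimal Python sketch of the normalization; the densitometry values and lane layout are hypothetical (e.g. integrated densities exported from ImageJ):

```python
import numpy as np

# Hypothetical integrated densities (arbitrary units), one per lane:
# lane 0 is the untreated control, lanes 1-2 are treated samples.
target  = np.array([12500.0, 18300.0, 24100.0])  # protein of interest
loading = np.array([41000.0, 39500.0, 42200.0])  # housekeeping control band

# Normalize each lane's target signal to its own loading control,
# then express every lane relative to the control lane.
normalized  = target / loading
fold_change = normalized / normalized[0]

for lane, fc in enumerate(fold_change):
    print(f"lane {lane}: {fc:.2f}-fold vs control")
```

As noted above, these numbers are only meaningful within one protein/antibody pair; comparing the absolute values between two different proteins is not valid.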
In any case, the newer digital ECL detection machines (Bio-Rad, GE) do a much better job of producing quantifiable images than regular film blots.
The main thing is to make sure the band is not oversaturated. This is easy on the digital machines, which can flag oversaturation, and they also have a larger linear range of detection than film.
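If you export the raw image, it's easy to run the same saturation check yourself. Here is a small sketch, assuming a 16-bit TIFF from the imager; the filename and the Pillow/NumPy tooling are my choices, not anything the instruments require:

```python
import numpy as np
from PIL import Image

# Load the exported blot image (hypothetical filename, assumed 16-bit).
img = np.array(Image.open("blot.tif"))

# Pixels at the detector ceiling carry no quantitative information;
# any band containing them is oversaturated.
CEILING = 65535  # maximum value of a 16-bit detector; adjust for your imager
saturated = img >= CEILING

if saturated.any():
    pct = 100.0 * saturated.mean()
    print(f"warning: {pct:.2f}% of pixels saturated; re-image at a shorter exposure")
else:
    print("no saturated pixels; the bands are within the detector range")
```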
For film, the best way is to take multiple exposures and then plot their densities on a graph to establish the linear range. Such quantification gives a pretty good measure of levels, though differences below 10-20% are very hard to detect because of the inherent variability of a western.
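Here is a rough sketch of that multiple-exposure approach in Python; the exposure times and densities are made up, and the 15% linearity tolerance is an arbitrary choice:

```python
import numpy as np

# Hypothetical readings of the same band at increasing film exposures:
# exposure time (s) -> integrated density (arbitrary units).
exposures = np.array([5.0, 10.0, 20.0, 40.0, 80.0])
density   = np.array([210.0, 430.0, 850.0, 1550.0, 1900.0])

# Walk up the exposure series and keep points while the density still
# scales roughly proportionally with exposure (within 15% here).
linear = [0]
for i in range(1, len(exposures)):
    expected = density[0] * exposures[i] / exposures[0]
    if abs(density[i] - expected) / expected <= 0.15:
        linear.append(i)
    else:
        break  # film response has flattened off; longer exposures are saturating

slope, intercept = np.polyfit(exposures[linear], density[linear], 1)
print(f"film is linear up to ~{exposures[linear[-1]]:.0f} s")
print(f"within that range: density ≈ {slope:.1f} * seconds + {intercept:.1f}")
```

Only exposures inside that range should be used for densitometry; the last point in the example is exactly the kind of saturated exposure the earlier answers warn about.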