Our experience tells us that the intensity of immunoreactive bands on luminograms after chemiluminescent (ECL Advance) detection does not always correlate with the protein load.
In my experience, western blots and quantification should rarely be used in the same sentence. Far too many assumptions have to be made when using this method, the most striking of which is that your experimental conditions do not affect the avidity of the antibody for the substrate protein. For example, if I treat a cell with compound A, will this result in modification(s) of the protein of interest that alter the epitope, or access to the epitope, that the antibody recognizes?
Even if we assume all things are equal, the reliability of any relative quantification of bands on a western blot depends on other factors, such as the abundance of the protein and the percent difference between each band. I find that only very "dramatic" changes (2-fold differences by densitometry analysis) are reliable. Nevertheless, it's not unheard of to detect a 1.5-fold difference consistently.
The problem is that the conventional approach of using ECL and photographic film to detect the generated light saturates rather easily (once you reach maximum 'blackness' on the film, you can no longer quantify your samples). LI-COR sells a machine (the Odyssey) that uses a scanner to detect IRDye-labeled antibodies by fluorescence. Because you can limit the exposure, you are unlikely to run into overexposure issues. This technique does not remedy nonspecific binding of antibodies or other antibody-related artefacts, of course.
Like anything else, you probably need a good normalizing control. If you normalize to housekeeping genes and don't overexpose (the band has to be transparent), then I think the relative quantification is good. But you can't compare absolute values between two different proteins, only the change compared to control.
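As a minimal sketch of what that normalization looks like in practice, assuming you already have densitometry values (for example, exported from ImageJ) for the target protein and a housekeeping loading control in each lane; the lane names and intensity values below are hypothetical:

```python
# Hypothetical band intensities (arbitrary densitometry units)
target = {"control": 1200.0, "treated": 2100.0}   # target protein per lane
loading = {"control": 980.0, "treated": 1010.0}   # housekeeping/loading control per lane

# Normalize each lane to its own loading control, then express as
# fold change relative to the control lane.
normalized = {lane: target[lane] / loading[lane] for lane in target}
fold_change = {lane: normalized[lane] / normalized["control"] for lane in normalized}

for lane, fc in fold_change.items():
    print(f"{lane}: {fc:.2f}-fold vs control")
```

Note that the result is only a relative change for one protein across lanes; as said above, it does not let you compare absolute amounts of two different proteins.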
In any case, the new ECL detection machines (Bio-Rad, GE) do a much better job of creating images that can be quantified than regular film blots.
The main thing is to make sure the band is not oversaturated.
This is easy on the digital machines, which can flag oversaturation and also have a larger linear range of detection than film.
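If your imager export doesn't flag saturation for you, a rough check on the band region is easy to do yourself. The sketch below assumes a 16-bit grayscale TIFF and uses the tifffile and numpy packages; the file name and ROI coordinates are hypothetical:

```python
import numpy as np
import tifffile

img = tifffile.imread("blot.tif")        # 16-bit grayscale export from the imager (assumed)
band = img[250:290, 100:180]             # ROI around the band of interest (hypothetical coords)

max_value = np.iinfo(img.dtype).max      # e.g. 65535 for 16-bit
saturated = np.mean(band >= max_value)   # fraction of pixels at the detector ceiling

if saturated > 0:
    print(f"Warning: {saturated:.1%} of band pixels are saturated; re-image at a shorter exposure")
```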
For film, the best way is to take multiple exposures and then plot their density on a graph to find the linear range. Such quantification gives a pretty good measure of levels, though differences below 10-20% are very hard to detect because of the inherent variability in a western.
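A small sketch of that multiple-exposure approach, assuming you have measured band density at several exposure times; the exposure times, densities, and the 10% cutoff below are made up for illustration (numpy and matplotlib assumed installed):

```python
import numpy as np
import matplotlib.pyplot as plt

exposure_s = np.array([5, 10, 30, 60, 120, 300])            # exposure times (s), hypothetical
density    = np.array([0.08, 0.17, 0.49, 0.95, 1.70, 2.10])  # band density (a.u.), hypothetical

# Fit the earliest exposures, then flag longer exposures that fall
# below the fit, i.e. where the film starts to saturate.
slope, intercept = np.polyfit(exposure_s[:4], density[:4], 1)
predicted = slope * exposure_s + intercept
deviation = (predicted - density) / predicted

linear = deviation < 0.10   # within ~10% of the linear fit (arbitrary cutoff)
print("Exposures in the linear range:", exposure_s[linear])

plt.plot(exposure_s, density, "o-", label="measured")
plt.plot(exposure_s, predicted, "--", label="linear fit (first 4 points)")
plt.xlabel("Exposure (s)")
plt.ylabel("Band density (a.u.)")
plt.legend()
plt.show()
```

Only exposures that stay on the linear part of this curve should be used for densitometry.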
So I have to agree with Barrington, and go a bit further. I would say you should almost never use a western blot to perform a quantification assay on anything, no matter what it is. Realistically, a Bradford assay would give you a better quantification of protein concentration than a western blot, because detection in a western relies on too many variables. You are relying not only on absolutely perfect pipetting technique to dispense sample into the gel wells (which is much harder to do with the same precision as pipetting into a 96-well plate, as with a Bradford assay), but also on: a perfect running of the gel (easy enough); a perfect layering of gel to membrane (which may take some practice); completely equal exposure of the membrane to antibody (which, in honesty, would produce negligible differences in intensity unless it was blatant); even exposure of the conjugated-antibody-probed membrane to ambient light (which can be accomplished without incident if you are exacting and diligent); no unanticipated or aberrant delays in getting the membrane to the exposure apparatus (i.e. the apparatus is right next to where you normally process the blot); the thickness of each and every membrane being exactly equal (I am not sure that is physically possible); a perfectly even cassette to compress the membranes as they are exposed to the film (I am not sure of the precise integrity of such equipment, but I am nearly certain every make and model experiences some use-related decay in function); no interruptions in your exposure, i.e. no ambient light accidentally reaching your cassette while it is being exposed (which could happen if the darkroom happens to host students or other people who don't know what they are doing); no fingerprints; the integrity of your film; and your own eyes (the ability to judge the relative intensity of each band, even if they are 1.89 versus 2.01 in relative intensity).

This is a rather hastily derived list; in reality, the number of things that could go wrong with a western is much larger. I would say a western should be used only as a rough quantification method, and only when the anticipated difference would be very easily detectable and precise quantification is not necessary. In other words, I would almost never use a western to quantify protein concentration. There are so many other less expensive, far more reliable, accurate, and precise methods of quantifying protein concentration. A Bradford assay takes a fraction of the time and relies essentially only on your pipetting technique and the accuracy of your spectrophotometer, which in most institutions is never an issue. I would only use a western to quantify proteins if there were a large number of independent proteins whose relative abundance I wanted to compare very roughly for preliminary screening (which essentially borders on 2D PAGE).
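For comparison, a Bradford quantification reduces to a standard curve and a linear fit. The sketch below assumes BSA standards read at 595 nm; all concentrations and absorbance readings are hypothetical, and the 10-fold sample dilution is just an example:

```python
import numpy as np

# BSA standards (mg/mL) and their A595 readings (hypothetical)
std_conc = np.array([0.0, 0.125, 0.25, 0.5, 1.0, 1.5])
std_a595 = np.array([0.00, 0.09, 0.18, 0.35, 0.68, 0.98])

# Linear fit of absorbance vs concentration
slope, intercept = np.polyfit(std_conc, std_a595, 1)

def protein_conc(a595, dilution_factor=1.0):
    """Back-calculate protein concentration (mg/mL) from A595 using the standard curve."""
    return (a595 - intercept) / slope * dilution_factor

# Hypothetical diluted lysate readings
samples = {"lysate A": 0.42, "lysate B": 0.55}
for name, a595 in samples.items():
    print(f"{name}: {protein_conc(a595, dilution_factor=10):.2f} mg/mL")
```

The only real variables here are pipetting and the plate reader or spectrophotometer, which is the point being made above.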
However, I could also imagine using a western blot if you wish to identify several proteins that are encoded by the same nucleotide sequence but undergo different post-translational modifications, and, for example, you wish to use only one antibody that recognizes all of them. If you customize the gel used in the assay, you can separate proteins of only slightly different molecular weight and identify them with a single antibody, as long as it recognizes each protein. Again, this only provides preliminary quantitative data that you would have to confirm with further testing (e.g. a Bradford assay with repeated results, an ELISA, an immunoblot, etc.).
All in all, a western for quantification? Almost never, and never without confirmation. A western for differentiating homologous proteins of differing molecular weight? Yes, but again, only preliminarily. Either repeat the experiments or use a better method of quantification. There are just so many things that could go wrong with a western.