Generally, when the absorbance reaches about 1.00. That is the top end of your linearity curve, where the relationship between absorbance and concentration given by A = abc (Beer's law) starts to become nonlinear.
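To make the relation concrete, here is a minimal Python sketch of A = abc; all numerical values (absorptivity, path length, concentration) are made up purely for illustration:

```python
# Beer's law: A = a * b * c
# a = absorptivity (L mol^-1 cm^-1), b = path length (cm), c = concentration (mol/L)
a = 15000.0   # hypothetical molar absorptivity of the chromophore
b = 1.0       # standard 1 cm cuvette
c = 5.0e-5    # hypothetical concentration in mol/L

A = a * b * c
print(f"Predicted absorbance A = {A:.2f}")   # 0.75, within the usual linear range

# Inverting the same relation to estimate concentration from a measured absorbance:
A_measured = 1.00
c_est = A_measured / (a * b)
print(f"Estimated concentration = {c_est:.2e} mol/L")
```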
I don't think it is always so obvious as to say that A = 1.00 is the limit value. It depends on many factors, mainly the concentration of the analyte containing the chromophore. It has been demonstrated that concentrations higher than 0.01 M cause deviations from the Bouguer-Lambert-Beer law (aggregation of molecules, interactions with solvent molecules, etc.). There are also the electronic noise properties (particularly Johnson noise) of each spectrophotometer's modules, above all the detector, so it matters whether it is a CCD, a diode array, or a phototube. Using a suitable detector (one characterized by lower noise) you can widen the optimum absorbance interval to, for example, 0.1-1.8. For more information, you can check the book by L. Sommer: Analytical Absorption Spectrophotometry in the Visible and Ultraviolet (1989).
Optimal optical density (OD) values for reliable measurements must be lower than 2. Depending on the manufacturer of the UV-Vis spectrophotometer, the more acceptable OD range may be 0.1 to 1.5.
Useful information can be found in the paper by Myers et al., BMC Biophysics 2013, 6:4 (http://www.biomedcentral.com/2046-1682/6/4), which shows that scattering can reduce the range over which OD behaves linearly, and that this range also depends on the wavelength and the spectrophotometer used for the measurement.
If a sample gives OD = 3, the transmission of light through the sample is only 0.1%. In other words, only 1 photon out of every 1000 reaches the detector. For OD = 1 and OD = 2, the transmittance through the sample is 10% and 1%, respectively.
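These percentages follow directly from T = 10^(-A); a short sketch just to make the numbers explicit:

```python
# Transmittance from absorbance (OD): T = 10**(-A), %T = 100 * T
for A in (1.0, 2.0, 3.0):
    T = 10 ** (-A)
    print(f"OD = {A:.0f}  ->  %T = {100 * T:g}%  (about 1 photon in {round(1 / T)} detected)")
# OD = 1 -> 10%, OD = 2 -> 1%, OD = 3 -> 0.1% (1 photon in 1000)
```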
For most spectrophotometric samples, the O.D. should be 1 or less, although it can be up to 3. If a sample has an O.D. greater than 3, fewer than 1 photon out of 1000 reaches the detector, and even in the most sophisticated instrument such a small amount of light is very hard to detect accurately above the background noise. Measurements above 3 O.D. therefore carry greater error and are in turn less accurate, so it is always recommended to dilute samples with an O.D. greater than 3.
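If you do dilute, a rough starting point is to divide the measured O.D. by the O.D. you are aiming for; this is only an estimate, since a reading above 3 is itself unreliable and the sample may not obey Beer's law at that density. A small hypothetical helper:

```python
def dilution_factor(od_measured, od_target=1.0):
    """Factor by which to dilute so the expected OD lands near od_target.
    Assumes the sample still responds linearly, which is often not true at high OD."""
    return od_measured / od_target

print(dilution_factor(3.2))        # dilute ~3.2-fold to land near OD = 1
print(dilution_factor(3.2, 0.5))   # or ~6.4-fold to land near OD = 0.5
```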
Could anyone who is using the Genesys 20 Thermo Scientific spectrophotometer tell me by what factor you multiply to get the answer in cells/ml? I normally rely on 1.3 x 10^7, but some suggest using 1 x 10^8. Which one is correct for this specific instrument? Thanks!
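For what it is worth, the arithmetic either way is just OD multiplied by a calibration factor; the sketch below only shows how far apart the two suggested factors land, and is not a recommendation for this instrument. The factor really has to be calibrated for your own instrument and organism against direct counts (hemocytometer or plate counts):

```python
# Illustration only: both factors below are the ones quoted in the question,
# not verified values for the Genesys 20.
od600 = 0.45  # example reading
for factor in (1.3e7, 1.0e8):
    print(f"factor {factor:.1e} -> {od600 * factor:.2e} cells/mL")
```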