We have a blue-colored solution whose lambda max should be around 750 nm. For a dilute solution (100 µg/mL), with absorbance up to about 0.7, the lambda max was observed around 740 nm.

But in the concentrated solution, with absorbance greater than 1, the lambda max shifts to 640 nm and decreases further as the absorbance increases.

What is the reason behind this shift in lambda max at higher concentrations?

We are trying to construct a calibration curve, and the absorbance should be measured at the lambda max. But since the lambda max shifts at higher concentrations, we are having a problem. Should we exclude absorbance values greater than 1?

What would be the best solution here?
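For context, the approach we have in mind is to fit the calibration line only over the linear (Beer-Lambert) range, i.e. points with absorbance below 1, and to dilute more concentrated samples before measuring. Below is a minimal Python sketch of that idea; the concentration and absorbance values are hypothetical placeholders, not our actual data:

```python
# Minimal sketch: fit a Beer-Lambert calibration line (A = slope * c + intercept)
# using only points with absorbance < 1, where the response is usually linear.
# The concentration/absorbance values below are hypothetical placeholders.
import numpy as np

conc = np.array([10, 25, 50, 75, 100, 200, 400])                # µg/mL (hypothetical)
absorb = np.array([0.07, 0.18, 0.35, 0.52, 0.70, 1.25, 2.10])   # at 740 nm (hypothetical)

# Keep only the linear range (A < 1); higher-absorbance samples would be
# diluted by a known factor and re-measured instead of being fit directly.
mask = absorb < 1.0
slope, intercept = np.polyfit(conc[mask], absorb[mask], 1)

# Quick linearity check (R^2) on the restricted fit
pred = slope * conc[mask] + intercept
ss_res = np.sum((absorb[mask] - pred) ** 2)
ss_tot = np.sum((absorb[mask] - absorb[mask].mean()) ** 2)
r2 = 1 - ss_res / ss_tot

print(f"slope = {slope:.4f} mL/µg, intercept = {intercept:.4f}, R^2 = {r2:.4f}")

# An unknown with A > 1 would be diluted, measured again, and the dilution
# factor applied to the back-calculated concentration.
```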
