In colorimetric sensing using a spectrophotometer, replicates (n = 11) of a specific sample at a given concentration may produce a large standard deviation in absorbance. How can such errors be controlled?
It may depend on the details of your spectrometer and on the placement and type of sample. Usually we think T + R + A = 1, with T transmission, R reflectance, and A absorption, but T + R + A + S = 1 is really more accurate, with S = scattering. The scattered light bounces around inside the instrument and is not collected by the detector. So if you have strongly scattering samples, you might want to investigate that aspect.
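To make that concrete, here is a minimal Python sketch (with hypothetical values, assuming scattered light simply never reaches the detector) of how even a small, run-to-run variation in scattering loss inflates and scatters the apparent absorbance:

```python
import numpy as np

def apparent_absorbance(true_absorbance, scatter_fraction):
    """Apparent absorbance when part of the beam is lost to scattering.

    Assumes the scattered fraction simply never reaches the detector,
    so the measured transmittance is T_true * (1 - scatter_fraction).
    """
    t_true = 10.0 ** (-true_absorbance)
    t_measured = t_true * (1.0 - scatter_fraction)
    return -np.log10(t_measured)

# A few percent variation in scattering loss alone shifts the reading:
a_true = 0.500
for s in (0.00, 0.02, 0.05):
    print(f"scatter loss {s:.0%}: apparent A = {apparent_absorbance(a_true, s):.4f}")
```

With these numbers the apparent absorbance drifts from 0.500 to roughly 0.52 as the scattering loss varies from 0 to 5%, without any change in the actual chemistry.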
In many cases you have a double-beam spectrometer, and you may use a reference sample or take a reference measurement first. You might think the reference measurement only characterizes the reference sample, but it also measures the apertures and physical design of the instrument. This can be problematic, because in some cases it is essentially a measurement of how you place the sample in the instrument.
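As a rough illustration (a sketch with made-up detector counts, not any instrument's actual readout), the reported absorbance is just A = -log10(I_sample / I_ref), so anything that perturbs either intensity, including how the cuvette is seated, shows up directly in A:

```python
import numpy as np

def absorbance(i_sample, i_reference):
    """Absorbance from raw detector intensities: A = -log10(I_sample / I_ref).

    Note: I_ref is not purely the blank; it also folds in the apertures,
    beam geometry, and sample placement. If the sample is not placed the
    same way as the reference, that mismatch appears directly in A.
    """
    return -np.log10(np.asarray(i_sample) / np.asarray(i_reference))

# Hypothetical counts: same solution, cuvette re-seated between runs.
i_ref = 10000.0
print(absorbance(3162.0, i_ref))  # ~0.500
print(absorbance(3100.0, i_ref))  # ~0.509 -- placement shift alone
```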
The slit widths and the size of any apertures at the sample affect both the wavelength resolution and the signal-to-noise ratio. You can trade these off: with wider slits you may reduce your variation from measurement to measurement, at the cost of spectral resolution.
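A toy model of that trade-off (not calibrated to any real instrument; the boxcar slit function and the 1/sqrt(slit width) photon-noise scaling are assumptions) shows the effect: widening the slit lowers the baseline noise but also blurs and shortens a narrow absorption band.

```python
import numpy as np

rng = np.random.default_rng(0)
wl = np.linspace(400, 500, 1001)             # wavelength grid, nm
true = 0.8 * np.exp(-((wl - 450) / 3) ** 2)  # narrow absorption band

def measure(slit_nm):
    """Toy model: a wider slit gives a broader instrument function but
    more light on the detector, so noise drops ~ 1/sqrt(slit width)."""
    width_pts = max(1, int(slit_nm / (wl[1] - wl[0])))
    kernel = np.ones(width_pts) / width_pts   # boxcar slit function
    blurred = np.convolve(true, kernel, mode="same")
    noise = rng.normal(0, 0.01 / np.sqrt(slit_nm), wl.size)
    return blurred + noise

for slit in (0.5, 2.0, 8.0):
    spec = measure(slit)
    print(f"slit {slit} nm: peak A = {spec.max():.3f}, "
          f"baseline sd = {spec[:200].std():.4f}")
```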
Making really precise absorption measurements can be tricky, especially at high absorbances. My guess is that for a lot of work, people just set their instrument up in a standard configuration, place their samples in a standard way, and then track a relative result from one measurement to the next. The absolute value of the measurement may not be exactly correct, but one can follow trends.
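If the goal is tracking relative trends, a sensible first step is simply quantifying the replicate spread and screening for placement-type outliers. A minimal sketch with hypothetical values (the 2-sd cutoff is just an illustrative choice):

```python
import numpy as np

# Hypothetical replicate absorbances (n = 11) for one concentration.
reps = np.array([0.512, 0.498, 0.505, 0.530, 0.495,
                 0.501, 0.508, 0.499, 0.551, 0.503, 0.497])

mean, sd = reps.mean(), reps.std(ddof=1)
cv = 100 * sd / mean
print(f"mean = {mean:.4f}, sd = {sd:.4f}, CV = {cv:.1f}%")

# Simple outlier screen: flag replicates more than 2 sd from the mean
# (often a mis-seated cuvette or a bubble rather than real chemistry).
z = np.abs(reps - mean) / sd
print("suspect replicates:", reps[z > 2])
```

Here the 0.551 replicate is flagged; re-measuring such points after re-seating the cuvette is often enough to tell instrument placement apart from genuine sample variability.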