Hello everyone,

For my master's thesis, I have been testing a non-traditional method for sulfite determination via photometry. It is based on the reaction of sulfite with fuchsin reagent and formaldehyde (Pachmayr, 1964).

I made four "samples" (standard solutions) with varying concentrations of sulfite and tried to observe the continuous oxidation of sulfite to sulfate (as sulfite oxidizes, its concentration decreases and the colour intensity and absorption decrease). Thus I recorded a series of wavelength scans of the solutions over a couple of days.

However, I realised that the extinction maxima are shifting. When I measure a series of standards, not all the maxima fall at the same wavelength. Additionally, the longer I let the solutions stand, the further the maxima move toward shorter wavelengths.
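For completeness, this is roughly how I locate the extinction maximum in each scan (a minimal sketch, assuming each scan is exported as two columns of wavelength and extinction; the file name and format are only an assumption for illustration):

import numpy as np

# Each scan exported as two columns: wavelength (nm), extinction.
# The file name and layout are assumptions, not my actual export.
scan = np.loadtxt("standard_100ugL_t0.txt")
wavelengths, extinctions = scan[:, 0], scan[:, 1]

# Wavelength at which the measured extinction peaks.
lam_max = wavelengths[np.argmax(extinctions)]
print(f"Extinction maximum at {lam_max:.0f} nm")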

To give an example, here is the first series (t = 0):

c(SO32-, t=0) [microg/L]    Lambda(max. ext.) [nm]
100                         578
500                         584
1000                        584
1600                        584

At a later time t = x, the extinction maxima of the same standards have shifted to 576, 577, 578, and 579 nm, respectively.

I need to establish a relationship between the sulfite concentration and its extinction in order to quantify the oxidation of sulfite, and this shift confuses me because I do not know which wavelength I should choose for the calculations.
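To make the calibration step concrete, here is a minimal sketch of what I intend to do: a linear least-squares calibration at one fixed evaluation wavelength. The choice of 580 nm and the extinction values below are placeholders for illustration, not my measurements:

import numpy as np

# Concentrations of the standards (microg/L), from the series above.
conc = np.array([100.0, 500.0, 1000.0, 1600.0])

# Extinctions read at one fixed wavelength (e.g. 580 nm).
# These values are placeholders, NOT real measurements.
ext = np.array([0.05, 0.24, 0.49, 0.78])

# Least-squares straight line E = m*c + b (Beer-Lambert behaviour
# would predict an approximately linear relationship).
m, b = np.polyfit(conc, ext, 1)
print(f"slope = {m:.3e}, intercept = {b:.3f}")

# Invert the calibration to estimate an unknown concentration.
def conc_from_ext(E):
    return (E - b) / m

print(f"E = 0.40 -> c = {conc_from_ext(0.40):.0f} microg/L")

The open question is which wavelength to fix for reading off the extinctions, given that the maxima drift.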

I cannot find any literature that addresses this. If you have any ideas, explanations, or references, please point me to them.

Best regards
