Dear all,

I'm working on (photo)catalytic degradation of organic contaminants. Recently, I noticed that the degradation efficiency was higher at a relatively high initial contaminant concentration (Co = 30 ppm) than at a lower one (Co = 10 ppm). This was surprising, as I expected the opposite at constant catalyst loading: at low Co, the catalyst surface offers ample active sites relative to the number of contaminant molecules, which should give a higher degradation efficiency. Instead, I measured a lower degradation efficiency, and I cannot explain why.
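
For reference, this is the conventional definition of degradation efficiency I am assuming throughout, where C_0 is the initial contaminant concentration and C_t is the residual concentration after irradiation time t:

\eta\,(\%) = \frac{C_0 - C_t}{C_0} \times 100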

I would greatly appreciate it if anyone could offer an explanation for this or point me to relevant literature. I have reviewed many articles, but so far I have come up with nothing.

Thank you in advance ^^
