I ran an experiment with a reference gene (3 biological replicates × 3 technical replicates) and normalized the expression of my target genes to the reference gene. I decided to present the data as log2 fold changes, which in my opinion visualizes this kind of data better (and of course helps with normalization). The problem is that I can't find anywhere how to calculate the standard error of normalized, log-transformed data. My workaround was to calculate a 95% confidence interval and convert its limits to log2. My questions: is this approach sound, is it acceptable to present the data this way, and how can I calculate the SE of data transformed like this?
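To make it concrete, here is a minimal sketch (in Python, with made-up per-replicate log2 fold-change values, not my real data) of the kind of calculation I suspect might be the answer: computing the mean, SE, and 95% CI directly on the log2 scale, rather than computing them on the linear scale and converting afterwards.

```python
import math
import statistics

# Made-up log2 fold changes, one value per biological replicate (n = 3);
# each would come from averaging the 3 technical replicates first.
log2_fc = [1.8, 2.1, 2.4]

n = len(log2_fc)
mean_fc = statistics.mean(log2_fc)

# Standard error on the log2 scale: sample SD / sqrt(n)
se_fc = statistics.stdev(log2_fc) / math.sqrt(n)

# 95% CI on the log2 scale, using the t critical value
# for n - 1 = 2 degrees of freedom (~4.303)
t_crit = 4.303
ci = (mean_fc - t_crit * se_fc, mean_fc + t_crit * se_fc)

print(f"mean log2 FC = {mean_fc:.2f}, SE = {se_fc:.2f}, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
```

Is working on the log2 scale throughout like this statistically valid, or is converting a linear-scale CI (as I did) the right way?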
Thank you