Dear fellow Researchers,

I am wondering what difference in standard deviation and coefficient of variation I should expect between an hourly time series with a relatively large mean (say, 900 MW) and one with a low mean power demand, such as 50 MW.

I am aware that this will depend strongly on the consumer type (residential, commercial, industrial, etc.), but let's assume we are considering a proportional mix of them.

So far I have been working with two time series: one with a mean of 20.5 GW and the other close to 1 GW. Their coefficients of variation were 15% and 20%, respectively, so both can be considered quite smooth.
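
For clarity, by coefficient of variation I mean the ratio of the standard deviation to the mean, which is dimensionless and therefore lets us compare series with very different mean levels. Below is a minimal Python sketch of how I compute it, assuming the hourly demand is held in a NumPy array; the synthetic data here is purely illustrative, not from my actual series:

```python
import numpy as np

# Hypothetical hourly demand series in MW (random values for illustration only)
rng = np.random.default_rng(0)
demand = 900 + 135 * rng.standard_normal(8760)  # one year of hourly values

mean = demand.mean()      # mean power demand (MW)
std = demand.std(ddof=1)  # sample standard deviation (MW)
cv = std / mean           # coefficient of variation (dimensionless)

print(f"mean = {mean:.1f} MW, std = {std:.1f} MW, CV = {cv:.1%}")
```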

Thank you in advance for your comments and help on that matter. 
