I have 10 years of monthly returns. I calculated the annualized return by multiplying the mean monthly return over the period by 12. Then I calculated the excess return as the difference between the annualized mean return and the benchmark's annualized mean return (no risk correction). Annualized standard deviations are obtained by multiplying the monthly standard deviations by sqrt(12). The number of data points n is the same for both series. Now I want to test the hypothesis that the difference of the two means (the excess return) is greater than 0. Should I use annualized data or not? If I use annualized data, then the variances should also be annualized, but the resulting t-statistic differs depending on how I compute it (annualized or not).
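
Here is a minimal sketch of what I mean (r and b are placeholder names for my monthly portfolio and benchmark return vectors, both 120x1):

    m  = mean(r) - mean(b);                % monthly excess return
    se = sqrt(var(r)/120 + var(b)/120);    % standard error of the difference (monthly units)
    t_monthly = m / se;
    t_annual  = (12*m) / (sqrt(12)*se);    % mean scaled by 12, std by sqrt(12)
    % t_annual equals sqrt(12)*t_monthly, so the two versions disagree by a factor of sqrt(12)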

The mean is 0.0033 and the sample standard deviation is 0.0225; the benchmark mean is 0.0065 and the benchmark standard deviation is 0.0197 (non-annualized data). I have 120 data points (10 years). I want to test the significance of (0.0033-0.0065), or equivalently of (0.0033*12-0.0065*12). Can you give me the formula I should use in this case, and also the result? Should I assume the variances are different?
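
If I understand the unequal-variance (Welch) formula correctly, this is how I would compute the test from the summary statistics above (one-sided; please correct me if this is wrong):

    m1 = 0.0033; s1 = 0.0225;   % portfolio monthly mean and std
    m2 = 0.0065; s2 = 0.0197;   % benchmark monthly mean and std
    n  = 120;
    se = sqrt(s1^2/n + s2^2/n);
    t  = (m1 - m2) / se;        % roughly -1.17
    % Welch-Satterthwaite degrees of freedom (roughly 234 here)
    df = se^4 / ((s1^2/n)^2/(n-1) + (s2^2/n)^2/(n-1));
    p  = 1 - tcdf(t, df);       % one-sided p-value for H1: excess return > 0

With the raw monthly series available, ttest2(r, b, 'Tail', 'right', 'Vartype', 'unequal') should give the same result.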

I'm using Matlab. Thank you in advance for your answer.
