In most papers, the formula for outdated CSI is given as
$$h = \rho\, h^* + \sqrt{1-\rho^2}\, v$$
where $h^*$ is the outdated version of $h$ and $v$ is a random variable with zero mean and variance $\sigma^2$.
But when I implement this in MATLAB (Monte Carlo simulation) and vary the value of $\rho$, there is no change in the performance of the system.
Is there anything else I need to include? Can anybody help?
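For reference, here is a minimal sketch of how I generate the channels. It is written in Python rather than MATLAB just for illustration; the sample count `N`, the choice `rho = 0.5`, and the unit-variance (Rayleigh) assumption on $h^*$ and $v$ are mine:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100_000          # number of Monte Carlo samples (assumption)
rho = 0.5            # correlation between current and outdated channel

# Outdated channel h* and innovation v, both CN(0, 1) (unit-variance assumption)
h_star = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)
v      = (rng.standard_normal(N) + 1j * rng.standard_normal(N)) / np.sqrt(2)

# Current channel per the outdated-CSI model
h = rho * h_star + np.sqrt(1 - rho**2) * v

# Sanity checks: cross-correlation E[h h*'] ≈ rho, but the marginal
# variance of h stays ≈ 1 for every rho, since rho^2 + (1 - rho^2) = 1
print(np.abs(np.mean(h * np.conj(h_star))))  # ≈ rho
print(np.var(h))                             # ≈ 1 regardless of rho
```

Note the second check: with this model the marginal distribution of $h$ does not depend on $\rho$, which may be related to why my outage curves do not move.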
Edit:
$\rho$ is the correlation coefficient between the current and outdated channels, and its value lies between 0 and 1.
My performance metric is the outage probability of the system.