07 March 2019

I'm trying to describe the cross-correlation of a finite-length input signal x[n] with the same signal corrupted by white Gaussian noise. If the signal were infinitely long, the noise would be averaged out by the correlation; for a finite-length signal, however, this is not the case. I would like to derive a formula expressing the impact of the signal length and the noise on the output of the correlation. This is how far I have come:

y[n] = x[n] + v[n]

x[n] ... deterministic signal

v[n] ... zero-mean white Gaussian noise

I would like to calculate R_xy[m].

R_xy[m] = E( x[n] y[n+m] ) = E( x[n] x[n+m] + x[n] v[n+m] ) = R_xx[m] + E( x[n] v[n+m] )
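For intuition, here is a minimal numerical sketch of this decomposition (assuming Python/NumPy; the sinusoidal test signal, noise level sigma_v, and length N are arbitrary placeholders, not part of the problem statement). Because correlation is linear in y, the sample cross-correlation of x with y splits exactly into the clean autocorrelation plus the x-v cross term, and the latter clearly does not vanish for a finite N:

```python
import numpy as np

rng = np.random.default_rng(0)

N = 256                  # finite signal length (placeholder)
sigma_v = 0.5            # noise standard deviation (placeholder)

n = np.arange(N)
x = np.sin(2 * np.pi * 0.05 * n)          # deterministic test signal (arbitrary)
v = sigma_v * rng.standard_normal(N)      # white Gaussian noise
y = x + v

# Biased sample correlations, normalized by the signal length N
r_xy = np.correlate(y, x, mode="full") / N   # cross-correlation of x with y
r_xx = np.correlate(x, x, mode="full") / N   # clean autocorrelation of x
r_xv = r_xy - r_xx                           # residual x-v cross term

print("max |r_xv|:", np.abs(r_xv).max())     # does not vanish for finite N
```

The residual r_xv here is one noise realization of exactly the cross term in question.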

That last term, E( x[n] v[n+m] ), is where I'm stuck: the cross-correlation of a deterministic signal with noise. At first I thought I could pull x[n] out of the expectation, but then I wouldn't get rid of the dependency on n; and if I rewrite it as a time average, you can see that this is not possible either:

R_xv[m] = E( x[n] v[n+m] ) = lim(N→∞) 1/(2N+1) · Σ_{n=-N}^{N} x[n] v[n+m]
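As a quick sanity check on that limit (again just a sketch, with an arbitrary bounded test signal and placeholder noise level), evaluating the time average at a fixed lag for growing window lengths N shows it shrinking toward zero, which is the "averaging out" that a finite-length signal never fully gets:

```python
import numpy as np

rng = np.random.default_rng(2)
sigma_v = 0.5   # noise standard deviation (placeholder)

# Time average (1/(2N+1)) * sum_{n=-N..N} x[n] v[n+m] at lag m = 0,
# evaluated for increasing window lengths N
for N in (10, 100, 1_000, 10_000):
    n = np.arange(-N, N + 1)
    x = np.sin(2 * np.pi * 0.05 * n)              # deterministic test signal
    v = sigma_v * rng.standard_normal(n.size)     # white Gaussian noise
    print(N, np.sum(x * v) / (2 * N + 1))         # shrinks roughly like 1/sqrt(N)
```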

My guess is that the product of some signal and zero-mean Gaussian noise will again be Gaussian noise with a different standard deviation, but how do I write this down?
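For what it's worth, that guess can be made concrete under the assumptions above: with x deterministic and v zero-mean Gaussian, the sample cross term r_xv[m] = (1/N) Σ x[n] v[n+m] is a weighted sum of independent Gaussians, hence itself zero-mean Gaussian with variance (sigma_v² / N²) · Σ x[n]². A small Monte Carlo sketch (same placeholder signal and parameters as before) matches that prediction:

```python
import numpy as np

rng = np.random.default_rng(1)

N = 256                  # finite signal length (placeholder)
sigma_v = 0.5            # noise standard deviation (placeholder)
trials = 10_000          # Monte Carlo repetitions

n = np.arange(N)
x = np.sin(2 * np.pi * 0.05 * n)   # same arbitrary test signal as above

# Sample cross term r_xv[0] = (1/N) * sum_n x[n] v[n] over many noise draws
r_xv0 = np.array([np.dot(x, sigma_v * rng.standard_normal(N)) / N
                  for _ in range(trials)])

print("empirical mean:", r_xv0.mean())                         # ~ 0
print("empirical std: ", r_xv0.std())
print("predicted std: ", sigma_v * np.sqrt(np.dot(x, x)) / N)  # sigma_v*sqrt(sum x^2)/N
```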
