Let Z1 and Z2 be two Gaussian random variables, each with mean 0 and variance 1, such that Cov(Z1,Z2) = -0.5. Let nu1 > 0 and q > 0 (with q^2 ≈ 2), and let nu2 be a real parameter. I'm looking at the conditional probability

P((Z1+nu1)^2>q^2 | Z1+nu1>0, Z2+nu2>0)

I would like to prove that this probability is increasing in nu2.

The conditional probability can be simplified to

P(Z1>q-nu1, Z2>-nu2) / P(Z1>-nu1, Z2>-nu2)
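For reference, the algebra behind this simplification only uses q > 0:

$$
P\bigl((Z_1+\nu_1)^2>q^2 \,\bigm|\, Z_1+\nu_1>0,\ Z_2+\nu_2>0\bigr)
= \frac{P\bigl((Z_1+\nu_1)^2>q^2,\ Z_1>-\nu_1,\ Z_2>-\nu_2\bigr)}{P(Z_1>-\nu_1,\ Z_2>-\nu_2)}
= \frac{P(Z_1>q-\nu_1,\ Z_2>-\nu_2)}{P(Z_1>-\nu_1,\ Z_2>-\nu_2)},
$$

since on the event {Z1 > -nu1} the condition (Z1+nu1)^2 > q^2 is equivalent to Z1 > q - nu1, and q > 0 ensures that Z1 > q - nu1 already implies Z1 > -nu1.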

I'm very grateful for any thoughts about this.
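
In case it helps, one way to sanity-check the claim numerically is to evaluate the ratio above on a grid of nu2 values using scipy's bivariate normal CDF. The values nu1 = 0.3 and q = sqrt(2) below are only illustrative assumptions, not part of the question.

    # Numerical sanity check (not a proof) of the monotonicity claim:
    # evaluate  P(Z1 > q - nu1, Z2 > -nu2) / P(Z1 > -nu1, Z2 > -nu2)
    # on a grid of nu2 values and test whether it is nondecreasing.
    import numpy as np
    from scipy.stats import multivariate_normal

    rho = -0.5  # Cov(Z1, Z2) from the question
    mvn = multivariate_normal(mean=[0.0, 0.0], cov=[[1.0, rho], [rho, 1.0]])

    def upper_tail(a, b):
        # P(Z1 > a, Z2 > b) = P(Z1 < -a, Z2 < -b), since (Z1, Z2) and
        # (-Z1, -Z2) have the same zero-mean bivariate normal distribution.
        return mvn.cdf([-a, -b])

    def ratio(nu1, nu2, q):
        return upper_tail(q - nu1, -nu2) / upper_tail(-nu1, -nu2)

    nu1, q = 0.3, np.sqrt(2.0)          # illustrative values only
    nu2_grid = np.linspace(0.0, 5.0, 51)
    vals = np.array([ratio(nu1, nu2, q) for nu2 in nu2_grid])

    # Prints True if the ratio is (numerically) nondecreasing on the grid.
    print(np.all(np.diff(vals) >= -1e-10))

Of course, a grid check like this only suggests the monotonicity; it doesn't replace a proof.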