Let x=(x_1,...,x_n)' and w=(w_1,...,w_n)' be two n-component random vectors, where w is independent of x and x has a standard normal distribution N(0,I_n). Why is the following equality true:
E{abs(w'x)} = E{||w||} E{|x_1|} ?
I believe that by (w'x) you mean the inner product of the two vectors, am I right?
Now, remark that if w and u are fixed (not random) vectors of the same norm, then by symmetry E{abs(w'x)} = E{abs(u'x)}. Taking u = ||w|| (1,0,...,0)', we get, for a fixed (not random) vector w, the formula E{abs(w'x)} = ||w|| E{|x_1|}.
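As a quick numerical illustration of this step (not part of the original argument), the following Python snippet samples x ~ N(0,I_n) and checks that two fixed vectors of equal norm give the same expectation, equal to ||w|| E{|x_1|} with E{|x_1|} = sqrt(2/pi); the particular vectors and sample size are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 5, 1_000_000                      # dimension and number of Monte Carlo samples
w = np.array([3.0, 0.0, 4.0, 0.0, 0.0])  # fixed vector with ||w||_2 = 5
u = np.array([0.0, 5.0, 0.0, 0.0, 0.0])  # another fixed vector with the same norm

x = rng.standard_normal((m, n))          # m independent samples of x ~ N(0, I_n)
print(np.abs(x @ w).mean())              # ~ 5 * sqrt(2/pi) ~ 3.989
print(np.abs(x @ u).mean())              # ~ the same value, by rotational symmetry
print(np.linalg.norm(w) * np.sqrt(2 / np.pi))  # exact value of ||w|| * E|x_1|
```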
Consider now the case of a random vector w. Since x and w are independent, we may regard them as defined on the product of two probability spaces T and S, with x depending only on t \in T and w depending only on s \in S.
By the Fubini theorem, E{abs(w'x)} (which is a double integral over T \times S) can be computed by integrating abs(w(s)'x(t)) first over T with s \in S fixed, and then integrating the result over S. But when s is fixed, w(s) is a fixed vector, and we know that \int_T abs(w(s)'x(t)) d\mu(t) = E{abs(w(s)'x)} = ||w(s)|| E{|x_1|}. Finally, the integration over S gives the desired formula E{abs(w'x)} = E{||w||} E{|x_1|}.
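A short Monte Carlo sketch of this Fubini step, now with a random w independent of x; the exponential law chosen for the coordinates of w below is arbitrary and only for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 5, 1_000_000
w = rng.exponential(size=(m, n))         # random w; any law works as long as w is independent of x
x = rng.standard_normal((m, n))          # x ~ N(0, I_n), drawn independently of w

lhs = np.abs(np.einsum('ij,ij->i', w, x)).mean()             # Monte Carlo estimate of E|w'x|
rhs = np.linalg.norm(w, axis=1).mean() * np.sqrt(2 / np.pi)  # E(||w||_2) * E|x_1|
print(lhs, rhs)                          # the two values should agree to a couple of decimals
```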
Why should E{abs(w'x)} = E{abs(u'x)} be true for fixed vectors of the same norm? Take x=(0, 1)', w=(0,1)', u=(1, 0)'. The LHS is 1 and the RHS is 0. What did I miss?
To my understanding, the answer from Vladimir Kadets works if we apply the following modifications:
Let us fix only w and still consider x to be the random vector having the standard normal distribution. Then E{abs(w'x)} = E{abs(u'y)}, where u = Nw and y = Nx for an orthogonal matrix N such that u = ||w|| (1,0,...,0)'. Notice (or look up) that y also has the standard normal distribution, i.e. the same distribution as x. Therefore we still get E{abs(w'x)} = ||w|| E{|y_1|} = ||w|| E{|x_1|}.
Since N is no longer present in the resulting formula, we can continue as before, examining the double integral.
(With these fixes: Nice straight argument, indeed.)
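For concreteness, one possible choice of N (not specified above) is a Householder reflection; the short Python sketch below, with an arbitrary example vector w, constructs it and checks that it is orthogonal and maps w to ||w|| (1,0,...,0)'.

```python
import numpy as np

w = np.array([1.0, 2.0, 2.0])                 # any fixed vector not already a multiple of e_1; ||w||_2 = 3
e1 = np.zeros_like(w)
e1[0] = 1.0
v = w - np.linalg.norm(w) * e1                # Householder direction
N = np.eye(len(w)) - 2.0 * np.outer(v, v) / (v @ v)   # reflection through the hyperplane orthogonal to v

print(N @ w)                                  # ~ [3, 0, 0] = ||w||_2 * (1, 0, 0)'
print(np.allclose(N.T @ N, np.eye(len(w))))   # True: N is orthogonal, so Nx ~ N(0, I_n) whenever x ~ N(0, I_n)
```

Since N'N = I, one also has (Nw)'(Nx) = w'x, which is the equality of expectations used above.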
Please read my answer carefully. I have written that w and u are fixed vectors. About x I did not say this! x is, of course, a random vector which has a standard normal distribution N(0,I_n). Your further explanation is correct; it is exactly what I meant when I wrote that E{abs(w'x)} = E{abs(u'x)} holds "by symmetry".
The validity of the equation depends on which norm is applied on the RHS.
For a counterexample for the \ell_1-norm, one may take n=2, w_1 = 1, w_2 = 1 (these are independent of anything:). Then, for y := w'x = x_1 + x_2, the desired expectation is equal to
E|y| = E |x_1 + x_2| = \sqrt{2} \cdot E(|x_1|)
whereas
E( || w ||_1 ) = 2, and therefore E(|y|) \ne E( || w ||_1 ) \cdot E(|x_1|).
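A small numerical confirmation of this counterexample (sample size chosen arbitrarily):

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.standard_normal((1_000_000, 2))            # samples of (x_1, x_2) ~ N(0, I_2)
E_abs_x1 = np.sqrt(2 / np.pi)                      # exact E|x_1| for a standard normal coordinate

print(np.abs(x[:, 0] + x[:, 1]).mean())            # Monte Carlo estimate of E|x_1 + x_2|
print(np.sqrt(2) * E_abs_x1)                       # exact value sqrt(2) * E|x_1|, about 1.128
print(2 * E_abs_x1)                                # E(||w||_1) * E|x_1| = 2 * E|x_1|, about 1.596, which differs
```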
Moreover, for the \ell_2-norm one can complete Vladimir's proof by observing that, for any deterministic coefficient vector w, the probability distributions of \sum_j w_j x_j and ||w||_2 \cdot x_1 are equal (due to the 2-stability of the standard normal distribution). Continuing with other stable p.d.s we can get, e.g., the following.
If w and x are independent, and if all the x_j are iid with a centered Cauchy p.d., then
E |w' x| = E (|| w||_1 ) \cdot E|x_1|
(since then the p.d.s of w'x and ||w||_1 \cdot x_1 are equal centered Cauchy distributions).
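Since the expectations involved here are infinite (see the remark below), a numerical illustration can only compare distributions; the sketch below compares empirical quantiles of w'x and ||w||_1 \cdot x_1 for an arbitrary choice of the law of w.

```python
import numpy as np

rng = np.random.default_rng(3)
n, m = 4, 1_000_000
w = rng.exponential(size=(m, n))                     # random w, independent of x (the law is arbitrary)
x = rng.standard_cauchy(size=(m, n))                 # iid centered (standard) Cauchy coordinates

s1 = np.einsum('ij,ij->i', w, x)                     # samples of w'x
s2 = np.abs(w).sum(axis=1) * rng.standard_cauchy(m)  # samples of ||w||_1 * x_1, with a fresh Cauchy x_1

qs = [0.1, 0.25, 0.5, 0.75, 0.9]
print(np.quantile(s1, qs))                           # the two quantile vectors should nearly coincide
print(np.quantile(s2, qs))
```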
Here are two IMPORTANT complementary remarks (possibly interesting to someone):
1. The Cauchy p.d. does not possess an expectation unless it is concentrated at a point, so the claim is better formulated as follows:
If w and x are independent, and if all the x_j are iid with a centered Cauchy p.d., then the p.d.s of w'x and ||w||_1 \cdot x_1 are equal centered Cauchy distributions; in particular E|w'x| = E(||w||_1) \cdot E|x_1| (note: E|x_1| then equals 0 or \infty, so the equality is of little use).
2. The extension to other stable p.d.s holds similarly, with the stability exponent p in the interval (1, 2]. The proposition reads as follows:
If w and x are independent, and if all the x_j are iid with a centered symmetric p-stable p.d., where 1 < p \le 2, then the p.d.s of w'x and ||w||_p \cdot x_1 coincide; in particular E|w'x| = E(||w||_p) \cdot E|x_1|, with E|x_1| finite since p > 1.
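A rough numerical sketch of this p-stable case (the value p = 1.5, the law of w, and the sample size are arbitrary choices), using scipy's levy_stable sampler and a two-sample Kolmogorov-Smirnov comparison of w'x with ||w||_p \cdot x_1:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
p, n, m = 1.5, 3, 200_000
w = rng.uniform(0.5, 2.0, size=(m, n))              # random w, independent of x (the law is arbitrary)
x = stats.levy_stable.rvs(p, 0.0, size=(m, n), random_state=rng)  # iid symmetric p-stable coordinates

s1 = np.einsum('ij,ij->i', w, x)                    # samples of w'x
wp = (np.abs(w) ** p).sum(axis=1) ** (1.0 / p)      # ||w||_p
s2 = wp * stats.levy_stable.rvs(p, 0.0, size=m, random_state=rng)  # samples of ||w||_p * x_1

print(stats.ks_2samp(s1, s2))                       # tiny KS statistic, consistent with equal distributions
```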