
Let f and g be real-valued measurable functions on [0, 1], and denote by λ the Lebesgue measure on [0, 1]. In probability theory, f and g are called independent (on [0, 1]) if λ(f ∈ I, g ∈ J) = λ(f ∈ I) · λ(g ∈ J) for all open intervals I, J ⊆ ℝ.
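Here λ(f ∈ I, g ∈ J) is shorthand for the measure of the joint preimage; spelled out (my paraphrase of the shorthand), the condition reads

$$\lambda\bigl(\{x\in[0,1]:\, f(x)\in I,\ g(x)\in J\}\bigr)\;=\;\lambda\bigl(\{x\in[0,1]:\, f(x)\in I\}\bigr)\cdot\lambda\bigl(\{x\in[0,1]:\, g(x)\in J\}\bigr).$$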

1. It is easy to see that, if f or g is constant, then f and g are independent.

2. I know that, if f or g is allowed to be discontinuous, then f and g may be independent even when both are non-constant (see the sketch after this list).

3. I know how to prove that f and g cannot be independent if they are non-constant, continuous, and of bounded variation.

4. I don't know, and would like to know: if one only requires f and g to be non-constant and continuous, does it follow that f and g cannot be independent?
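Regarding item 2: a standard non-constant, discontinuous independent pair is given by the first two binary-digit indicator functions (essentially Rademacher functions). The snippet below is only my own sanity check, not part of the question; the particular choice of f, g and of the test intervals is an illustrative assumption. Since (f, g) is constant on each quarter of [0, 1], the relevant measures can be computed exactly.

```python
from fractions import Fraction
from itertools import product

# Hypothetical illustrative pair (my choice, not from the question):
#   f(x) = first binary digit of x,  g(x) = second binary digit of x.
# (f, g) is constant on each quarter of [0, 1]:
#   [0, 1/4): (0, 0)   [1/4, 1/2): (0, 1)   [1/2, 3/4): (1, 0)   [3/4, 1): (1, 1)
# Each quarter has Lebesgue measure 1/4, so the measures below are exact.
QUARTERS = [((0, 0), Fraction(1, 4)), ((0, 1), Fraction(1, 4)),
            ((1, 0), Fraction(1, 4)), ((1, 1), Fraction(1, 4))]

def measure(pred):
    """Lebesgue measure of {x in [0, 1] : pred(f(x), g(x))}."""
    return sum((m for (fv, gv), m in QUARTERS if pred(fv, gv)), Fraction(0))

def in_open(v, iv):
    a, b = iv
    return a < v < b  # membership in the open interval (a, b)

# f and g take only the values 0 and 1, so an open interval I matters only
# through whether it contains 0 and/or 1; the intervals below cover all four cases.
intervals = [(-1, 2), (-0.5, 0.5), (0.5, 1.5), (2, 3)]

for I, J in product(intervals, repeat=2):
    joint = measure(lambda fv, gv: in_open(fv, I) and in_open(gv, J))
    mf = measure(lambda fv, gv: in_open(fv, I))
    mg = measure(lambda fv, gv: in_open(gv, J))
    assert joint == mf * mg, (I, J, joint, mf, mg)

print("λ(f∈I, g∈J) = λ(f∈I)·λ(g∈J) holds for every pair of test intervals")
```

Because f and g take only the values 0 and 1, the four interval types above are exhaustive, so this finite check amounts to a complete verification for this particular pair; it says nothing, of course, about the continuous case asked about in item 4.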
