What would be a good, convincing example that demonstrates the power of copulas by uncovering statistical dependencies that are not obvious?

I am especially interested in an example that contrasts the copula with a simple correlation coefficient computed on the original distributions.

Something like this: the (properly normalized) correlation coefficient between the components of a bivariate distribution does not suggest strong statistical dependence, but the copula of these two components reveals a clear dependence (possibly reflected in a correlation coefficient computed on the copula scale). Or the opposite: the correlation coefficient of the original bivariate distribution suggests strong dependence, but its copula shows that the statistical dependence is weak, or simply absent.
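
To make the first scenario concrete, I suspect something like the following comonotone-lognormal construction may be close to what I am after (a guess on my part, and perhaps not the most convincing illustration): take $Z \sim N(0,1)$ and set $X = e^{Z}$, $Y = e^{\sigma Z}$ with large $\sigma$. Since $Y$ is an increasing function of $X$, their copula is the comonotonicity copula $M(u,v) = \min(u,v)$ and Spearman's rho equals $1$, yet

$$\operatorname{corr}(X, Y) = \frac{e^{\sigma} - 1}{\sqrt{(e - 1)\left(e^{\sigma^{2}} - 1\right)}} \;\longrightarrow\; 0 \quad (\sigma \to \infty).$$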

I am mostly interested in an example described in terms of formulae (so that samples can be generated, e.g. in MATLAB), but a pointer to a specific pre-generated bivariate dataset (or its plots) would work too.
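
In case it helps clarify what I mean, here is a rough MATLAB sketch of how I imagine generating and inspecting such a sample (untested; it assumes the Statistics Toolbox functions `corr` and `tiedrank`, and the choices `sigma = 3`, `n = 1e5` are arbitrary):

    % Comonotone lognormal pair: perfect dependence, weak Pearson correlation.
    n     = 1e5;
    sigma = 3;                        % arbitrary; larger sigma -> smaller correlation
    z     = randn(n, 1);
    x     = exp(z);                   % Lognormal(0, 1)
    y     = exp(sigma * z);           % Lognormal(0, sigma^2), increasing function of x

    % Pearson correlation on the original scale looks weak ...
    rho_pearson = corr(x, y)          % around 0.16 in the limit for sigma = 3 (sample value is noisy)

    % ... but on the copula (rank / probability-integral-transform) scale the
    % dependence is perfect: the empirical copula sits on the diagonal.
    u = tiedrank(x) / (n + 1);        % pseudo-observations of the copula
    v = tiedrank(y) / (n + 1);
    rho_spearman = corr(u, v)         % equals 1 (comonotonicity copula min(u,v))

    scatter(u, v, 2), xlabel('u'), ylabel('v')   % copula scatter plot

Is this the right kind of construction, or is there a stronger / more standard example?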

Thank you!
