Given samples X1,...,Xn that follow a distribution F with parameters p1,...,pm, one may use the maximum likelihood method to estimate p1,...,pm. Technically, the method is just an optimization problem over an m-dimensional parameter space.
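To make the "optimization problem" concrete, here is a minimal sketch (assuming Python with numpy/scipy and, for illustration, a normal model F = N(\mu, \sigma), so m = 2): the MLE is obtained by minimizing the negative log-likelihood over the 2-dimensional parameter space.

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import norm

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=200)   # toy sample X1,...,Xn

def neg_log_lik(theta):
    """Negative log-likelihood of N(mu, sigma); theta = (mu, log_sigma)."""
    mu, log_sigma = theta
    sigma = np.exp(log_sigma)                  # log-parameterization keeps sigma > 0
    return -np.sum(norm.logpdf(x, loc=mu, scale=sigma))

# Joint MLE: a single optimization over the m = 2 dimensional parameter space
res = minimize(neg_log_lik, x0=np.array([0.0, 0.0]))
mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
print(mu_hat, sigma_hat)   # close to the sample mean and the (1/n) sample std
```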

In some cases, it is troubling that the estimated values of p1,...,pm can differ from the value obtained when only p1 is estimated (assuming this parameter does not depend on p2,...,pm). For example, the joint estimate of (\mu, \sigma) may yield a \mu that differs from the \mu estimated alone. Since \mu is reserved for the expected value, which one is "correct": the joint-\mu or the single-\mu? (A numerical sketch of this comparison is given at the end of the question.)

1) If the joint one is correct, then any single-parameter estimator should be doubted.

2) If the single one is correct, then the joint model has no value, and one should always choose the univariate model.

In some cases, the joint-\mu does not make any sense: the interpretation of \mu as an expected value seems no longer valid in a joint model.
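To illustrate the dichotomy above with a concrete, if different, model (a sketch under assumed choices, not the original \mu/\sigma example): in a Gamma(shape, rate) model, the rate estimated jointly with the shape generally differs from the rate estimated alone with the shape fixed, e.g. at 1 (the exponential sub-model).

```python
import numpy as np
from scipy.optimize import minimize
from scipy.stats import gamma

rng = np.random.default_rng(1)
x = rng.gamma(shape=3.0, scale=1.0 / 2.0, size=300)   # toy data, true rate = 2

def neg_log_lik(theta):
    """Negative log-likelihood of Gamma(shape, rate); theta = (log shape, log rate)."""
    k, lam = np.exp(theta)                             # positivity via log-parameters
    return -np.sum(gamma.logpdf(x, a=k, scale=1.0 / lam))

# Joint MLE over (shape, rate)
res_joint = minimize(neg_log_lik, x0=np.zeros(2))
k_joint, rate_joint = np.exp(res_joint.x)

# "Single" MLE of the rate with the shape fixed at 1 (exponential sub-model);
# in closed form this is 1 / mean(x)
rate_single = 1.0 / np.mean(x)

print(rate_joint, rate_single)   # generally different unless the fixed shape
                                 # equals the jointly estimated one
```

Here the joint fit gives rate = k_hat / mean(x), while fixing the shape at 1 gives 1 / mean(x), so the two coincide only when the fixed shape matches the jointly estimated one. (In a normal model, by contrast, \mu is estimated by the sample mean whether or not \sigma is also estimated, so the discrepancy described in the question is model-dependent.)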
