There is always a lower bound on the variance of an unbiased estimator, the Cramér-Rao lower bound, which is tight in the case of Gaussian random vectors. Does anyone know of any upper bound for the minimum variance unbiased estimator?
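For concreteness, the bound I mean: for an unbiased estimator $\hat{\theta}$ of $\theta$, the Cramér-Rao inequality states

$$\mathrm{Var}(\hat{\theta}) \geq \frac{1}{I(\theta)},$$

where $I(\theta)$ is the Fisher information of the data. I am looking for something in the other direction.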
If our estimator $\hat{\theta}$ is unbiased, we may bound the mean squared error by its variance (because the bias term vanishes), and we may bound the variance by the second moment of the estimator (because $\mathrm{Var}(\hat{\theta}) = E(\hat{\theta}^2) - E(\hat{\theta})^2 \leq E(\hat{\theta}^2)$). Maybe that bound is too crude...
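Spelled out, the decomposition behind this is

$$E\big[(\hat{\theta}-\theta)^2\big] = \mathrm{Var}(\hat{\theta}) + \big(E(\hat{\theta}) - \theta\big)^2 \leq E(\hat{\theta}^2),$$

where the bias term $\big(E(\hat{\theta}) - \theta\big)^2$ is zero by unbiasedness.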
Suppose we observe $y = x + v$, where $v \sim \mathrm{Unif}(a, b)$, i.e. uniform on the interval $[a, b]$. Then $y$ lies in the interval $[a + x, b + x]$, so for a frequentist estimator the mean error is at most $(b - a)$. (This is not the MSE, of course, but the idea should be clear; see the sketch below.)
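A minimal simulation sketch of this example, assuming the additive model $y = x + v$ above and a hypothetical unbiased estimator $\hat{x} = y - (a+b)/2$ of my own choosing:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed model: y = x + v with v ~ Unif(a, b); x is the unknown.
a, b, x = 1.0, 3.0, 5.0
v = rng.uniform(a, b, size=100_000)
y = x + v

# Centering y by the noise mean (a + b) / 2 gives an unbiased estimator of x.
x_hat = y - (a + b) / 2

err = x_hat - x
print("max |error|:", np.abs(err).max())  # never exceeds (b - a) / 2
print("mean error: ", err.mean())         # ~ 0, since the estimator is unbiased
print("(b - a):    ", b - a)              # the crude bound from the argument above
```

For this particular estimator the error is in fact bounded by $(b-a)/2$, which is consistent with (and tighter than) the $(b-a)$ bound.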
But these kinds of bounds are usually very loose and hence not practical.
The upper bound that I need should be tight to the variance of the MVUE (minimum variance unbiased estimator) error, not just a loose upper bound for an arbitrary estimator.
The upper bound is infinite. Take any unbiased estimator. Now add to it the result of a single independent draw from the $N(0, \sigma^2)$ distribution. The estimator is still unbiased, and the variance is increased by $\sigma^2$. Now let $\sigma$ become arbitrarily large.
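A quick sketch of this construction, assuming (my choice, for illustration) i.i.d. $N(\theta, 1)$ data and the sample mean as the starting unbiased estimator:

```python
import numpy as np

rng = np.random.default_rng(0)

theta, n, reps = 2.0, 25, 50_000

# Base unbiased estimator: the sample mean of n i.i.d. N(theta, 1) draws.
samples = rng.normal(theta, 1.0, size=(reps, n))
base = samples.mean(axis=1)  # variance ~ 1/n = 0.04

for sigma in (1.0, 10.0, 100.0):
    # Add one independent N(0, sigma^2) draw: the estimator stays unbiased,
    # but its variance grows by sigma^2 -- so there is no finite upper bound.
    padded = base + rng.normal(0.0, sigma, size=reps)
    print(f"sigma={sigma:>6}: mean={padded.mean():.3f}, var={padded.var():.1f}")
```

The printed means stay near $\theta = 2$ while the variance tracks $1/n + \sigma^2$, exactly the point of the argument.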
There may be many upper bounds for an unbiased estimator; which one applies, and how useful it is, depends on the case, and you may construct one and use it accordingly. The lower bound, in contrast, ties into the criteria for an estimator to be a good one (unbiasedness, consistency, efficiency, and sufficiency).