When would you naturally expect OLS regression to be completely appropriate?

As the size of the dependent variable y grows (driven, say, by a size measure among the predictors), one would usually expect y to have larger error variance, and thus larger standard errors. To illustrate: if one estimates, or here 'predicts', 1,000,000 +/- 20,000, that may be reasonable. But would you then also expect to predict 100,000 +/- 20,000? What about 1,000 +/- 20,000? One would expect heteroscedasticity in the error structure. So, when would one ever expect a naturally homoscedastic relationship? What application comes to mind, starting with simple linear regression?
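To make the pattern concrete, here is a minimal sketch (Python; numpy and statsmodels are assumed available, and all numbers are illustrative): the error standard deviation is made proportional to the size measure, OLS is fit, and the residual spread at small vs. large x is compared.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
x = rng.uniform(1_000, 1_000_000, size=500)   # size measure
sigma = 0.02 * x                              # error SD proportional to size
y = 1.1 * x + rng.normal(0.0, sigma)          # heteroscedastic errors

fit = sm.OLS(y, sm.add_constant(x)).fit()
resid = fit.resid
small = np.abs(resid[x < 200_000]).mean()
large = np.abs(resid[x > 800_000]).mean()
print(f"mean |residual|, small x: {small:,.0f}; large x: {large:,.0f}")
# The residual spread near x = 1,000,000 is far larger than near x = 1,000,
# which is exactly the +/- pattern described above.
```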

I don't mean a case where you 'test' to see whether heteroscedasticity is apparently small. Such tests are not even very meaningful without a power analysis or other sensitivity analysis, and since there are methods for actually estimating heteroscedasticity, they are not really useful: you can instead estimate a coefficient of heteroscedasticity directly, or use a robust default value for that coefficient.
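As a sketch of what I mean by estimating such a coefficient (my own illustrative setup, assuming the working model sd(e_i) = sigma * x_i**gamma, where gamma = 0 is homoscedastic and gamma = 1 means the error SD is proportional to size): gamma can be estimated by regressing log |OLS residual| on log x, then used to form WLS weights.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
x = rng.uniform(1_000, 1_000_000, size=500)
y = 1.1 * x + rng.normal(0.0, 0.02 * x)       # true gamma = 1 here

# Estimate gamma from the OLS residuals: slope of log|resid| on log(x).
ols = sm.OLS(y, sm.add_constant(x)).fit()
gamma = sm.OLS(np.log(np.abs(ols.resid)),
               sm.add_constant(np.log(x))).fit().params[1]
print(f"estimated coefficient of heteroscedasticity: {gamma:.2f}")

# Refit with variance-stabilizing weights 1 / x**(2*gamma).
wls = sm.WLS(y, sm.add_constant(x), weights=1.0 / x**(2 * gamma)).fit()
print(wls.params)
```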

And I don't mean an application where you transform the data in the hope of reducing heteroscedasticity.
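For clarity on what that excludes, a small sketch (again my own illustrative numbers): with multiplicative errors, y = b * x * exp(e), taking logs gives log y = log b + log x + e, so the error variance is roughly constant on the log scale even though it grows with x on the raw scale.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.uniform(1_000, 1_000_000, size=500)
y = 1.1 * x * np.exp(rng.normal(0.0, 0.02, size=500))  # multiplicative error

resid_raw = y - 1.1 * x                  # spread grows with x on the raw scale
resid_log = np.log(y) - np.log(1.1 * x)  # spread is flat on the log scale
for lo, hi in [(1_000, 200_000), (800_000, 1_000_000)]:
    m = (x > lo) & (x < hi)
    print(f"x in ({lo:,}, {hi:,}): raw SD {resid_raw[m].std():,.0f}, "
          f"log SD {resid_log[m].std():.4f}")
```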

-------------------------------

Thus the question is: under what circumstances would you have a naturally occurring homoscedastic regression application?

Thank you.  
