In time series modeling and volatility estimation, it is common practice to first remove the autocorrelation of the series (e.g. with an AR/ARMA mean model) and then fit a volatility model such as GARCH to the residuals.
The autocorrelation structure is usually identified from the ACF/PACF, but in some situations (small samples, noisy data, ...) this procedure can select the wrong lag order.
For example, suppose the true model is AR(3)-GARCH(1,1) but we fit AR(1)-GARCH(1,1). Are the GARCH parameter estimates biased in this situation?
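To make the question concrete, here is a small simulation sketch (all parameter values are my own illustrative choices, not from any reference): it generates an AR(3) series with GARCH(1,1) errors, then fits the mean by OLS with the wrong order (AR(1)) and the right order (AR(3)). The underfitted residuals still carry autocorrelation at lag 3, and those are exactly the residuals the GARCH stage would be fit to.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 2000

# Illustrative "true" model: AR(3) mean with GARCH(1,1) errors
phi = np.array([0.3, 0.1, 0.4])        # AR coefficients (sum < 1, stationary)
omega, alpha, beta = 0.1, 0.1, 0.8     # GARCH(1,1) parameters

# Simulate GARCH(1,1) innovations
eps = np.zeros(n)
sigma2 = np.full(n, omega / (1 - alpha - beta))  # start at unconditional variance
for t in range(1, n):
    sigma2[t] = omega + alpha * eps[t - 1] ** 2 + beta * sigma2[t - 1]
    eps[t] = np.sqrt(sigma2[t]) * rng.standard_normal()

# Build the AR(3) series on top of the GARCH innovations
y = np.zeros(n)
for t in range(3, n):
    y[t] = phi @ y[t - 3:t][::-1] + eps[t]   # phi1*y[t-1] + phi2*y[t-2] + phi3*y[t-3]

def ar_resid(y, p):
    """Fit an AR(p) mean by OLS and return the residuals."""
    X = np.column_stack([y[p - k:len(y) - k] for k in range(1, p + 1)])
    coef, *_ = np.linalg.lstsq(X, y[p:], rcond=None)
    return y[p:] - X @ coef

def acf(x, lag):
    """Sample autocorrelation of x at the given lag."""
    x = x - x.mean()
    return np.dot(x[:-lag], x[lag:]) / np.dot(x, x)

r1 = ar_resid(y, 1)   # misspecified mean: AR(1)
r3 = ar_resid(y, 3)   # correctly specified mean: AR(3)

# Leftover lag-3 autocorrelation: large under AR(1), near zero under AR(3)
print("AR(1) residual ACF(3):", acf(r1, 3))
print("AR(3) residual ACF(3):", acf(r3, 3))
```

The leftover autocorrelation in the AR(1) residuals is then absorbed by the GARCH filter, so the volatility parameters are estimated from the wrong residual series; this is the mechanism behind the bias the question asks about.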
Thanks in advance.