Thank you, dear Terans. The attached link was useful for me and I read it, but it didn't completely answer my question. I can't interpret this Table yet...!!! I need to know which AR model and equation I should use for this data set...?
if you listen to the "automatic pilot", and if your model space is restricted to AR(p) processes, then the (not too) slow decay of the ACF and the fast cut-off of the PACF beyond lag 2 indeed suggest giving an AR(2) a try
.
however, the strong negative value at lag 1 in the ACF may be a little suspicious
maybe you could also try to integrate your series, recalculate the ACF and PACF of the integrated series and see what happens
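in case a concrete starting point helps, here is a minimal sketch of that step in Python with numpy and statsmodels (the series z below is only a placeholder for your own data) :

```python
import numpy as np
import matplotlib.pyplot as plt
from statsmodels.graphics.tsaplots import plot_acf, plot_pacf

# placeholder data: replace with your own observed series
z = np.random.default_rng(0).standard_normal(200)

y = np.cumsum(z)   # integrated series y(t) = z(0) + z(1) + ... + z(t)

fig, axes = plt.subplots(2, 1, figsize=(6, 6))
plot_acf(y, lags=20, ax=axes[0])    # ACF of the integrated series
plot_pacf(y, lags=20, ax=axes[1])   # PACF of the integrated series
plt.show()
```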
the AR(2) model then reads y(t) = a1*y(t-1) + a2*y(t-2) + e(t), with e(t) iid, zero mean and finite constant variance
.
you can obtain the coefficients a1 and a2 from a regression of y(t) on y(t-1) and y(t-2)
(there are many other ways)
and you should check that
1- the polynomial 1-a1*x-a2*x^2 has no root on the unit circle (or close to it)
2- that the residuals show dynamics consistent with the specification of e(t) (a rough sketch of these steps follows below)
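not the only way of course, but here is one way to do those three steps in Python with numpy and statsmodels (the series y below is a placeholder) :

```python
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

# placeholder data: replace with the (possibly demeaned) series you want to model
y = np.random.default_rng(0).standard_normal(300)

# regression of y(t) on y(t-1) and y(t-2) to get a1 and a2 (ordinary least squares, no intercept)
Y = y[2:]
X = np.column_stack([y[1:-1], y[:-2]])
(a1, a2), *_ = np.linalg.lstsq(X, Y, rcond=None)
print("a1 =", a1, "a2 =", a2)

# check 1: the roots of 1 - a1*x - a2*x^2 should lie outside (and not close to) the unit circle
roots = np.roots([-a2, -a1, 1.0])          # numpy wants the highest-degree coefficient first
print("roots:", roots, "| smallest modulus:", np.min(np.abs(roots)))

# check 2: the residuals should look like zero-mean, constant-variance, uncorrelated noise
resid = Y - X @ np.array([a1, a2])
print("residual mean:", resid.mean(), "residual std:", resid.std())
print(acorr_ljungbox(resid, lags=[10]))    # a small p-value would flag remaining correlation
```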
.
i do not understand what you mean by "damping" ; obviously your autocorrelation decays very fast with the lag and your partial autocorrelation shows that only a small past window (lags 1 and 2) contributes to the dynamics
this is consistent with AR(2), except for the strong negative autocorrelation at lag 1 which may suggest an over-differencing (therefore the idea of also studying the integrated series, that is z(t) = y(t) + y(t-1) + ... + y(0))
from my understanding, Prob = 0 means that the p-value of the test is (numerically) 0 (in this case, it seems to be the Ljung-Box test, because of the Q-stat statistic), which implies that the test hypothesis is rejected at any reasonable significance level
for the Ljung-Box test, the null hypothesis is that the data are random (no autocorrelation up to the lag considered)
therefore, the conclusion seems to be the opposite : the Ljung-Box test rejects the randomness hypothesis at any reasonable significance level
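to illustrate what a "Prob = 0" corresponds to, a small Python sketch of the Ljung-Box test with statsmodels (the two series below are just toy examples, not your data) :

```python
import numpy as np
from statsmodels.stats.diagnostic import acorr_ljungbox

rng = np.random.default_rng(1)
white = rng.standard_normal(500)     # an iid (random) series
walk = np.cumsum(white)              # a strongly autocorrelated series

# recent statsmodels returns a DataFrame with the Q-statistic (lb_stat) and the p-value (lb_pvalue)
print(acorr_ljungbox(white, lags=[10]))   # large p-value : randomness not rejected
print(acorr_ljungbox(walk, lags=[10]))    # p-value numerically 0 : randomness rejected ("Prob = 0")
```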
Please see, for example: http://website-quran.stt-mandala.web.id/IT/2521-2408/Correlogram_18103_website-quran-stt-mandala.html
If cross-correlation is used, the result is called a cross-correlogram. The correlogram is a commonly used tool for checking randomness in a data set. This randomness is ascertained by computing autocorrelations for data values at varying time lags. If random, such autocorrelations should be near zero for any and all time-lag separations. If non-random, then one or more of the autocorrelations will be significantly non-zero.
i am not familiar with Eviews and i do not even know how to interpret an ARMA(0,3)(0,0)
(i guess i should read ARIMA but i do not know which parameter is the AR order, the MA order, the integration order and the seasonality lag)
.
anyway ; consider a simple AR(2) process
y(t) = a1*y(t-1) + a2*y(t-2) + e(t)
with e(t) iid, zero mean, with variance s²
this process will have the PACF you observe (a sharp drop after lag 2) but will have an exponentially decaying autocorrelation, which you do not observe (you have a sharp negative autocorrelation at lag 1, then nothing)
now consider the differenced process z(t) = y(t) - y(t-1), with y AR(2) as above
subtracting the lagged equation gives z(t) = a1*z(t-1) + a2*z(t-2) + e(t) - e(t-1) ; that is, z looks like an AR(2) process but with correlated (MA(1)) noise
you still have a sharp drop in the PACF after lag 2
the autocorrelation will likely be dominated by the noise autocorrelation, which is sharply negative at lag 1 (with iid zero mean increments, if you go up at t, you are likely to go down at t+1) ; this is what you also observe
so, my first guess would be to say that the main component of your process is likely to be a differenced AR(2) process
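if it helps to see this signature, here is a small simulation sketch in Python with statsmodels (the AR(2) coefficients 0.5 and 0.3 are arbitrary, not estimates of your data) :

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess
from statsmodels.tsa.stattools import acf, pacf

a1, a2 = 0.5, 0.3                                   # arbitrary coefficients of a stationary AR(2)
ar_poly = np.array([1.0, -a1, -a2])                 # statsmodels convention: 1 - a1*L - a2*L^2
y = ArmaProcess(ar_poly, np.array([1.0])).generate_sample(nsample=5000, burnin=500)
z = np.diff(y)                                      # the differenced process

print("ACF  of y, lags 1-4:", acf(y, nlags=4)[1:])  # smooth, exponential-like decay
print("ACF  of z, lags 1-4:", acf(z, nlags=4)[1:])  # clearly negative value at lag 1, then little
print("PACF of z, lags 1-4:", pacf(z, nlags=4)[1:]) # compare with your correlogram
```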
.
one easy way to check that is to take your empirical process values z(t) and compute the integrated process y(t) = z(0) + z(1) + ... + z(t)
from the above, it should look like a simple AR(2) process (sharp drop in PACF after lag 2 and slowly decaying ACF)
fit the AR(2) to y(t) and analyze the residuals
if you find the residuals are not too far from a zero mean, constant variance, non-correlated noise, then you're done and your process z(t) is simply estimated as the first difference of the AR(2) process y(t) you have fitted
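as a rough Python translation of this check, using statsmodels' AutoReg for the AR(2) fit (z below is only a placeholder for your observed series) :

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg
from statsmodels.stats.diagnostic import acorr_ljungbox

# z: your observed series ; a random placeholder is used here so the sketch runs
z = np.random.default_rng(2).standard_normal(400)

y = np.cumsum(z)                             # integrated process y(t) = z(0) + z(1) + ... + z(t)

ar2 = AutoReg(y, lags=2, trend="c").fit()    # fit an AR(2) (with a constant) to the integrated series
print(ar2.params)

resid = ar2.resid
print("residual mean:", resid.mean(), "residual std:", resid.std())
print(acorr_ljungbox(resid, lags=[10]))      # a large p-value : residuals close to uncorrelated noise
```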
You are right... I think the ARMA model is (0,3)(0,0) because of your statement: "Also, if the order of AR or MA is zero, it is not ARMA. It is AR (if MA order = 0) or MA (if AR order = 0)..." and "Choice of AR or MA or ARMA (ARIMA)". My model is MA(3).
According to your statement: "you still have a sharp drop in the PACF after lag 2, and the process is likely to be a differenced AR(2) process", I clearly observe this point in the Table... but I think now I should choose one of MA(2) or MA(3); I can test both too...
Moreover, I don't think that having a sharp negative autocorrelation at lag 1 and then nothing causes a problem in the correlation, because it drops to zero in the end...
just to make things clear : my suggestion is that you try a differenced AR(2)
i do not see any strong sign for an AR(3) : the PACF drops after lag 2 and not lag 3 ; the ACF does not really show a slow decline with the lag ; anyway, there's no harm in trying !
same thing for MA(2) or MA(3) : the PACF drops much too sharply after lag 2 for a moving average process
.
indeed your ACF has a peculiar structure : i agree with you that it drops to zero at large lags ; this is an indication of a stationary process and nothing more ; now i do stress that a very significant negative correlation at lag 1 is indeed unusual and is often an indication of a differenced process
.
you could try both AR(3) and a differenced AR(2) ; they are not very different in terms of complexity so you should not bother with regularization terms such as AIC or BIC
(moreover, since the models are in different classes, not "AR(2) versus AR(3)" but "differenced AR(2) versus AR(3)", i am not even sure such a regularization approach would make sense from a theoretical point of view)
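purely as an illustration of "trying both", a sketch in Python (the differenced-AR(2) candidate is handled by integrating z first, as described above ; z is again only a placeholder for your observed series) :

```python
import numpy as np
from statsmodels.tsa.ar_model import AutoReg
from statsmodels.stats.diagnostic import acorr_ljungbox

# z: your observed series ; a random placeholder is used here so the sketch runs
z = np.random.default_rng(3).standard_normal(400)

ar3 = AutoReg(z, lags=3, trend="c").fit()                 # candidate 1: AR(3) on z itself
ar2_int = AutoReg(np.cumsum(z), lags=2, trend="c").fit()  # candidate 2: AR(2) on the integrated series

# judge each candidate mainly on how close its residuals are to uncorrelated noise
print("AR(3) on z:")
print(acorr_ljungbox(ar3.resid, lags=[10]))
print("AR(2) on the integrated series:")
print(acorr_ljungbox(ar2_int.resid, lags=[10]))
```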