Suppose we have a time series Y_t = T_t + S_t + e_t, where as usual T_t is the trend and S_t is the seasonal component. Now suppose we apply a moving average filter to detrend the series and then focus on the seasonal component. The goal of the analysis is to detect possible nonlinearity in the temporal evolution of the seasonal pattern.

I think an easy way to do this may be to regress the detrended series on a set of month dummies and their interactions with a polynomial expansion of time. We can then test the initial degree of seasonality (say, at time t = 0) with the usual significance test on the monthly dummies, and at the same time test for a possible nonlinear temporal evolution of the seasonal pattern by testing the joint significance of the interaction terms. See, for instance, the approach of Seiver (1985) and Lam and Miron (1996) to modeling human birth seasonality. What is your opinion?