Suppose we are going to analyze a large time series. How many samples do we need to achieve acceptable results? Is there a formula for determining the sample size before analyzing the time series?
This is a very interesting question. To get the minimum number of samples in a given time interval, you have to determine the frequency content of the series by computing its FFT. Then you can identify the highest baseband frequency present in the series. The minimum sampling frequency must be at least twice this highest frequency (the Nyquist rate). You can then decimate your series down to the new sampling rate using Matlab's decimation tools.
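A minimal sketch of this approach in Python rather than Matlab (scipy.signal.decimate plays the role of Matlab's decimate). The toy signal, its 1 kHz sampling rate, and the 99%-energy cutoff used to pick the "highest significant frequency" are illustrative assumptions, not part of the answer above:

```python
import numpy as np
from scipy.signal import decimate

fs = 1000.0                                 # assumed original sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.sin(2 * np.pi * 40 * t)  # toy series

# Inspect the spectrum to find the highest significant frequency component.
spectrum = np.abs(np.fft.rfft(x)) ** 2
freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
cum_energy = np.cumsum(spectrum) / np.sum(spectrum)
f_max = freqs[np.searchsorted(cum_energy, 0.99)]  # covers 99% of spectral energy

# Nyquist: sample at least twice the highest frequency present.
fs_min = 2 * f_max
factor = int(fs // fs_min)                  # integer decimation factor
if factor > 1:
    x_dec = decimate(x, factor)             # anti-alias filter + downsample
    print(f"highest significant frequency ~{f_max:.1f} Hz, "
          f"decimated by {factor} to {fs / factor:.1f} Hz "
          f"({len(x_dec)} samples)")
```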
It depends on the type of statistical model being used and on the amount of random variation in the data. Box and Jenkins recommended a minimum of 50 observations for an ARIMA model. Hanke and Wichern (Business Forecasting) recommend a minimum of 2s to 6s observations, depending on the method, where s is the seasonal period. You should read the article whose link is given below.
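As a small illustration of these rules of thumb (the monthly seasonal period s = 12 is an assumed example, not from the answer above):

```python
def min_observations(s=None):
    """Rough minimum sample size: >= 50 for a non-seasonal ARIMA model
    (Box & Jenkins), or 2*s to 6*s observations when a seasonal period
    s is present (Hanke & Wichern)."""
    if s is None:
        return 50
    return (2 * s, 6 * s)

print(min_observations())       # non-seasonal ARIMA -> 50
print(min_observations(s=12))   # monthly seasonality -> (24, 72)
```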
If the data are recorded every second, then why are you using only the observations taken every 4 seconds?
As mentioned in a previous answer, if you want to fit an ARIMA model, then a minimum of 50 observations is sufficient for analysis. If theory indicates seasonality in your data set, then you should read about seasonal models and take a minimum of 2s to 6s observations.
It all depends on how you define time! If you are doing signal processing, window the signal; if it is a statistical problem, take a finite set of observations and annotate it.
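For the signal-processing case, "window it" usually means tapering each finite segment before spectral analysis. A minimal sketch, where the Hann window and the toy sinusoid are illustrative assumptions:

```python
import numpy as np

fs = 100.0
t = np.arange(0, 1.0, 1.0 / fs)
x = np.sin(2 * np.pi * 12.3 * t)        # frequency not bin-aligned -> leakage

window = np.hanning(len(x))             # taper to reduce spectral leakage
spectrum = np.abs(np.fft.rfft(x * window))
freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
print(f"peak near {freqs[np.argmax(spectrum)]:.1f} Hz")
```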