I am researching trading strategies, particularly time-series analysis but also machine learning, on data sets that reach back as far as 1950, and I would like to compare the accuracy of various models, up to roughly 25 at a time.
My R for loops fitting ARIMA + GARCH models to S&P 500 data from 1950 to today take about 10 hours to evaluate all models with p and q each ranging from 0 to 5 (36 combinations). A sketch of the kind of loop I mean is below.
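This is a minimal sketch only, assuming the rugarch package with a fixed GARCH(1,1) variance model; `returns` stands in for my S&P 500 return series and the variable names are illustrative:

```r
library(rugarch)  # ARMA + GARCH fitting

# `returns` stands in for the series of S&P 500 log returns.
max_order <- 5
results <- data.frame()

for (p in 0:max_order) {
  for (q in 0:max_order) {
    spec <- ugarchspec(
      variance.model = list(model = "sGARCH", garchOrder = c(1, 1)),
      mean.model     = list(armaOrder = c(p, q), include.mean = TRUE)
    )
    # Some (p, q) combinations fail to converge, so guard the fit.
    fit <- tryCatch(ugarchfit(spec, returns, solver = "hybrid"),
                    error = function(e) NULL)
    if (!is.null(fit)) {
      results <- rbind(results,
                       data.frame(p = p, q = q, aic = infocriteria(fit)[1]))
    }
  }
}

results[order(results$aic), ]  # best model by AIC first
```

Each iteration is a full maximum-likelihood fit on nearly 70 years of daily data, run strictly one after another, which is where the 10 hours go.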
My specs are an Intel Core i5-8600K (6 cores), 16 GB of DDR4 RAM, and an AMD Radeon R9 Fury, but I am not using any parallel processing. Is parallelizing the loop across cores the right way to bring the runtime down?
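For reference, this is the kind of parallel rewrite I have in mind. It is a sketch only, assuming the foreach and doParallel packages on top of the same illustrative `returns` series as above; I have not run it:

```r
library(foreach)
library(doParallel)
library(rugarch)

grid <- expand.grid(p = 0:5, q = 0:5)  # 36 candidate (p, q) models

cl <- makeCluster(parallel::detectCores() - 1)  # leave one core free
registerDoParallel(cl)

results <- foreach(i = seq_len(nrow(grid)),
                   .combine  = rbind,
                   .packages = "rugarch") %dopar% {
  spec <- ugarchspec(
    variance.model = list(model = "sGARCH", garchOrder = c(1, 1)),
    mean.model     = list(armaOrder = c(grid$p[i], grid$q[i]),
                          include.mean = TRUE)
  )
  # Return one result row per worker task, or NULL on a failed fit.
  fit <- tryCatch(ugarchfit(spec, returns, solver = "hybrid"),
                  error = function(e) NULL)
  if (is.null(fit)) NULL else
    data.frame(p = grid$p[i], q = grid$q[i], aic = infocriteria(fit)[1])
}

stopCluster(cl)
results[order(results$aic), ]
```

Since each (p, q) fit is independent, the work should split cleanly across workers, so on the 6-core i5-8600K I would expect something close to a 5x reduction in wall-clock time.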
Thank you.