Regression may be useful for understanding existing data, but not on its own for forecasting future data. In this answer, I seek to avoid a taxonomy of techniques that can be found classified in elementary textbooks on time series analysis. It is nonetheless worth noting that one way to dichotomize time-series techniques is by domain: time domain versus frequency domain. Another is by the type of model and data best suited to the problem, according to whether the data are stationary (the mean and variance are time-invariant), nonstationary, or a mixture of both.
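To make the stationarity distinction concrete, here is a minimal sketch (the random-walk example is an illustrative assumption, not from the answer): a random walk is nonstationary because its variance grows with time, but first-differencing it recovers a stationary series.

```python
import numpy as np

rng = np.random.default_rng(0)
noise = rng.normal(size=1_000)        # stationary: fixed mean and variance
random_walk = np.cumsum(noise)        # nonstationary: variance grows over time
differenced = np.diff(random_walk)    # differencing restores stationarity

# Compare the variance of the first and second halves of each series
for name, x in [("noise", noise), ("random walk", random_walk), ("differenced", differenced)]:
    half = len(x) // 2
    print(f"{name:12s} var(1st half) = {x[:half].var():8.2f}   var(2nd half) = {x[half:].var():8.2f}")
```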
Moving average models are stationary models, like those popular in stock market analysis, where time invariance of the distribution parameters is assumed; the same holds for autoregressive models. Where the data exhibit cyclicity one enters the frequency domain, a kind of analysis pioneered by Fourier when he showed that such functions can be decomposed into a trigonometric series. A basic factor in choosing a technique is whether the data exhibit autocorrelation, that is, whether past values are useful in predicting present values.
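As a minimal sketch of that autocorrelation check, here is a synthetic AR(1) series whose sample autocorrelation decays with lag (the process and coefficient are illustrative assumptions, not part of the original answer):

```python
import numpy as np

rng = np.random.default_rng(1)
n, phi = 1_000, 0.8
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()   # AR(1): present value depends on the past

def acf(series, lag):
    """Sample autocorrelation at a given lag."""
    centered = series - series.mean()
    return np.dot(centered[:-lag], centered[lag:]) / np.dot(centered, centered)

for lag in (1, 2, 5, 10):
    print(f"lag {lag:2d}: acf = {acf(x, lag):.3f}")   # decays roughly as phi**lag
```

If the sample autocorrelation is negligible at all lags, past values carry little predictive information and a model built on them will not forecast well.

I hope this helps.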
There is also a large class of nonlinear time series analysis methods, suitable for coupling and synchronization analysis, testing for regime changes, classifying dynamics, etc. I can suggest the book Nonlinear Time Series Analysis by Holger Kantz and Thomas Schreiber as a good starting point.
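A construction that underlies many of the methods in that book is time-delay embedding, which unfolds a scalar series into vectors that reconstruct the underlying phase-space dynamics. Here is a minimal sketch; the embedding dimension, delay, and noisy-sine input are illustrative assumptions, not examples taken from the book:

```python
import numpy as np

def delay_embed(x, dim, tau):
    """Return delay vectors [x(t), x(t + tau), ..., x(t + (dim - 1) * tau)]."""
    n = len(x) - (dim - 1) * tau
    return np.column_stack([x[i * tau : i * tau + n] for i in range(dim)])

# A noisy sine stands in for an observed scalar time series
t = np.linspace(0, 20 * np.pi, 2_000)
x = np.sin(t) + 0.05 * np.random.default_rng(2).normal(size=t.size)

vectors = delay_embed(x, dim=3, tau=25)
print(vectors.shape)   # (n_vectors, 3): points on the reconstructed attractor
```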
You may want to consider the recursive least squares (RLS) algorithm. I have estimated the parameters of the (usual) KLa correlation, which describes oxygen mass transfer from air to broth in aerobic fermenters, using RLS with a forgetting factor. Such real-time estimates allow for model-based energetic optimization and adaptive control of the fermenter. Useful references for this algorithm are cited in the thesis below (cf. pp. 36-37):
Thesis: Controlo do Oxigénio Dissolvido em Fermentadores para Minimi...
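For reference, here is a minimal sketch of recursive least squares with a forgetting factor; the regressors and measurements are synthetic stand-ins, not the KLa correlation or data from the thesis:

```python
import numpy as np

def rls_update(theta, P, x, y, lam=0.98):
    """One RLS step with forgetting factor lam: update estimate theta and covariance P."""
    x = x.reshape(-1, 1)
    k = P @ x / (lam + x.T @ P @ x)                   # gain vector
    theta = theta + (k * (y - x.T @ theta)).ravel()   # correct by the prediction error
    P = (P - k @ x.T @ P) / lam                       # discount old information
    return theta, P

rng = np.random.default_rng(3)
true_theta = np.array([2.0, -1.0, 0.5])
theta = np.zeros(3)
P = 1e3 * np.eye(3)            # large initial covariance = little confidence in theta

for _ in range(500):
    x = rng.normal(size=3)
    y = x @ true_theta + 0.1 * rng.normal()   # noisy measurement
    theta, P = rls_update(theta, P, x, y)

print(theta)   # converges close to true_theta
```

A forgetting factor below 1 weights recent samples more heavily, which is what lets the estimates track slowly varying parameters in real time.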