I know that Kalman filtering does time series prediction, so can we say that there is a relationship between the Kalman filter and polynomial regression? Or are they entirely different?
1. There is a difference in terms of optimality criteria
The Kalman filter is a linear optimal estimator - i.e. it infers model parameters of interest from indirect, inaccurate and uncertain observations.
But optimal in what sense? If all noise is Gaussian, the Kalman filter minimizes the mean squared error of the estimated parameters. This means that when the underlying noise is NOT Gaussian, the promise no longer holds. In the case of nonlinear dynamics, the state estimation problem is well known to be difficult, and no single filtering scheme clearly outperforms all other strategies. In such cases, nonlinear estimators may do better if they can model the system more accurately using additional information. [See Refs 1-2]
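To make the "linear optimal estimator" idea concrete, here is a minimal sketch of a scalar Kalman filter estimating a constant state from noisy Gaussian measurements. All model parameters (`R`, `Q`, the true value, the noise level) are illustrative assumptions, not from the original answer:

```python
import numpy as np

def kalman_constant(measurements, R=1.0, Q=1e-5, x0=0.0, P0=1.0):
    """Scalar Kalman filter for a constant state observed with noise.

    R: measurement noise variance, Q: process noise variance,
    x0/P0: initial state estimate and its variance (all assumed values).
    """
    x, P = x0, P0
    estimates = []
    for z in measurements:
        P = P + Q            # predict: state is constant, add process noise
        K = P / (P + R)      # Kalman gain
        x = x + K * (z - x)  # update estimate with measurement residual
        P = (1 - K) * P      # update estimate variance
        estimates.append(x)
    return np.array(estimates)

rng = np.random.default_rng(0)
true_value = 5.0
zs = true_value + rng.normal(0.0, 1.0, size=200)  # noisy observations
est = kalman_constant(zs)
# est[-1] converges toward true_value as measurements accumulate
```

Under the Gaussian-noise assumption, the filter's estimate is the minimum mean-squared-error estimate; with non-Gaussian noise it is only the best *linear* estimator.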
Polynomial regression is a form of linear regression in which the relationship between the independent variable x and the dependent variable y is modeled as an nth-order polynomial, e.g. the second-order model
$$y = a_0 + a_1 x + a_2 x^2 + \epsilon$$
Note that, while polynomial regression fits a nonlinear model to the data, these models are all linear from the point of view of estimation, since the regression function is linear in the unknown parameters $a_0, a_1, a_2$. If we treat $x$ and $x^2$ as different variables, polynomial regression can also be treated as multiple linear regression.
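This "linear in the parameters" point can be sketched directly: build a design matrix with columns 1, x, and x², then solve an ordinary least-squares problem, exactly as in multiple linear regression. The data and true coefficients below are made up for illustration:

```python
import numpy as np

# Generate synthetic data from an assumed quadratic model plus noise.
rng = np.random.default_rng(1)
x = np.linspace(-3, 3, 100)
y = 2.0 + 0.5 * x - 1.5 * x**2 + rng.normal(0.0, 0.1, size=x.size)

# Treat x and x^2 as two separate regressors: design matrix [1, x, x^2].
X = np.column_stack([np.ones_like(x), x, x**2])

# Ordinary least squares recovers a0, a1, a2 - a linear estimation problem.
coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
```

Even though the fitted curve is a parabola in x, the estimation step is a single linear solve, which is why polynomial regression falls under linear regression.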
Polynomial regression models are usually fit using the method of least squares, which also minimizes the mean squared error. The least-squares method minimizes the variance of the unbiased estimators of the coefficients, under the conditions of the Gauss-Markov theorem. This theorem states that ordinary least squares (OLS), or linear least squares, is the Best Linear Unbiased Estimator (BLUE) under the following conditions: