I'm trying to build a VAR(p) model with autocorrelation, even though one of the assumptions of the VAR model is that the error term's autocorrelation should be zero. Is there a way to build such a model while ignoring that assumption?
The p in VAR(p) stands for the number of lags modelled, so the model itself already captures short-run autocorrelation in the output vector, subject to stability (a stable VAR produces stationary outputs; it is the errors, not the outputs, that are assumed to be serially uncorrelated).
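For concreteness, here is a minimal sketch using statsmodels; the simulated two-variable series and the coefficient values are placeholders for illustration, not from the question. The fitted VAR's outputs are autocorrelated even though the innovations `e` are white noise, and the whiteness test checks whether the residuals still carry autocorrelation:

```python
import numpy as np
import pandas as pd
from statsmodels.tsa.api import VAR

# Simulate two cross-dependent, autocorrelated series (illustrative only;
# in practice replace `data` with your own series).
rng = np.random.default_rng(0)
n = 500
e = rng.standard_normal((n, 2))          # white-noise innovations
x = np.zeros((n, 2))
for t in range(1, n):
    # Each variable depends on its own lag and on the other's lag:
    # exactly the short-run autocorrelation a VAR captures.
    x[t] = 0.5 * x[t - 1] + 0.2 * x[t - 1, ::-1] + e[t]

data = pd.DataFrame(x, columns=["y1", "y2"])

model = VAR(data)
results = model.fit(maxlags=8, ic="aic")  # lag order chosen by AIC
print("selected lag order:", results.k_ar)

# Test whether the *residuals* are still autocorrelated
# (this is the assumption the question asks about):
white = results.test_whiteness(nlags=12)
print(f"Portmanteau test p-value: {white.pvalue:.3f}")
```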
For more "random walk" behaviour you could use a VARMA (vector ARMA) model, which adds a moving average component to the errors.
Calibration and overfitting are still potentially tricky problems, but essentially the "trend" is captured first by the moving average part of the model, and the short-run autocorrelation is then estimated by regressing the detrended data on its lags.
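A minimal sketch of the VARMA route, again using statsmodels; the order (1, 1) and the reuse of the simulated `data` above are assumptions for illustration, and in practice you would select the AR and MA orders via information criteria or residual diagnostics:

```python
from statsmodels.tsa.statespace.varmax import VARMAX

# VARMA(1, 1): one autoregressive lag plus a first-order moving
# average term on the errors, estimated by maximum likelihood.
varma = VARMAX(data, order=(1, 1))
varma_results = varma.fit(disp=False)  # disp=False silences optimizer output
print(varma_results.summary())
```

Note that VARMA estimation is numerically harder than plain VAR (it is fit by state-space maximum likelihood rather than equation-by-equation OLS), which is part of why calibration and overfitting remain tricky.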