I have 27 features and am trying to predict continuous values. When I calculated the Variance Inflation Factors (VIF), only 8 features were below 10; the remaining features range from 10 to 250, so I am facing a multicollinearity issue.

My work is guided by two aims:

1- To predict the target values using regression algorithms (ML models).

2- To determine the importance of the features (i.e., interpret the ML models).

A variety of machine learning algorithms have been applied, including Ridge, Lasso, Elastic Net, Random Forest Regressor, Gradient Boosting Regressor, and Multiple Linear Regression.

Random Forest Regressor and Gradient Boosting Regressor show the best performance (lowest RMSE) while using only 10 of the 27 features, selected based on the feature-importance results.
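This is the kind of selection I did, sketched on synthetic data (`make_regression` here is only a stand-in for my dataset; the real one has 27 features and 10 informative ones is an assumption for the example):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor

# Hypothetical data mirroring my setup: 27 features, 10 of them informative
X, y = make_regression(n_samples=300, n_features=27, n_informative=10, random_state=0)

rf = RandomForestRegressor(n_estimators=200, random_state=0).fit(X, y)

# Rank features by impurity-based importance and keep the top 10
top10 = np.argsort(rf.feature_importances_)[::-1][:10]
X_reduced = X[:, top10]
print(top10, X_reduced.shape)
```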

As I understand it, multicollinearity can be addressed using regularized regression models such as Lasso. However, when I applied Lasso, the evaluation results were not as good as those of the Random Forest Regressor and Gradient Boosting Regressor, and none of the Lasso coefficients became zero when I inspected the feature importances.
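For reference, this is how I understand Lasso is typically applied, with features standardized and the penalty strength `alpha` chosen by cross-validation (again on synthetic stand-in data); my understanding is that if no coefficients shrink to zero, the CV-selected penalty may simply be weak:

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical stand-in data: 27 features, 10 informative
X, y = make_regression(n_samples=300, n_features=27, n_informative=10,
                       noise=5.0, random_state=0)

# Standardize before Lasso so the penalty treats features comparably;
# LassoCV searches a grid of alphas with 5-fold cross-validation
pipe = make_pipeline(StandardScaler(), LassoCV(cv=5, random_state=0)).fit(X, y)
lasso = pipe[-1]

n_zero = int(np.sum(lasso.coef_ == 0))
print(f"chosen alpha={lasso.alpha_:.4f}, zeroed coefficients: {n_zero} / {len(lasso.coef_)}")
```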

Moreover, I want to analyse which features affect my target value, and I do not want to omit any of my features.

I was wondering if anyone could help me determine which of these algorithms would be best to use, and why?
