11 June 2024

Can Shapley (SHAP) values be used to explain the importance of the different features fed to a neural network? I know they are used in traditional ML on tabular data.
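For context, the quantity SHAP estimates is model-agnostic: a feature's Shapley value is its average marginal contribution to the prediction over all coalitions of the other features, so nothing in the definition restricts it to traditional tabular models. Below is a minimal sketch of that exact computation applied to a toy stand-in for a network (the `tiny_net` function and all names are illustrative, not part of the shap library; real SHAP tooling approximates this sum rather than enumerating every coalition):

```python
from itertools import combinations
from math import factorial

def shapley_values(f, x, baseline):
    """Exact Shapley values for the features of model f at input x.
    Features outside a coalition are replaced by their baseline values."""
    n = len(x)
    phi = [0.0] * n
    for i in range(n):
        others = [j for j in range(n) if j != i]
        for size in range(n):
            for S in combinations(others, size):
                # Classic Shapley weight: |S|! (n - |S| - 1)! / n!
                weight = factorial(size) * factorial(n - size - 1) / factorial(n)
                with_i = [x[j] if (j in S or j == i) else baseline[j] for j in range(n)]
                without_i = [x[j] if j in S else baseline[j] for j in range(n)]
                phi[i] += weight * (f(with_i) - f(without_i))
    return phi

# Hypothetical toy "network": one hidden ReLU unit plus a linear term.
def tiny_net(x):
    return max(0.0, 2 * x[0] + x[1]) + 0.5 * x[2]

vals = shapley_values(tiny_net, x=[1.0, 1.0, 1.0], baseline=[0.0, 0.0, 0.0])
# Shapley values satisfy the efficiency property:
# sum(vals) == tiny_net(x) - tiny_net(baseline)
```

The enumeration above is exponential in the number of features, which is why practical explainers (KernelExplainer, and gradient-based variants for deep models) sample coalitions or exploit the network's structure instead.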
