The reasons could lie in the control parameters of XGBoost, or in those of the GA and PSO algorithms, such as the values of Pm and Pc (mutation and crossover probabilities), or C1, C2, and w (acceleration coefficients and inertia weight).
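To make the role of C1, C2, and w concrete, here is a minimal sketch of a single-particle PSO update (the parameter values below are typical textbook defaults, not the ones used in any particular study):

```python
import random

# Inertia weight (w) and acceleration coefficients (c1, c2) control how
# strongly a particle keeps its momentum vs. pulls toward its personal
# best (pbest) and the swarm's global best (gbest).
w, c1, c2 = 0.7, 1.5, 1.5

def pso_step(x, v, pbest, gbest):
    """One PSO update: velocity blends inertia, cognitive, and social terms."""
    r1, r2 = random.random(), random.random()
    v_new = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
    return x + v_new, v_new

# Toy run: minimize f(x) = x**2 starting from x = 5 with gbest fixed at 0.
x, v = 5.0, 0.0
pbest, gbest = 5.0, 0.0
for _ in range(50):
    x, v = pso_step(x, v, pbest, gbest)
    if x * x < pbest * pbest:
        pbest = x
print(pbest)
```

Poorly chosen w, C1, C2 can make the swarm diverge or stagnate, which is one way the optimizer itself (rather than XGBoost) can cause weak results.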
Also, it is quite common for a machine learning model to underperform other models on a specific case study; perhaps GA and PSO simply did not help XGBoost improve its performance here.
By the way, what is the purpose of using GA/PSO? What is your optimization problem? Is it feature selection?
1) What do x and y represent? Number of features and accuracy?
Ans: x is the output value; y is the number of observations.
2) What is the reason behind using GA and PSO specifically for feature selection?
Ans: Sorry, it is PCA, not PSO. The aim is to compare the efficiency of both (on the regression-variables dataset); PCA serves as the benchmark because it is used so frequently in the literature.
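For reference, the PCA benchmark step can be sketched with plain NumPy: center the regression variables, take the top-k principal directions via SVD, and feed the reduced matrix to the downstream model. The data shape and k below are made up for illustration:

```python
import numpy as np

# Illustrative data: 100 observations of 10 regression variables.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 10))

# Center, then project onto the top-k principal directions (SVD-based PCA).
Xc = X - X.mean(axis=0)
U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
k = 3
X_reduced = Xc @ Vt[:k].T        # (100, 3): inputs for the benchmark model

# Fraction of variance retained by the k components.
explained = float((S[:k] ** 2).sum() / (S ** 2).sum())
print(X_reduced.shape, round(explained, 3))
```

Unlike GA-based feature selection, PCA returns linear combinations of the original variables rather than a subset of them, which is worth noting when comparing the two.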
3) Were you interested in using GA for feature selection or hyperparameter optimization?
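If the answer is feature selection, a GA typically encodes each candidate as a binary mask over the features. A toy sketch (the fitness function here is a made-up stand-in; in practice it would be the cross-validated score of XGBoost trained on the selected features):

```python
import random

random.seed(0)
N_FEATURES, POP, GENS = 8, 20, 30
Pc, Pm = 0.8, 0.05            # crossover and mutation probabilities (typical values)
USEFUL = {0, 2, 5}            # pretend only these features carry signal

def fitness(mask):
    # Proxy score: reward useful features, lightly penalize mask size.
    return sum(1 for i in USEFUL if mask[i]) - 0.2 * sum(mask)

def crossover(a, b):
    # Single-point crossover applied with probability Pc.
    if random.random() < Pc:
        cut = random.randrange(1, N_FEATURES)
        return a[:cut] + b[cut:]
    return a[:]

def mutate(mask):
    # Flip each bit independently with probability Pm.
    return [1 - g if random.random() < Pm else g for g in mask]

pop = [[random.randint(0, 1) for _ in range(N_FEATURES)] for _ in range(POP)]
for _ in range(GENS):
    pop.sort(key=fitness, reverse=True)
    elite = pop[: POP // 2]   # keep the better half, breed the rest from it
    children = [mutate(crossover(random.choice(elite), random.choice(elite)))
                for _ in range(POP - len(elite))]
    pop = elite + children

best = max(pop, key=fitness)
print(best, fitness(best))
```

The same loop works for hyperparameter optimization by swapping the binary mask for a vector of hyperparameter values, so it is worth stating which of the two the GA was used for.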