I was wondering: in SPSS, when working with the general linear model, is it possible to select the entry method of variables? I want it to be backward, but I can't seem to find it.
Maybe it's worth trying to force the REGRESSION subcommand '/METHOD=BACKWARD' into your GLM syntax and seeing what happens (sometimes, when one function under SPSS's hood is built to inherit from another, these things miraculously work). But since the METHOD subcommand in GLM appears to be reserved for selecting the sum-of-squares method, this will very likely produce an error message (https://www.ibm.com/docs/en/spss-statistics/24.0.0?topic=univariate-method-subcommand-glm-command).
btw: problems like these are why I switched to R some time ago - something I can recommend for anyone doing statistics!
If you have only a small number of variables to evaluate, you could run the full model, look at the parameter estimates (with CIs) for the IVs of interest, decide whether to jettison the IV with the "weakest" contribution based on whatever criterion you embrace, and then re-run the reduced model. Continue until all remaining IVs meet your criterion for retention, or until you've given up on all of the IVs.
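If you want to automate that loop outside SPSS, here's a minimal numpy sketch of manual backward elimination. I'm assuming a |t| >= 2 retention rule purely as a rough stand-in for "p < .05 stays in" — substitute whatever criterion you actually embrace. The data, variable names, and the `backward_eliminate` helper are all made up for illustration:

```python
import numpy as np

def backward_eliminate(X, y, names, t_min=2.0):
    """Fit OLS on all current predictors, drop the one with the smallest
    |t|, refit, and repeat until every surviving predictor has
    |t| >= t_min. An intercept is added internally and never dropped."""
    keep = list(range(X.shape[1]))
    while keep:
        Xk = np.column_stack([np.ones(len(y)), X[:, keep]])
        beta, *_ = np.linalg.lstsq(Xk, y, rcond=None)
        resid = y - Xk @ beta
        df = len(y) - Xk.shape[1]
        sigma2 = (resid @ resid) / df
        se = np.sqrt(np.diag(sigma2 * np.linalg.inv(Xk.T @ Xk)))
        t = np.abs(beta / se)[1:]          # skip the intercept
        weakest = int(np.argmin(t))
        if t[weakest] >= t_min:
            break                          # all survivors meet the criterion
        keep.pop(weakest)
    return [names[i] for i in keep]

# Invented demo data: only x1 and x3 actually drive y.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 4))
y = 3 * X[:, 0] - 2 * X[:, 2] + rng.normal(size=200)
print(backward_eliminate(X, y, ["x1", "x2", "x3", "x4"]))
```

With strong signals like these, the two genuine predictors survive while the noise variables are typically pruned — though, as noted below, nothing guarantees this procedure lands on the best subset.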
By the way, backward elimination doesn't guarantee that you'll wind up with the best ensemble of IVs in a model. Have a look at some more modern methods of variable selection (such as lasso).
Be warned that the p values printed for these methods are likely not what you want. Several packages implement the lasso; a popular one is glmnet (https://glmnet.stanford.edu/articles/glmnet.html), which is available in R/S, Python, and Matlab. If I remember correctly, SPSS used to ship code for ridge regression, which constrains the sum of the squared betas rather than the sum of their absolute values, and so lacks some useful properties of the lasso (like betas going exactly to zero). A great book on the lasso and related techniques is freely available at: https://hastie.su.domains/StatLearnSparsity_files/SLS_corrected_1.4.16.pdf
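To see the "betas going exactly to zero" property concretely, here's a minimal pure-numpy coordinate-descent sketch of the lasso. This is a toy illustration on invented data, not a substitute for glmnet (which adds standardization, a penalty path, cross-validation, etc.); the `lasso_cd` name and the penalty value are my own choices:

```python
import numpy as np

def lasso_cd(X, y, lam, n_iter=500):
    """Lasso via cyclic coordinate descent: minimizes
    (1/2n)*||y - Xb||^2 + lam*||b||_1. The soft-threshold step is
    what lets coefficients hit exactly zero, unlike ridge."""
    n, p = X.shape
    b = np.zeros(p)
    col_ss = (X ** 2).sum(axis=0) / n
    for _ in range(n_iter):
        for j in range(p):
            r_j = y - X @ b + X[:, j] * b[j]       # partial residual
            rho = X[:, j] @ r_j / n
            # soft-thresholding: small effects are set to exactly 0
            b[j] = np.sign(rho) * max(abs(rho) - lam, 0.0) / col_ss[j]
    return b

# Invented demo data: only the first of five predictors matters.
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 5))
y = 4 * X[:, 0] + rng.normal(size=100)
b = lasso_cd(X, y, lam=0.5)
print(b)  # noise coefficients end up exactly zero; the signal survives, shrunken
```

Swapping the soft-threshold line for a plain shrinkage factor turns this into ridge, where the noise coefficients would be small but never exactly zero.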