Hello,

I'm writing a paper and used several optimizers to train a model. I switched between them during training to get out of local minima, and I know that people do that, but I don't know how to refer to that technique in the paper. Does it even have a name?

It is like simulated annealing in optimization, but instead of varying the temperature (step size) we switch between optimizers such as Adam, SGD and RMSprop. I can say for sure that in my experiments it gave very good results.
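To make the question concrete, here is a minimal sketch of what I mean (PyTorch; the model, the dummy data and the switch epochs 30/60 are placeholders for illustration only, not my actual setup):

```python
import torch
import torch.nn as nn

model = nn.Linear(10, 1)   # placeholder model
loss_fn = nn.MSELoss()

# The optimizers I rotate between during training.
optimizers = {
    "adam":    torch.optim.Adam(model.parameters(), lr=1e-3),
    "sgd":     torch.optim.SGD(model.parameters(), lr=1e-2, momentum=0.9),
    "rmsprop": torch.optim.RMSprop(model.parameters(), lr=1e-3),
}
# Hypothetical schedule: which optimizer becomes active at which epoch.
schedule = {0: "adam", 30: "sgd", 60: "rmsprop"}

current = optimizers["adam"]
for epoch in range(90):
    if epoch in schedule:
        current = optimizers[schedule[epoch]]   # <-- the switch I'm asking about
    for x, y in [(torch.randn(32, 10), torch.randn(32, 1))]:   # dummy batch
        current.zero_grad()
        loss = loss_fn(model(x), y)
        loss.backward()
        current.step()
```

So the question is purely about what to call this switching step, not about how to implement it.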

P.S. Thank you for the replies, but learning rate scheduling is about changing the learning rate, and optimizer scheduling is about other optimizer parameters; in general that is hyperparameter tuning. What I'm asking about is switching between optimizers, not modifying their parameters.

Thanks for your support,

Andrius Ambrutis
