We are working on hyperparameter optimization of neural networks and, to gain more exposure, we need the latest optimization techniques for our experiments. Could anyone kindly suggest recent optimization algorithms?
Many techniques have been proposed for hyper-parameter optimisation in neural networks. In practice, Grid Search, Random Search, and Bayesian Optimisation are the most widely used.
Grid Search is practical only when a small number of parameters need to be optimised, because it exhaustively evaluates every combination of parameter values, which becomes very time-consuming as the grid grows.
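As a minimal sketch of what this looks like in practice (the `train_and_score` objective below is a stand-in so the code runs end-to-end; in a real experiment it would train the network and return a validation score):

```python
from itertools import product

param_grid = {
    "learning_rate": [1e-3, 1e-2, 1e-1],
    "batch_size":    [32, 64, 128],
    "hidden_units":  [64, 128],
}

def train_and_score(params):
    # Stand-in objective; replace with training the network on
    # `params` and returning a validation score.
    return -abs(params["learning_rate"] - 0.01) - abs(params["batch_size"] - 64) / 1000

best_score, best_params = float("-inf"), None
keys = list(param_grid)
for values in product(*(param_grid[k] for k in keys)):  # 3 x 3 x 2 = 18 trials
    params = dict(zip(keys, values))
    score = train_and_score(params)
    if score > best_score:
        best_score, best_params = score, params

print(best_params, best_score)
```

Note how the trial count multiplies with every added parameter, which is exactly why Grid Search scales poorly.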
Random Search is a step ahead of Grid Search: rather than trying every combination, it samples each parameter's value from a specified distribution (often uniform) and keeps the best configuration found within a fixed budget of trials.
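A corresponding sketch, again with a stand-in objective; the budget (20 trials here) stays fixed no matter how many parameters are searched:

```python
import random

def train_and_score(params):
    # Stand-in objective; replace with real training + validation.
    return -abs(params["learning_rate"] - 0.01) - abs(params["hidden_units"] - 128) / 1000

random.seed(0)
best_score, best_params = float("-inf"), None
for _ in range(20):  # fixed budget, independent of the number of parameters
    params = {
        "learning_rate": 10 ** random.uniform(-4, -1),  # log-uniform sample
        "batch_size":    random.choice([32, 64, 128]),
        "hidden_units":  random.randint(32, 256),
    }
    score = train_and_score(params)
    if score > best_score:
        best_score, best_params = score, params

print(best_params, best_score)
```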
Bayesian optimisation is grounded in Bayesian theory: it places a prior over the objective as a function of the hyper-parameters, updates this belief with every evaluation, and uses the resulting probability estimates to choose the most promising hyper-parameter values to try next.
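One widely used implementation is the Gaussian-process-based `gp_minimize` from scikit-optimize; the sketch below assumes scikit-optimize is installed and uses a toy objective in place of a real training run (note that `gp_minimize` minimises, so the objective should return a loss):

```python
from skopt import gp_minimize
from skopt.space import Real, Integer

# The Gaussian-process surrogate models the objective over these dimensions.
space = [
    Real(1e-4, 1e-1, prior="log-uniform", name="learning_rate"),
    Integer(32, 256, name="hidden_units"),
]

def objective(params):
    learning_rate, hidden_units = params
    # Toy loss standing in for validation loss after training.
    return (learning_rate - 0.01) ** 2 + ((hidden_units - 128) / 256) ** 2

result = gp_minimize(objective, space, n_calls=25, random_state=0)
print(result.x, result.fun)  # best hyper-parameters and best loss
```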
All of these methods select the best hyper-parameters from the set of candidate values you supply for each one. Evolutionary hyper-parameter optimisation, by contrast, can evolve new hyper-parameter combinations through mutation and crossover.
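A bare-bones evolutionary loop over hyper-parameters might look like the following sketch (mutation only, no crossover; the objective is again a stand-in):

```python
import random

random.seed(0)

def train_and_score(params):
    # Stand-in objective; replace with real training + validation.
    return -abs(params["learning_rate"] - 0.01) - abs(params["hidden_units"] - 128) / 1000

def random_params():
    return {"learning_rate": 10 ** random.uniform(-4, -1),
            "hidden_units":  random.randint(32, 256)}

def mutate(params):
    # Perturb one hyper-parameter, producing a combination that may
    # never have appeared in any predefined grid.
    child = dict(params)
    if random.random() < 0.5:
        child["learning_rate"] *= 10 ** random.uniform(-0.5, 0.5)
    else:
        child["hidden_units"] = max(32, min(256, child["hidden_units"] + random.randint(-32, 32)))
    return child

population = [random_params() for _ in range(10)]
for generation in range(15):
    population.sort(key=train_and_score, reverse=True)
    parents = population[:5]                          # keep the fittest half
    children = [mutate(random.choice(parents)) for _ in range(5)]
    population = parents + children
print(max(population, key=train_and_score))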
Many such approaches have been proposed over the years; the most popular is NEAT, and variants such as Hyper-NEAT and HA-NEAT have also been introduced for hyper-parameter optimisation. This application of evolutionary algorithms to neural networks, evolving topologies, weights, activation functions, and other hyper-parameters, is known as Neuroevolution.
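As a toy illustration of the idea, far simpler than NEAT (which also evolves the network topology), the sketch below evolves only the weights of a fixed 2-4-1 network on the XOR task by mutation and selection:

```python
import numpy as np

# XOR task: a classic minimal benchmark for neuroevolution demos.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0.0, 1.0, 1.0, 0.0])

def forward(w, x):
    # 2-4-1 network flattened into a single 17-element parameter vector.
    W1, b1 = w[:8].reshape(2, 4), w[8:12]
    W2, b2 = w[12:16], w[16]
    h = np.tanh(x @ W1 + b1)
    return 1 / (1 + np.exp(-(h @ W2 + b2)))   # sigmoid output

def fitness(w):
    preds = np.array([forward(w, x) for x in X])
    return -np.mean((preds - y) ** 2)          # higher is better

rng = np.random.default_rng(0)
population = [rng.normal(0, 1, 17) for _ in range(20)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    parents = population[:5]
    # Offspring are mutated copies of parents (no crossover here).
    offspring = [p + rng.normal(0, 0.2, 17) for p in parents for _ in range(3)]
    population = parents + offspring
best = max(population, key=fitness)
print([round(float(forward(best, x)), 2) for x in X])  # should approach 0, 1, 1, 0
```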