I'm working on improving the performance of neural networks. I need the best, state-of-the-art search techniques for finding an optimal activation function.
In my opinion this strongly depends on the type of architecture and the layers you want to combine. Activation functions also behave differently with respect to their gradients (first derivatives), resulting in different gradient flows and different convergence behaviour. I would recommend searching the literature for similar networks and the activations they used. It is largely a matter of experience.
Dear Vijayaprabakaran Kothandapani, what you really need is an automated mechanism for searching for the best combination of activation functions and layer architecture for your specific problem. Two approaches have been used recently:
Neural Architecture Search (NAS) - NAS is an algorithm that searches for the best neural network architecture. In the NAS algorithm, a controller Recurrent Neural Network (RNN) samples building blocks and puts them together to create some kind of end-to-end architecture. This architecture generally embodies the same style as state-of-the-art networks, such as ResNets or DenseNets, but uses a different combination and configuration of the blocks. The new network is then trained to convergence to obtain an accuracy on a held-out validation set. The resulting accuracies are used to update the controller so that it generates better architectures over time, perhaps by selecting better blocks or making better connections. See here for more details: https://towardsdatascience.com/everything-you-need-to-know-about-automl-and-neural-architecture-search-8db1863682bf
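Before reaching for a full NAS controller, the sample-train-score loop above can be illustrated with a much simpler baseline: random search over a fixed set of candidate activations. This is only a minimal sketch, not a real NAS implementation; the toy task, network size, and hyperparameters are all my own assumptions for illustration.

```python
# Minimal sketch: search over candidate activation functions by
# training a tiny one-hidden-layer network on a synthetic 2-class
# task and keeping the activation with the best held-out accuracy.
# (Toy setup is hypothetical, not from any specific NAS paper.)
import numpy as np

rng = np.random.default_rng(0)

# Candidate activations and their derivatives (the "building blocks").
CANDIDATES = {
    "relu":    (lambda z: np.maximum(z, 0.0),
                lambda z: (z > 0).astype(float)),
    "tanh":    (np.tanh,
                lambda z: 1.0 - np.tanh(z) ** 2),
    "sigmoid": (lambda z: 1.0 / (1.0 + np.exp(-z)),
                lambda z: (s := 1.0 / (1.0 + np.exp(-z))) * (1.0 - s)),
}

def make_data(n=400):
    # Two-dimensional Gaussian inputs; label = sign of the coordinate sum.
    x = rng.normal(size=(n, 2))
    y = (x[:, 0] + x[:, 1] > 0).astype(float)
    return x, y

def train_and_score(act_name, x_tr, y_tr, x_va, y_va,
                    hidden=8, epochs=200, lr=0.1):
    f, df = CANDIDATES[act_name]
    w1 = rng.normal(scale=0.5, size=(2, hidden)); b1 = np.zeros(hidden)
    w2 = rng.normal(scale=0.5, size=hidden);      b2 = 0.0
    for _ in range(epochs):
        z = x_tr @ w1 + b1
        h = f(z)
        p = 1.0 / (1.0 + np.exp(-(h @ w2 + b2)))  # output sigmoid
        g = (p - y_tr) / len(y_tr)                # dL/dlogit for BCE loss
        w2 -= lr * (h.T @ g); b2 -= lr * g.sum()
        gh = np.outer(g, w2) * df(z)              # backprop into hidden layer
        w1 -= lr * (x_tr.T @ gh); b1 -= lr * gh.sum(axis=0)
    # Score on the held-out split.
    z = x_va @ w1 + b1
    p = 1.0 / (1.0 + np.exp(-(f(z) @ w2 + b2)))
    return ((p > 0.5) == (y_va > 0.5)).mean()

x, y = make_data()
x_tr, y_tr, x_va, y_va = x[:300], y[:300], x[300:], y[300:]
scores = {name: train_and_score(name, x_tr, y_tr, x_va, y_va)
          for name in CANDIDATES}
best = max(scores, key=scores.get)
print(best, scores[best])
```

A real NAS system replaces the uniform sampling with a learned controller (e.g. an RNN updated by the validation accuracies), but the evaluate-and-compare loop is the same.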
AutoML: Google's answer to NAS. Instead of designing complex deep networks yourself, you just run a preset NAS algorithm. Google recently took this to the extreme with Cloud AutoML: upload your data and Google's NAS algorithm will find you an architecture, quick and easy. See here: https://cloud.google.com/automl/
Do you want the code for a NAS implementation instead of trying to use AutoML on Google's cloud?
Below are some implementations:
A collection of repositories on GitHub: https://github.com/topics/neural-architecture-search
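Closely related to NAS is searching the space of activation functions themselves, where candidates are built by composing simple unary and binary primitives (for example, mul(identity, sigmoid) recovers Swish). The sketch below samples such compositions at random and ranks them with a crude gradient-flow proxy; the primitive set, the proxy score, and all names here are my own assumptions for illustration, not a published algorithm.

```python
# Hedged sketch: random search over composed activation functions.
# Candidates have the form binary(unary1(x), unary2(x)); they are
# ranked by a hypothetical proxy that prefers activations whose
# gradient neither vanishes nor explodes through a deep stack.
import numpy as np

rng = np.random.default_rng(1)

UNARY = {
    "identity": lambda z: z,
    "tanh": np.tanh,
    "sigmoid": lambda z: 1.0 / (1.0 + np.exp(-z)),
    "relu": lambda z: np.maximum(z, 0.0),
}
BINARY = {
    "mul": lambda a, b: a * b,
    "add": lambda a, b: a + b,
}

def sample_candidate():
    # e.g. mul(identity, sigmoid) is the Swish activation.
    u1 = rng.choice(list(UNARY)); u2 = rng.choice(list(UNARY))
    b = rng.choice(list(BINARY))
    name = f"{b}({u1}, {u2})"
    fn = lambda z, u1=u1, u2=u2, b=b: BINARY[b](UNARY[u1](z), UNARY[u2](z))
    return name, fn

def gradient_flow_score(fn, depth=10, eps=1e-4):
    # Numerically estimate |d(f^depth)/dx| on random inputs; score is
    # highest (closest to 0) when that derivative stays near 1.
    z = rng.normal(size=1000)
    a, b = z.copy(), z + eps
    with np.errstate(over="ignore"):
        for _ in range(depth):
            a = np.clip(fn(a), -1e6, 1e6)  # clip to avoid overflow blowups
            b = np.clip(fn(b), -1e6, 1e6)
        grad = np.abs(b - a) / eps
    return -abs(float(np.log(grad.mean() + 1e-12)))

candidates = [sample_candidate() for _ in range(10)]
scored = {name: gradient_flow_score(fn) for name, fn in candidates}
best = max(scored, key=scored.get)
print(best, scored[best])
```

In practice the winning candidates from such a search would still be validated the expensive way, by training real networks with them, as the implementations linked above do.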