There are two ways to integrate a metaheuristic algorithm with a neural network (NN). The first is to use the metaheuristic to find good initial weights, or initial parameters of the NN such as mu and sigma in an RBF network, and to let backpropagation (BP) find the rest. The second is to let the metaheuristic find all of the NN's parameters. If you are looking for a good metaheuristic, instead of PSO, Cuckoo Search, etc., I suggest the Stochastic Fractal Search (SFS) algorithm. In my paper, SFS outperformed the well-known algorithms it was compared against. You can find the algorithm and its MATLAB code freely on the internet.
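To make the second option concrete, here is a minimal Python sketch of PSO searching all weights of a tiny one-hidden-layer network directly. The toy sin(x) regression task, the network size, and the PSO constants are illustrative assumptions, not taken from any specific paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy regression data: learn y = sin(x)
X = np.linspace(-3, 3, 40).reshape(-1, 1)
y = np.sin(X)

H = 8                            # hidden units (an arbitrary choice)
DIM = 1 * H + H + H * 1 + 1      # weights + biases of a 1-H-1 MLP

def mlp_forward(theta, X):
    """Unpack a flat parameter vector and run the 1-H-1 network."""
    W1 = theta[:H].reshape(1, H)
    b1 = theta[H:2 * H]
    W2 = theta[2 * H:3 * H].reshape(H, 1)
    b2 = theta[3 * H]
    return np.tanh(X @ W1 + b1) @ W2 + b2

def fitness(theta):
    """Mean squared error -- the quantity PSO minimizes."""
    return np.mean((mlp_forward(theta, X) - y) ** 2)

# Plain global-best PSO over the flattened weight vector
N_PART, ITERS = 30, 300
w, c1, c2 = 0.72, 1.49, 1.49     # commonly used PSO constants
pos = rng.normal(0, 1, (N_PART, DIM))
vel = np.zeros((N_PART, DIM))
pbest = pos.copy()
pbest_f = np.array([fitness(p) for p in pos])
gbest = pbest[pbest_f.argmin()].copy()

for _ in range(ITERS):
    r1, r2 = rng.random((N_PART, DIM)), rng.random((N_PART, DIM))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos += vel
    f = np.array([fitness(p) for p in pos])
    improved = f < pbest_f
    pbest[improved], pbest_f[improved] = pos[improved], f[improved]
    gbest = pbest[pbest_f.argmin()].copy()

print("final MSE:", fitness(gbest))
# For the first method, stop PSO early and hand gbest to backpropagation
# as the initial weight vector instead of a random initialization.
```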
You can have a look here to get some idea: https://www.researchgate.net/publication/220846526_Protein_feature_classification_using_particle_swarm_optimization_and_artificial_neural_networks
This is for classification.
To complement the previous answers, my advice is the following.
First, I think you need to define the problem well. What do you mean "enhance images"? What is the purpose of the neural network (NN) in this setting? What would you like PSO to do?
When you say "BP neural network", are you implying that backpropagation (gradient calculation) and gradient descent will be used alongside PSO? If yes, and the problem is deterministic and you are optimizing the weights, then you could follow a "Lamarckian evolution" approach, where the weights of the NNs *after* learning (gradient descent) are kept in the population. Alternatively, you could use a "memetic" approach, where global search is done with PSO and some local search is done with gradient descent.
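For illustration, here is a rough sketch of one such hybrid iteration: a standard PSO velocity/position update followed by a few gradient-descent steps per particle. The grad function is a placeholder for your network's backprop gradient; the constants are common PSO defaults, chosen here only for illustration:

```python
import numpy as np

def memetic_pso_step(pos, vel, pbest, gbest, grad, rng,
                     w=0.72, c1=1.49, c2=1.49, ls_steps=5, lr=0.01):
    """One hybrid iteration: a PSO global move, then gradient local search.

    `grad` stands for the backprop gradient of the loss over a flat
    weight vector; it is assumed to be supplied by the caller and is not
    tied to any particular NN library.
    """
    n, d = pos.shape
    r1, r2 = rng.random((n, d)), rng.random((n, d))
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    pos = pos + vel
    # Local search: a few gradient-descent steps per particle. Keeping the
    # refined weights in `pos` is the Lamarckian variant; scoring them but
    # keeping the unrefined weights would give the Baldwinian one.
    for i in range(n):
        for _ in range(ls_steps):
            pos[i] -= lr * grad(pos[i])
    return pos, vel
```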
If you want to optimize something other than a vector of continuous values of the NN (and/or its learning algorithm) such as the connectivity or the structure of the NN, then you need to augment standard PSO with other operators that are compatible with the representation you are using.
I hope these help. Again it all comes down to the specifics of the problem. If you could provide some more information I could perhaps offer some better guidance.
I will set aside what the application is ("image enhancement" or something else). BP neural networks, like many learning methods, have parameters: the number of hidden layers, the number of input neurons, the number of neurons in each hidden layer, the learning rate, etc. PSO, on the other hand, is an optimization algorithm, so we can use it to optimize the learning parameters of BP. I need to know what kind of images you want to enhance to give more details about your case.
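As a sketch of that idea, a PSO particle can encode the BP hyperparameters as a continuous vector that is decoded before each evaluation: integer parameters are rounded, and the learning rate is searched on a log scale. The ranges and the train_and_validate routine below are hypothetical placeholders you would replace with your own training code:

```python
import numpy as np

# Illustrative search ranges for three BP hyperparameters:
#   hidden neurons in [2, 64], log10(learning rate) in [-4, -1],
#   training epochs in [10, 200].
LOW  = np.array([2.0, -4.0, 10.0])
HIGH = np.array([64.0, -1.0, 200.0])

def decode(x):
    """Map a continuous PSO position to usable BP hyperparameters."""
    x = np.clip(x, LOW, HIGH)
    hidden = int(round(x[0]))     # integer: neurons in the hidden layer
    lr = 10.0 ** x[1]             # learning rate searched on a log scale
    epochs = int(round(x[2]))
    return hidden, lr, epochs

def fitness(x, train_and_validate):
    """`train_and_validate` is your own routine: it trains a BP network
    with the given hyperparameters and returns the validation error.
    It is a placeholder here, not a specific library call."""
    return train_and_validate(*decode(x))

# The PSO loop itself is the standard one (see the sketch in the first
# answer); only the fitness function and this decoding step change.
```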
Hello Joseph. Thanks for the feedback. Actually, I described two hybrid systems above. I apologize if there was any confusion. These systems are:
1) The "Lamarckian" is the one where evolution selects an individual, then the individual learns during its lifetime by modifying its weights, and this acquired knowledge is kept in the next generation.
2) The "Memetic" system might be the one you are referring to, where evolution selects an individual, then the individual learns during its lifetime, but this acquired knowledge is not kept in the next generation. Using such a system one might observe the Baldwin effect as you noted.