The following is a possible solution (please read the attached file; pasting the table here changes its formatting). In the tables I give a small comparison between GA, PSO, and simulated annealing.
Table 1: comparative study of different optimization methods (variable form)

N° | Method                      | Variable form              | Variable name | Variable set  | Variable set size
1  | Genetic Algorithms          | vector (or other encoding) | Chromosome    | Family        | Nc
2  | Particle Swarm Optimization | vector (or other encoding) | Particle      | Swarm         | Nc
3  | Simulated Annealing         | vector (or other encoding) | Neighbor      | Single entity | 1
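As a rough illustration of the table above, here is a minimal sketch of how each method might represent its candidate solutions; the problem dimension, Nc = 20, and the perturbation scale are illustrative assumptions, not values taken from the attached file:

```python
import numpy as np

rng = np.random.default_rng(0)
dim = 8    # problem dimension (assumed for illustration)
Nc  = 20   # population / swarm size (assumed for illustration)

# GA: a family of Nc binary-coded chromosomes
chromosomes = rng.integers(0, 2, size=(Nc, dim))

# PSO: a swarm of Nc particles, each carrying a position and a velocity
positions  = rng.uniform(-1.0, 1.0, size=(Nc, dim))
velocities = np.zeros((Nc, dim))

# Simulated annealing: a single current solution; a "neighbor" is a
# small random perturbation of it (variable set size = 1)
current  = rng.uniform(-1.0, 1.0, size=dim)
neighbor = current + rng.normal(scale=0.1, size=dim)
```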
Table 2: comparative study of different optimization methods (implementation type)
In the attached file, you will find a general performance comparison of LMS, PSO, and GA in terms of computational complexity, the factors affecting their convergence rates, and optimization efficiency.
In terms of computational complexity, GA and PSO are more complex than LMS. However, unlike PSO, both GA and LMS require the selection of appropriate values for the step size and control parameters in order to converge at an optimal rate.
In terms of search efficiency, GA and PSO, being global optimization techniques, are able to locate the global minimum of a multimodal error surface. LMS, on the other hand, being a local optimization technique, cannot do so, since it can only locate a local minimum.
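As a rough sketch of why the step size matters for LMS, one LMS weight update is shown below; the function name, signal variables, and the default mu = 0.01 are illustrative assumptions:

```python
import numpy as np

def lms_update(w, x, d, mu=0.01):
    # One LMS iteration: y is the filter output, e the instantaneous error.
    # The step size mu governs the convergence rate discussed above.
    y = np.dot(w, x)
    e = d - y
    return w + mu * e * x   # stochastic-gradient weight update
```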
Of the two global optimization techniques, GA involves more steps than PSO, which increases the processing time GA requires to search for the global minimum.
In addition, in GA the weight coefficients are kept in a binary-coded string format, referred to as chromosomes; in every generation these chromosomes go through crossover and mutation before being updated. In PSO, by contrast, each particle's position and velocity are updated at every iteration to search for the minimum cost and the corresponding best solution.
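To make the contrast concrete, below is a minimal sketch (not the implementation from the attached file) of one PSO update and one GA reproduction step; the inertia weight w, the acceleration coefficients c1 and c2, and the mutation probability p_mut are assumed values:

```python
import numpy as np

rng = np.random.default_rng(0)

def pso_step(pos, vel, pbest, gbest, w=0.7, c1=1.5, c2=1.5):
    # Each particle's velocity and position are updated every iteration,
    # pulled toward its personal best (pbest) and the swarm best (gbest).
    r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
    vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (gbest - pos)
    return pos + vel, vel

def ga_step(parent_a, parent_b, p_mut=0.01):
    # Binary-coded chromosomes undergo single-point crossover, then
    # bit-flip mutation, before replacing members of the population.
    cut = rng.integers(1, len(parent_a))
    child = np.concatenate([parent_a[:cut], parent_b[cut:]])
    flips = rng.random(len(child)) < p_mut
    return np.where(flips, 1 - child, child)
```

Even from this sketch, GA's extra selection, crossover, and mutation stages, compared with PSO's single vector update per particle, hint at the difference in processing time noted above.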