An algorithm's parameters influence its performance, so I would like to ask about the best way of setting the parameters of a metaheuristic such as a genetic algorithm (crossover rate, mutation rate, ...).
I think you should start with the default parameters, because the default parameters have usually been set based on a large number of tests and scenarios.
After running with the default parameters, you can study the convergence time or speed. If you need faster convergence, change the parameter values one at a time and check whether each change improves result quality, convergence time, and so on; in this way you can gradually settle on your final parameter values.
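The one-at-a-time procedure described above can be sketched in code. Everything here is an illustrative assumption: a toy genetic algorithm on the OneMax problem (maximize the number of 1-bits), with arbitrary population size and parameter values, not a reference implementation.

```python
import random

def run_ga(mutation_rate, crossover_rate, generations=50,
           pop_size=20, n_bits=20, seed=0):
    """Tiny generational GA maximizing OneMax (the number of 1-bits)."""
    rng = random.Random(seed)
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    fitness = lambda ind: sum(ind)
    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            # Binary tournament selection for each parent.
            p1 = max(rng.sample(pop, 2), key=fitness)
            p2 = max(rng.sample(pop, 2), key=fitness)
            child = p1[:]
            if rng.random() < crossover_rate:  # one-point crossover
                cut = rng.randrange(1, n_bits)
                child = p1[:cut] + p2[cut:]
            # Bit-flip mutation.
            child = [1 - b if rng.random() < mutation_rate else b for b in child]
            new_pop.append(child)
        pop = new_pop
    return max(map(fitness, pop))

# One-factor-at-a-time study: hold the crossover rate fixed and vary
# only the mutation rate, comparing the best fitness each run reaches.
for m in (0.01, 0.05, 0.20):
    print(f"mutation_rate={m}: best fitness = {run_ga(m, crossover_rate=0.9)}")
```

Repeating each setting with several seeds and averaging would make the comparison more reliable, since a single run of a stochastic algorithm can be misleading.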
Your question is an important and interesting one. If we take a hybrid genetic algorithm as an example, one way of setting the crossover percentage, mutation percentage, and number of iterations is a trial-and-error procedure. For more details I recommend an important book on this issue:
K. De Jong, Evolutionary Computation: A Unified Approach, MIT Press, 2006.
I agree with the advice from Eric Deussom and M. Awad, and suggest further reading in case you need to optimize convergence time. You can refer to the literature on parameter tuning if you need to optimize parameters before running the algorithm, or on parameter control if you want to adjust parameters online, that is, during the run. Here are two interesting papers on these topics:
1) Eiben, A.E., Smit, S.K. (2011). Parameter tuning for configuring and analyzing evolutionary algorithms. Swarm and Evolutionary Computation, 1(1), 19-31.
2) Eiben, A.E., Hinterding, R., Michalewicz, Z. (1999). Parameter control in evolutionary algorithms. IEEE Transactions on Evolutionary Computation, 3(2), 124-141.
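To make the tuning/control distinction concrete, here is a minimal sketch of deterministic parameter control: instead of fixing the mutation rate before the run (tuning), the rate is changed on a schedule as generations pass. The start and end values are arbitrary illustrative choices.

```python
# Parameter *tuning* fixes values before the run; parameter *control*
# adjusts them while the run is in progress. A simple deterministic
# control scheme anneals the mutation rate over the generations.

def mutation_rate(gen, max_gen, p_start=0.20, p_end=0.01):
    """Linearly decrease the mutation rate from p_start to p_end."""
    return p_start + (p_end - p_start) * (gen / max_gen)

for gen in (0, 25, 50, 75, 100):
    print(f"generation {gen:3d}: mutation rate = {mutation_rate(gen, 100):.3f}")
```

The intuition behind such schedules is to explore broadly early in the run and refine solutions near the end; adaptive and self-adaptive control schemes, surveyed in the Eiben et al. papers, go further by letting the search feedback itself drive the adjustment.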
We have addressed similar problems in these research papers:
A self-tuning modified firefly algorithm to solve univariate nonlinear equations with complex roots (http://ieeexplore.ieee.org/abstract/document/7743964/?reload=true)
Abstract:
Existing numerical methods to solve univariate nonlinear equations sometimes fail to return the required results. We propose a modified firefly algorithm [MOD FA] with a self-tuning ability to solve a given univariate nonlinear equation. Our modification is capable of finding almost all real as well as complex roots of a nonlinear equation within a reasonable interval/range. The modification includes an archive to collect the best fireflies and a flag to mark poorly performing iterations. It is also capable of tuning the algorithm-specific parameters while finding the optimum solutions. The self-tuning concept allows the users of our application to use it without any prior knowledge of the algorithm. We validate our approach on examples of some special univariate nonlinear equations with real as well as complex roots. We have also conducted a statistical test: the Wilcoxon signed-rank test. By conducting a comparison with the genetic algorithm and differential evolution with the same modifications [MOD GA] [MOD DE] and with the original firefly algorithm [FA], we confirm the efficiency and the accuracy of our approach.
A self-tuning Firefly algorithm to tune the parameters of Ant Colony System (ACSFA)
Abstract:
Ant colony system (ACS) is a promising approach which has been widely used in problems such as the Travelling Salesman Problem (TSP), the Job-Shop Scheduling Problem (JSP) and the Quadratic Assignment Problem (QAP). In its original implementation, the parameters of the algorithm were selected by a trial-and-error approach. Over the last few years, novel approaches have been proposed for adapting the parameters of ACS to improve its performance. The aim of this paper is to use a framework introduced for self-tuning optimization algorithms, combined with the firefly algorithm (FA), to tune the parameters of the ACS solving symmetric TSP problems. The FA optimizes the problem-specific parameters of ACS, while the parameters of the FA are tuned by the framework itself. With this approach, the user has to work with neither the parameters of ACS nor the parameters of FA. Using common symmetric TSP problems, we demonstrate that the framework fits the ACS well. A detailed statistical analysis further verifies the superiority of the new ACS over the existing ACS and over the other techniques used to tune the parameters of ACS.
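The general self-tuning idea behind these papers, letting the algorithm adjust its own parameters while it runs, can be illustrated in a simple form. The sketch below is not the framework from the papers: it is a generic (1+1)-style self-adaptation example on a toy sphere function, where each candidate carries its own mutation step size, so the parameter is tuned as a by-product of the search.

```python
import math
import random

def sphere(x):
    """Toy objective: sum of squares, minimized at the origin."""
    return sum(v * v for v in x)

def self_adaptive_search(dim=5, iters=500, seed=1):
    """(1+1)-style search whose mutation step size evolves with the solution."""
    rng = random.Random(seed)
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    sigma = 1.0                     # strategy parameter, adapted during the run
    best = sphere(x)
    for _ in range(iters):
        # Mutate the step size log-normally, then use it to mutate the solution.
        new_sigma = sigma * math.exp(rng.gauss(0, 0.2))
        cand = [v + rng.gauss(0, new_sigma) for v in x]
        f = sphere(cand)
        if f < best:                # keep both the solution and its step size
            x, sigma, best = cand, new_sigma, f
    return best

print(f"best sphere value found: {self_adaptive_search():.6f}")
```

Because step sizes that produce improvements survive along with the solutions they produced, the search effectively learns a good step size for the current region of the landscape, which is the appeal of self-tuning for users with no prior knowledge of the algorithm's parameters.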
The F-Race method is based on a statistical approach for selecting the best configuration from a set of candidate configurations under stochastic evaluations. There is also a family of extensions to this method, called iterated F-Race, which is used not just for genetic algorithms but for many evolutionary algorithms (e.g. Incremental Ant Colony Optimization with Local Search). I'm attaching an overview chapter about these methods.
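As a rough illustration of the racing idea (not F-Race itself, which eliminates candidates with the Friedman test rather than the ad-hoc mean threshold used here), the sketch below evaluates made-up candidate configurations instance by instance and drops any whose running mean cost clearly trails the best.

```python
import random
import statistics

def evaluate(config, instance_seed):
    """Stand-in for one stochastic run: hidden true cost plus noise."""
    true_cost = {"A": 1.0, "B": 1.2, "C": 2.0}[config]   # made-up candidates
    rng = random.Random(instance_seed * 100 + ord(config))
    return true_cost + rng.gauss(0, 0.1)

def race(candidates, n_instances=30, margin=0.3):
    """Evaluate candidates instance by instance, dropping clear losers early."""
    results = {c: [] for c in candidates}
    alive = set(candidates)
    for seed in range(n_instances):
        for c in alive:
            results[c].append(evaluate(c, seed))
        if seed >= 4:   # wait a few instances before eliminating anyone
            means = {c: statistics.mean(results[c]) for c in alive}
            best = min(means.values())
            alive = {c for c in alive if means[c] <= best + margin}
    return min(alive, key=lambda c: statistics.mean(results[c]))

print("winner:", race(["A", "B", "C"]))
```

The benefit over exhaustively evaluating every configuration on every instance is that evaluation effort concentrates on the promising candidates, which matters when each run of the metaheuristic is expensive.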
There are also many other methods for achieving this, so I'm also attaching a paper that describes various methods for analysing and optimizing evolutionary algorithms.