I want to know the best way to determine the basic parameters of a GA, such as the crossover probability, mutation probability, and population size. Are there best values for these parameters?
The parameters of evolutionary algorithms, including GAs, depend on the specific problem. So, in the general case, the best way to identify suitable values is to do a sensitivity analysis: carry out multiple runs of the algorithm with different probabilities (e.g. 0.1, 0.2) and different population sizes, and compare the outcomes.
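As an illustration, here is a minimal sketch of such a sensitivity analysis in Python, assuming a simple bit-string GA on the OneMax toy problem; the run_ga helper, the parameter grid, and the number of repeats per setting are all illustrative choices, not prescriptions:

```python
import random
import statistics

def run_ga(pop_size, p_cross, p_mut, n_bits=50, generations=100, seed=None):
    """One GA run on OneMax (maximise the number of 1-bits); returns best fitness found."""
    rng = random.Random(seed)
    fitness = sum  # OneMax: fitness is simply the number of 1-bits
    pop = [[rng.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    best = max(map(fitness, pop))
    for _ in range(generations):
        def select():  # binary tournament selection
            a, b = rng.sample(pop, 2)
            return a if fitness(a) >= fitness(b) else b
        children = []
        while len(children) < pop_size:
            p1, p2 = select(), select()
            if rng.random() < p_cross:  # one-point crossover
                cut = rng.randrange(1, n_bits)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            else:
                c1, c2 = p1[:], p2[:]
            for c in (c1, c2):  # bit-flip mutation
                children.append([1 - g if rng.random() < p_mut else g for g in c])
        pop = children[:pop_size]
        best = max(best, max(map(fitness, pop)))
    return best

# Sensitivity analysis: repeat each parameter setting and compare the outcomes.
for pop_size in (20, 50, 100):
    for p_cross in (0.6, 0.8, 1.0):
        for p_mut in (0.01, 0.05, 0.1):
            results = [run_ga(pop_size, p_cross, p_mut, seed=s) for s in range(10)]
            print(f"pop={pop_size:3d} pc={p_cross:.1f} pm={p_mut:.2f} "
                  f"mean best={statistics.mean(results):.1f} "
                  f"sd={statistics.stdev(results):.1f}")
```

Repeated runs per setting matter because a GA is stochastic: a single run per parameter combination tells you very little.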
For tuning parameters, see this: http://www.sciencedirect.com/science/article/pii/S2210650211000022
The parameters also depend on the way you encode the chromosomes. One would tend to use smaller crossover/mutation rates for binary-coded chromosomes than for real-coded ones in a GA.
Also, as a rule of thumb, a smaller population size is believed to give you quicker convergence, but the algorithm might more easily get trapped in local optima. The reverse applies to a large population size.
Having said that, if your problem is a benchmark problem already tested by other researchers, you might be able to start from parameter values commonly suggested in the existing literature. Generally you would not want very high crossover and mutation probabilities, and a medium population size is a reasonable starting point.
You can also look at self-adaptive mechanisms. A review can be found in the book "Introduction to Evolutionary Computing" by A. E. Eiben and J. E. Smith.
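To illustrate the idea, here is a minimal sketch of one classic self-adaptive mechanism of the kind discussed by Eiben and Smith: each individual carries its own mutation rate, which is perturbed log-normally before being applied. The constant TAU and the clamping bounds below are illustrative assumptions:

```python
import math
import random

rng = random.Random(42)
TAU = 0.2  # learning rate of the log-normal self-adaptation step (illustrative value)

def self_adaptive_mutate(bits, rate):
    """Perturb the individual's own mutation rate, then apply it to the bit-string."""
    new_rate = rate * math.exp(TAU * rng.gauss(0, 1))    # log-normal update keeps it positive
    new_rate = min(0.5, max(1.0 / len(bits), new_rate))  # clamp to a sensible range
    new_bits = [1 - b if rng.random() < new_rate else b for b in bits]
    return new_bits, new_rate

# Each individual is a (bit-string, mutation-rate) pair, so the rate evolves too.
parent = ([rng.randint(0, 1) for _ in range(20)], 0.05)
child = self_adaptive_mutate(*parent)
print(child)
```

The appeal of this scheme is that the mutation rate is subject to selection along with the solution itself, so you no longer have to tune it by hand.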
Values of the crossover and mutation probabilities depend on the problem concerned. For complicated search spaces a higher crossover probability (> 0.5) will help the search at the beginning. However, as the search progresses it should be reduced to a value near 0.1 or 0.2. Mutation probabilities should normally be kept very low (0.01-0.1); otherwise convergence may be delayed unnecessarily.
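A small sketch of the schedule described above: the crossover probability starts high and decays towards 0.1-0.2 as the generations progress. The linear interpolation and the p_start/p_end values are just one simple choice:

```python
def crossover_probability(gen, max_gen, p_start=0.9, p_end=0.15):
    """Linearly decay the crossover probability from p_start down to p_end."""
    frac = gen / max(1, max_gen - 1)
    return p_start + (p_end - p_start) * frac

# Over 100 generations the rate falls from 0.9 towards 0.15.
for gen in (0, 25, 50, 99):
    print(gen, round(crossover_probability(gen, 100), 3))
```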
"Engineered GA" is a cutting-edge optimisation method in which the GA parameter rates (such as crossover and mutation) are self-optimised with simulated annealing. These rates can also be adjusted as the number of generations progresses (see "Genetic Algorithms with Shrinking Population Size", Computational Statistics 25 (2010), 691-705).
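As a rough illustration of the shrinking-population idea (a sketch only, not the cited paper's actual scheme), one can reduce the population size on a schedule and keep the fittest individuals at each reduction:

```python
import random

rng = random.Random(0)

def target_size(gen, max_gen, n_start=200, n_end=30):
    """Population-size schedule: shrink geometrically from n_start to n_end."""
    frac = gen / max(1, max_gen - 1)
    return max(n_end, round(n_start * (n_end / n_start) ** frac))

def shrink_population(pop, fitness, new_size):
    """When the schedule reduces the size, keep only the fittest individuals."""
    return sorted(pop, key=fitness, reverse=True)[:new_size]

# Example: a random bit-string population shrinks over 100 generations.
pop = [[rng.randint(0, 1) for _ in range(20)] for _ in range(200)]
for gen in (0, 50, 99):
    pop = shrink_population(pop, sum, target_size(gen, 100))
    print(gen, len(pop))
```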
I agree with the previous answers, in particular for what concerns the crossover and mutation rates. Sensitivity analysis is always a good way of determining those parameters, and there are also many self-tuning techniques for adjusting them on the go. I am not a fan of meta-optimisation, i.e. the use of another optimiser (e.g. SA) for optimising the parameters of the employed one (the GA in this case), because then you could argue that you have the problem of tuning the parameters of SA, which in many cases results in an endless loop.

What I would like to add regarding the population size is that, on top of the classic considerations (a small population converges faster, a large one handles noisy fitness functions better, etc.), we also need to take into account that a too-large population size will affect the ability of your GA to explore the search space evenly. If you have a large decision space and you know you need to explore it as much as possible (e.g. because the problem is highly multimodal), then use a fairly small population size. The recent study "Structural bias in population-based algorithms" has shown that increasing the population size also increases the structural bias of the GA. This means that the search will be biased towards specific regions of the search space while others will be ignored, which is bad when exploration is needed. Hope this helps.