I believe this problem is quite universal across heuristics and metaheuristics. In general, the problem domain should guide the choice, and that choice should be well documented in the literature, so that any results can be easily compared against the state of the art.
Another method is to use a relative error. This metric measures the distance to a known minimum or global optimum. It helps a lot with problems whose objective values vary widely between instances; the TSP is one example.
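As a minimal sketch of that metric (the tour lengths below are made up for illustration), relative error expresses how far a found solution is above the known optimum, which makes instances of very different scales comparable:

```python
def relative_error(found, optimum):
    """Relative gap between a solution's objective value and the
    known optimum: 0.0 means the optimum was reached, 0.05 means
    the solution is 5% worse than the optimum."""
    return abs(found - optimum) / abs(optimum)

# Two hypothetical TSP instances with very different tour-length
# scales become directly comparable once expressed as relative error.
print(relative_error(105_000, 100_000))  # large instance
print(relative_error(105, 100))          # small instance, same 5% gap
```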
There is no standard recipe for finding a fitness function for the problem you are trying to optimize; you have to design a function that suits your problem. The best approach is to choose a fitness function by analyzing your problem, then run the GA and keep iterating.
There is no such thing. You use genetic algorithms because you lack a closed-form fitness function, or because the function's derivatives are hard to obtain, or because you have a group of such functions for which an optimum is requested and analytical methods, such as Newton-Raphson, cannot be applied.
On the contrary, you may be interested in trying a new strategy to speed up the search for the optimum with genetic algorithms, and for that reason you may need to define a fitness function to evaluate your strategy, or a specific improvement, against other methods. Some authors have used such functions as a reference or baseline to measure the improvements their proposals deliver.
If you really are interested in deriving a fitness function from a series of genetic algorithm results, you could select the data with promising results and fit them to a predefined function. To do this you may need a computer experiment design, using optimal designs, to improve your chances.
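One way to sketch that fitting step, assuming the predefined function is a simple polynomial and using made-up sample points in place of real GA results, is an ordinary least-squares fit:

```python
import numpy as np

# Hypothetical "promising" candidate solutions and their observed
# objective values, generated here from a known quadratic so the
# fit can be checked. In practice these would come from GA runs.
rng = np.random.default_rng(0)
X = rng.uniform(-2, 2, size=(40, 2))
y = 3.0 + 1.5 * X[:, 0] - 2.0 * X[:, 1] + 0.5 * X[:, 0] ** 2

# Design matrix for the predefined model a + b*x1 + c*x2 + d*x1^2,
# fitted by least squares.
A = np.column_stack([np.ones(len(X)), X[:, 0], X[:, 1], X[:, 0] ** 2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)
print(coef)  # approximately [3.0, 1.5, -2.0, 0.5]
```

With real GA data the responses would be noisy, and a proper experimental design over the sample points improves the quality of the fitted coefficients.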
You should remember that genetic algorithms are heuristic search strategies, and that they are open to experimental improvement through your own expertise in the field where you want to apply them.
You generally go in the other direction. You have a problem for which you have an objective function, not necessarily in closed form, let alone differentiable. Once you have the objective function, you apply GA, PSO, DE, or any other metaheuristic to optimize it.
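That direction can be sketched in a few lines: the objective function is given, and the GA is just a generic minimizer applied to it. The operators and parameter values below (tournament selection, blend crossover, Gaussian mutation, population of 50) are illustrative choices, not a recommended configuration:

```python
import random

def minimize_ga(objective, bounds, pop_size=50, generations=100,
                mutation_rate=0.1, seed=0):
    """Minimal real-coded GA sketch: tournament selection, blend
    crossover, Gaussian mutation, with simple elitism."""
    rng = random.Random(seed)

    pop = [[rng.uniform(lo, hi) for lo, hi in bounds]
           for _ in range(pop_size)]
    best = min(pop, key=objective)

    for _ in range(generations):
        new_pop = []
        for _ in range(pop_size):
            # Tournament selection of two parents.
            p1 = min(rng.sample(pop, 3), key=objective)
            p2 = min(rng.sample(pop, 3), key=objective)
            # Blend crossover between the parents.
            child = [a + rng.random() * (b - a) for a, b in zip(p1, p2)]
            # Gaussian mutation, clipped to the bounds.
            for i, (lo, hi) in enumerate(bounds):
                if rng.random() < mutation_rate:
                    child[i] += rng.gauss(0, 0.1 * (hi - lo))
                child[i] = min(max(child[i], lo), hi)
            new_pop.append(child)
        pop = new_pop
        best = min(pop + [best], key=objective)
    return best

# Usage: minimize the sphere function x1^2 + x2^2, minimum 0 at (0, 0).
sphere = lambda x: sum(v * v for v in x)
best = minimize_ga(sphere, bounds=[(-5, 5), (-5, 5)])
print(best, sphere(best))
```

Note that the GA only ever calls `objective` as a black box; nothing about its form, continuity, or derivatives is assumed.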
If you do not have a fitness function, then try a polynomial combination of the involved variables. Take a look at the book Practical Genetic Algorithms, by Haupt and Haupt; they devote a section to this.
If you are just starting with GA and want to try several fitness functions, take a look at:
I faced the same problem. The solution is to start from the desired output or outputs, and then formulate and modify a function until the fitness function is appropriate for the issue.
The Genetic Algorithm solver assumes the fitness function will take one input x, where x is a row vector with as many elements as the number of variables in the problem. For example:

```matlab
function y = simple_fitness(x)
    y = 100 * (x(1)^2 - x(2))^2 + (1 - x(1))^2;
end
```