Hello everyone,

When I run a metaheuristic, why is the run time for, say, 1000 iterations not roughly double the run time of the same algorithm for 500 iterations? I sometimes get triple the CPU time! Is there a logical explanation for this?
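One common cause is per-iteration work that grows with the search history, e.g. an archive, tabu list, or elite pool that each new candidate is compared against. A minimal sketch (a hypothetical toy loop, not any specific solver) that times fixed-size chunks of iterations to check whether per-iteration cost is constant or growing:

```python
import time
import random

def run_toy_search(iterations, chunk=100):
    """Toy search loop whose archive grows over time (a hypothetical
    stand-in for any metaheuristic with a growing data structure)."""
    archive = []
    best = float("inf")
    chunk_times = []
    t0 = time.perf_counter()
    for i in range(1, iterations + 1):
        candidate = random.random()
        # Scanning the archive costs more as it grows, so later
        # iterations are slower than earlier ones.
        if all(abs(candidate - a) > 1e-6 for a in archive):
            archive.append(candidate)
        best = min(best, candidate)
        if i % chunk == 0:
            chunk_times.append(time.perf_counter() - t0)
            t0 = time.perf_counter()
    return chunk_times

times = run_toy_search(1000)
# If the per-chunk times trend upward, total run time grows
# superlinearly with the iteration count, which would explain
# 1000 iterations taking more than twice as long as 500.
```

If the chunk times stay flat, the superlinear scaling more likely comes from outside the loop: memory growth triggering garbage collection or cache misses, logging, or simply noisy wall-clock measurements on a loaded machine.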
