Suppose I am optimizing the ZDT-1 two-objective test function. I want to stop the algorithm when there is no significant improvement in the Pareto front. How can I achieve this?
You can stop when the approximation of the utility function you use (for example, the Tchebyshev distance from the ideal solution to the Pareto front) has not improved after a previously established (reasonable) number of iterations has elapsed.
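As a rough illustration of that rule, here is a minimal Python sketch; the function names, weights, tolerance and patience values are my own assumptions, not part of the answer above. It records the best weighted Tchebyshev distance to the ideal point at each iteration and stops once that value has not improved for a fixed number of iterations.

```python
import numpy as np

def tchebyshev_to_ideal(objs, ideal, weights):
    """Smallest weighted Tchebyshev distance from any population member to the ideal point."""
    return np.min(np.max(weights * np.abs(objs - ideal), axis=1))

def should_stop(history, tol=1e-4, patience=20):
    """Stop when the indicator has not improved by more than `tol` for `patience` iterations."""
    if len(history) <= patience:
        return False
    return history[-patience - 1] - min(history[-patience:]) < tol

# Hypothetical usage inside a generational loop:
# ideal = objs.min(axis=0)           # or a fixed reference, e.g. (0, 0) for ZDT-1
# history.append(tchebyshev_to_ideal(objs, ideal, weights=np.ones(2)))
# if should_stop(history): break
```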
Assuming that you can write or modify the source code of the program you are using:
First, test and tune your algorithm, and adapt its parameters, on a small instance for which the optimal Pareto front can be found deterministically, for example by enumeration (probably exponential) if the decision space is discrete. For larger problems, you can then terminate the run much as is normally done for a conventional single-objective optimization algorithm, for example when there is no improvement (pointwise or on average) in the objective value, with the necessary adaptations since all objective functions must now be considered simultaneously.
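One possible reading of "no improvement, adapted to several objectives" is to watch the running ideal point (the best value reached so far in each objective) and stop when none of its components has improved for a while. A minimal sketch, with made-up names and thresholds:

```python
import numpy as np

def ideal_point_stalled(objs_per_gen, tol=1e-4, patience=25):
    """objs_per_gen: list of (pop_size, n_obj) arrays, one per generation (minimisation).
    True when the running ideal point (per-objective minimum) has not improved
    by more than `tol` in any objective over the last `patience` generations."""
    if len(objs_per_gen) <= patience:
        return False
    ideal_old = np.min(np.vstack(objs_per_gen[:-patience]), axis=0)
    ideal_new = np.min(np.vstack(objs_per_gen), axis=0)
    return np.all(ideal_old - ideal_new < tol)
```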
It depends on the problem you are solving (is it linear? integer?) and the method you are using. For linear problems, using a "weighting" method, i.e., a linear combination of the objectives, you can find all solutions easily. If the problem is integer, there can be solutions that the weighting method does not find, and you may have to use a "constraint" method, in which you transform all objectives but one into constraints. For example, if you are maximizing Z_1 and Z_2, you can add the constraint Z_2 >= A and move A over its whole range of possible values.
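To make the "constraint" idea concrete, here is a small self-contained sketch of that sweep on a toy discrete bi-objective problem (the item values are invented purely for illustration): for each level A it maximises Z_1 subject to Z_2 >= A, here by brute-force enumeration since the example is tiny.

```python
from itertools import product

# Hypothetical bi-objective 0/1 selection problem: item i contributes (z1, z2); maximise both.
items = [(4, 1), (3, 5), (2, 2), (5, 3)]

def epsilon_constraint_front(items):
    """Maximise Z1 subject to Z2 >= A, sweeping A over its attainable range."""
    solutions = []
    for x in product([0, 1], repeat=len(items)):
        z1 = sum(v1 for xi, (v1, _) in zip(x, items) if xi)
        z2 = sum(v2 for xi, (_, v2) in zip(x, items) if xi)
        solutions.append((z1, z2))
    front = set()
    for A in sorted({z2 for _, z2 in solutions}):
        feasible = [s for s in solutions if s[1] >= A]
        front.add(max(feasible))  # tuple comparison: best Z1, ties broken by Z2
    return sorted(front)

print(epsilon_constraint_front(items))
```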
If you are using a scalarisation/decomposition based method, the answer is straightforward: you can stop when there is no sufficient decrease in the scalarised objective function.
If you are not using a scalarisation/decomposition method, you will have to either use some kind of convergence metric (generational distance, hypervolume, averaged Hausdorff distance, etc.) to make the same decision as above, or devise your own criterion (for example, the Directed Search Method of Oliver Schütze uses the condition number of the Jacobian of the objectives).
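For the two-objective case (such as ZDT-1) the hypervolume route is easy to sketch. The reference point, tolerance and variable names below are assumptions on my part, not prescriptions: the idea is simply to track the hypervolume of the current non-dominated set each generation and stop once it stops growing.

```python
import numpy as np

def hypervolume_2d(front, ref):
    """Hypervolume of a 2-objective (minimisation) front w.r.t. reference point `ref`."""
    pts = front[np.all(front < ref, axis=1)]      # ignore points beyond the reference point
    pts = pts[np.argsort(pts[:, 0])]              # sort by f1; f2 decreases along the front
    hv, prev_f2 = 0.0, ref[1]
    for f1, f2 in pts:
        if f2 < prev_f2:                          # skip dominated points
            hv += (ref[0] - f1) * (prev_f2 - f2)
            prev_f2 = f2
    return hv

def hv_converged(hv_history, rel_tol=1e-4, patience=30):
    """True when the hypervolume gained over the last `patience` generations is negligible."""
    if len(hv_history) <= patience:
        return False
    old, new = hv_history[-patience - 1], hv_history[-1]
    return (new - old) <= rel_tol * max(abs(new), 1e-12)

# Hypothetical usage: ZDT-1's Pareto-optimal front lies in [0, 1] x [0, 1],
# so a reference point slightly beyond it, e.g. (1.1, 1.1), is a common choice.
# hv_history.append(hypervolume_2d(nondominated_objs, ref=np.array([1.1, 1.1])))
# if hv_converged(hv_history): break
```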