Optimization problems can be NP-hard, yet optimization algorithms can still be applied to them. The purpose of this question is to find out which algorithm is best, taking computational complexity into account.
There is very little solid literature on the convergence of these methods. There is no criterion that I know of which could be used as the basis for a termination rule - and that is a very big flaw. In mathematical programming we can at least utilize lower bounds in tandem with feasible solutions in order to reduce the number of candidate solutions.
As metaheuristics have no systematic strategy with anything in common with an optimality criterion - such as the utilization of upper and lower bounds on the objective value - we cannot conclude that there is a termination criterion for metaheuristics, except complete enumeration. And that is a no-no.
Stochastic optimization algorithms are very problem dependent. For example, PSO is quite difficult to apply to integer problems. Computational complexity could be proposed as a measure, but it will not tell you how much better one algorithm is than another. It depends on what you would like to achieve. A good approach you can use is to implement these algorithms and simply measure the time needed to obtain a solution of acceptable accuracy.
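The measurement suggested above can be sketched in a few lines. Everything below (the `sphere` test function, the search box, the tolerance, the two toy stochastic searches) is my own illustrative choice, not anything prescribed in this thread - the point is only the pattern: time each algorithm until it reaches an acceptable objective value or exhausts its budget.

```python
import random
import time

def sphere(x):
    """Toy objective: global minimum 0 at the origin."""
    return sum(xi * xi for xi in x)

def random_search(f, dim, tol, max_evals, rng):
    """Pure random search: sample uniformly until f(x) < tol or budget runs out."""
    best = float("inf")
    evals = 0
    for evals in range(1, max_evals + 1):
        x = [rng.uniform(-5, 5) for _ in range(dim)]
        best = min(best, f(x))
        if best < tol:
            break
    return best, evals

def one_plus_one_es(f, dim, tol, max_evals, rng):
    """(1+1) evolution strategy with a crude 1/5th-rule step-size adaptation."""
    x = [rng.uniform(-5, 5) for _ in range(dim)]
    fx, sigma = f(x), 1.0
    evals = 1
    for evals in range(2, max_evals + 1):
        y = [xi + rng.gauss(0, sigma) for xi in x]
        fy = f(y)
        if fy < fx:                       # accept only improvements
            x, fx, sigma = y, fy, sigma * 1.5
        else:
            sigma *= 0.9                  # shrink the step on failure
        if fx < tol:
            break
    return fx, evals

for name, algo in [("random search", random_search), ("(1+1)-ES", one_plus_one_es)]:
    t0 = time.perf_counter()
    best, evals = algo(sphere, 2, 1e-3, 100_000, random.Random(0))
    elapsed = time.perf_counter() - t0
    print(f"{name}: best={best:.2e} after {evals} evaluations in {elapsed:.3f}s")
```

Of course, as several posters note below, such a comparison is purely empirical: it says nothing about optimality, only about which method reached the chosen tolerance faster on this instance.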
The answer is that no metaheuristic can guarantee that you will identify an optimal solution - as no metaheuristic is actually designed to be able to find one.
Hence, you will not be able to reach an answer to your question.
But one suggestion could be to run one - or a few - such heuristics, and with the best outcome proceed to run a mathematical optimisation method that does converge to an optimum solution. That might be a contribution, as an "advanced start" can help bring down the total CPU time, especially in the case of a branch-and-bound code.
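A minimal sketch of this "advanced start" idea, assuming SciPy is available (the Rosenbrock objective, the sample count, and the search box are my own illustrative choices): a crude random search supplies the starting point, and BFGS - a method with an actual convergence theory - polishes it to a stationary point.

```python
import numpy as np
from scipy.optimize import minimize, rosen  # rosen = Rosenbrock test function

rng = np.random.default_rng(0)

# Step 1: a crude heuristic (pure random search) gives an "advanced start".
samples = rng.uniform(-2, 2, size=(2000, 2))
best = min(samples, key=rosen)

# Step 2: a convergent local method polishes the heuristic's best point.
res = minimize(rosen, best, method="BFGS")
print("heuristic best :", rosen(best))
print("after BFGS     :", res.fun, "at", res.x)
```

For a mixed-integer model the same pattern applies: feed the heuristic's solution to a branch-and-bound solver as an incumbent, which gives the solver an immediate upper bound to prune with.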
The best way to compare algorithms is to know their time and space requirements through computational complexity theory. But this is very difficult with stochastic algorithms.
Is PSO the best algorithm among PSO, GA, DE, VNS, and EDA?
Well there you go, then - one algorithm is probably not much better than any other one, if you run a large variety of sizes and coefficients. And that's not surprising at all.
"Computational complexity" is not a method; and you cannot fruitfully analyse a method that does not converge to an optimum. I hope I will never have to engage in work based on metaheuristics; there's little theory (which I am fond of), and you don't know what you get - ever!
Metaheuristic algorithms do not guarantee finding the optimal solution, because of the randomness in their initialization and the many coefficients that enter into each algorithm. The best way to compare these algorithms is by the speed of obtaining a solution and the accuracy of the solution obtained; the nature of the problem also affects which algorithm should be selected.
That is not why they fail at all, Ahmed. It's because these classes of methodology do not utilise any real optimality criteria in their specification.
Think of the steepest descent method, as a very basic such tool. Its specification is based on stationarity being characterised by the fact that the gradient of the objective at the stationary point is zero - hence we know the goal. If the gradient is not zero, the search direction is the negative of the gradient at that point. A small step in that direction yields an improvement in the objective value. And yet another iteration is done, until we are satisfied.
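The steepest-descent recipe above can be written out in a few lines; the quadratic objective and the fixed step size below are my own illustrative choices. Note the termination rule: we stop when the gradient norm is (numerically) zero - exactly the kind of optimality-based stopping criterion that, as argued above, metaheuristics lack.

```python
import numpy as np

def steepest_descent(grad, x0, step=0.1, eps=1e-8, max_iter=10_000):
    """Minimise via x <- x - step * grad(x); stop when ||grad(x)|| < eps.

    The gradient norm gives a principled termination rule: at a
    stationary point the gradient vanishes, so we know the goal.
    """
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < eps:   # stationarity reached
            break
        x = x - step * g              # small step along the negative gradient
    return x

# f(x) = (x1 - 3)^2 + 2*(x2 + 1)^2, whose unique minimiser is (3, -1).
grad = lambda x: np.array([2 * (x[0] - 3), 4 * (x[1] + 1)])
x_star = steepest_descent(grad, [0.0, 0.0])
print(x_star)
```

A fixed step works here only because the objective is a well-conditioned quadratic; in general a line search would replace the constant `step`.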
In contrast, metaheuristics have no such optimality characterisation built into their specification - and that is why they almost always fail.
So, the computational complexity of most metaheuristics - or more likely all of them - must be infinity! In layman's terms: they almost never find an optimum, and they do not even find a stationary point.
I guess it depends on your definition of "best". And according to the no-free-lunch theorem, the question only makes sense when it is coupled with the specific problem you are going to solve. What you may do is look at the time (or iterations) required to achieve a certain objective function value, using tools such as data profiles or performance profiles to compare algorithms on different classes of problems. But again, this is an empirical comparison, and the theoretical basis is not as strong as in other areas like mathematical programming.
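As a sketch of how such a performance profile is computed (the cost matrix below is made-up illustrative data, not results from any real benchmark): for each problem we take the ratio of each solver's cost to the best solver's cost, and the profile reports the fraction of problems each solver handles within a factor tau of the best.

```python
import numpy as np

# Hypothetical cost matrix: effort needed by each solver (rows) to reach a
# target objective value on each problem (columns); np.inf = never reached.
costs = np.array([
    [120.0,  90.0, 400.0, np.inf],   # solver A
    [100.0, 300.0, 150.0, 800.0],    # solver B
])

def performance_profile(costs, tau):
    """Fraction of problems each solver finishes within a factor tau
    of the best solver on that problem (a Dolan-More-style profile)."""
    best = costs.min(axis=0)              # best cost per problem
    ratios = costs / best                 # performance ratio per solver/problem
    return (ratios <= tau).mean(axis=1)   # rho_s(tau), one value per solver

for tau in (1.0, 2.0, 5.0):
    print(tau, performance_profile(costs, tau))
```

Plotting rho_s(tau) against tau gives the usual profile curves: the height at tau = 1 shows how often a solver is the outright fastest, and the right-hand tail shows robustness.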
The no free lunch theorem has been hi-jacked by the metaheuristic community, but in fact you should not invoke it, as mathematical optimisation tools are way, way better on problems that actually appear. (The NFL theorem is hence really a very bad excuse to keep using them; it only talks about the very worst problems, and those problems almost never materialise.)
Very few problems in the world are actually very hard, and most also have a structure that can be utilised to gain a lot of speed. I am not impressed with the meta community: no guarantees of convergence have been established, despite thousands of papers. But there must be very good chances that combinations of meta & optimisation would be very worthwhile to try. That avenue should be explored more.
The most accurate answer to the question above is that the computational complexity is infinity: no metaheuristic can guarantee anything. It's not much better than a lottery, in fact. Well, it's a bit better than that .. :-)
The most sensible answer is this - and I have stated it several times:
1. No metaheuristic can guarantee anything - not even a stationary point or a local minimum. So in this sense, you are doomed from the start. You will never know what the optimum is, if you only utilise meta-stuff.
2. If you have a very complicated problem to solve, by all means utilise heuristics, but only if you announce from the start that these methods most probably will not produce an optimum solution, but perhaps one that is good enough for the purpose. And you would need to explain why we can live with a non-optimal solution!
I should think that quite often in practice one does not need to identify a local or global minimum, but one would rather like to find a solution that is rather a lot better than the starting point - and to do even better than that would simply be a nice bonus.
3. But in fact all metaheuristics have a computational complexity that is equal to infinity.
I really urge you to break loose from inferior methodologies, like metaheuristics. Would you not like to know what the optimum is? I would. So start learning at least a basic modelling language (AMPL, for example), and see if you can also get a mathematical optimisation solver that can be linked to the data you have. You CANNOT be satisfied unless you know that the solution you got actually IS the optimal one - and the only way to know is to solve the problem correctly - that is, to actually find a global optimum.
If you then were to find that most instances of the problem give similar solutions to those based on the heuristics you use, then that's a bonus. :-)
Abdesslem Layeb The right approach is to learn integer programming, and to use the tools - branch-and-bound, for example. There are plenty of software on the internet. Inferior methods will make you disappointed in the end, so try to use a good integer programming tool.
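As one freely available option - assuming SciPy >= 1.9 is installed, whose `milp` routine wraps the HiGHS branch-and-bound solver; any MILP tool would serve equally well - here is a tiny 0-1 knapsack solved to proven optimality (the problem data are my own illustrative numbers):

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# 0-1 knapsack: maximise 10*x1 + 13*x2 + 7*x3
# subject to 3*x1 + 4*x2 + 2*x3 <= 6, with x binary.
c = np.array([-10.0, -13.0, -7.0])        # milp minimises, so negate the values
capacity = LinearConstraint([[3, 4, 2]], ub=6)
res = milp(c,
           constraints=capacity,
           integrality=np.ones(3),        # all three variables integer...
           bounds=Bounds(0, 1))           # ...and restricted to {0, 1}

print("optimal value:", -res.fun)
print("chosen items :", res.x.round())
```

The point is the guarantee: branch-and-bound terminates with a certificate that no better integer solution exists - precisely what no metaheuristic provides.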
What parameter? Your topic is "optimisation techniques". Have you any idea how many subtopics there are?? Hundreds. Can't you be a little bit more specific?
PSO, CSO, DE, EHO, ACO, etc. are not mathematical optimisation techniques. They are merely heuristics. They are not devised to actually provide an optimum solution, but "a ballpark figure" - and by that we mean something that can be rather bad indeed.
Before you write a review you need to compare metaheuristics to mathematical optimisation methods - otherwise you are hiding the truth.
Thank you for taking the time. It would be better if you could expand on this further; please recommend a related case, because I am new to writing reviews and need your guidance and cooperation.
That's not how this works. If you are unfamiliar with the concepts in mathematical optimisation, then you must read a few very good books on the topic. Without basic knowledge of the theory of optimality, convexity, and mathematical optimisation methodologies, you will remain illiterate in the field.
My suggestion is to buy the book by Bazaraa, Sherali, and Shetty, as it has a very nice plan, starting with the convex analysis that you need to understand what an optimum is; later you will be acquainted with mathematical optimisation tools as well. There is also a rather large set of exercises. And it does start with the very basics - not heavy stuff - so you will not be scared off ... :-)
Until we have a new theory, every metaheuristic has a complexity of infinity - since there is no theory developed that establishes that any metaheuristic finds and identifies an optimal solution for a large enough class of problems. Unfortunately.
That does not mean that they are not useful, of course, with the above remark in mind, that you cannot take for granted - for your particular problem - that you can even find a near-optimal solution. And since the methodology is not based on optimality criteria, it is also very hard to assess whether a given vector is optimal or at least satisfies necessary optimality conditions.