Among the latest is the Integration of Variables Method. Within its framework I have developed a group of concrete heuristics; indeed, within this framework it is possible to create a very wide set of different algorithms matched to different concrete problems. From my ResearchGate profile you can download, for example, "The Integration of Variables method: a generalization of Genetic", "Conciliación óptima de la distribución de piezas de configuración irregular y su corte" ("Optimal conciliation between the distribution on the foil of irregular-configuration pieces and their cutting"), "Analysis and Synthesis of Engineering Systems", "Un algoritmo del Método de Integración de Variables para la solución del problema Máximo Clique Ponderado" ("An Integration of Variables Method algorithm for solving the Maximum Weighted Clique problem"), and others devoted to developing and applying this method.
Not only have I proposed a new metaheuristic, I have also cited Sörensen's explanation for why there are too many of them.
What makes Leaders and Followers a worthwhile addition to the multitude of "new" metaheuristics?
1 - It is based on how solutions are compared, not how they are created. Most metaheuristics (e.g. PSO, DE, ACO, GA, EDA) focus on how solutions are created. Only a few (e.g. Simulated Annealing and Tabu Search) focus on how solutions are compared. Unlike Simulated Annealing and Tabu Search, Leaders and Followers is a population-based method (see the sketch after this list).
2 - Leaders and Followers is extremely simple -- arguably simpler than PSO, DE, ACO, EDA, etc. Many new metaheuristics are more complicated than existing ones. In general, any addition to a metaheuristic can improve its performance (witness the thousands of papers produced each year). Getting improved performance with a more complicated algorithm is no big deal. Getting improved performance with a simpler algorithm? Now that gets interesting!
3 - Leaders and Followers is compared against well established versions of PSO and DE on a full benchmark (CEC2013) of problems. Many introductions of metaheuristics (e.g. PSO) don't demonstrate their performance against legitimate competition.
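A minimal sketch of the core idea in Python, for concreteness: trial points are generated between a random follower and a random leader, compared only against the follower population (never against the leaders, i.e. the best points found so far), and the two populations are merged only occasionally. The step scaling (uniform in [0, 2]), the median-based merge trigger, and the function names below are illustrative simplifications, not the exact published pseudocode.

```python
import numpy as np

def leaders_and_followers(f, bounds, n=30, max_evals=20_000, rng=None):
    """Simplified sketch of the Leaders and Followers idea (minimization).

    Two equal-size populations: leaders guide the creation of trial points;
    followers are the only population the trial points are compared with.
    """
    rng = rng or np.random.default_rng()
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    d = lo.size
    leaders = rng.uniform(lo, hi, (n, d))
    followers = rng.uniform(lo, hi, (n, d))
    f_lead = np.apply_along_axis(f, 1, leaders)
    f_foll = np.apply_along_axis(f, 1, followers)
    evals = 2 * n
    while evals < max_evals:
        i, j = rng.integers(n), rng.integers(n)
        # create a trial point between a random follower and a random leader
        trial = followers[j] + rng.uniform(0, 2, d) * (leaders[i] - followers[j])
        trial = np.clip(trial, lo, hi)
        f_trial = f(trial)
        evals += 1
        # key idea: compare the trial ONLY against a follower, never a leader
        if f_trial < f_foll[j]:
            followers[j], f_foll[j] = trial, f_trial
        # occasional merge: once the followers have caught up, promote the best
        # of both populations to be the new leaders and restart the followers
        if np.median(f_foll) < np.median(f_lead):
            pool = np.vstack([leaders, followers])
            pool_f = np.concatenate([f_lead, f_foll])
            best = np.argsort(pool_f)[:n]
            leaders, f_lead = pool[best], pool_f[best]
            followers = rng.uniform(lo, hi, (n, d))
            f_foll = np.apply_along_axis(f, 1, followers)
            evals += n
    k = int(np.argmin(f_lead))
    return leaders[k], f_lead[k]

# usage: 10-dimensional sphere function
best_x, best_val = leaders_and_followers(lambda x: float(np.sum(x**2)),
                                          (np.full(10, -5.0), np.full(10, 5.0)))
```

The line to notice is the comparison: a trial is only ever compared with a follower, so a few lucky early leaders cannot bias which new points get accepted.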
Hope that helps.
Cheers,
Stephen
Article Metaheuristics -- the metaphor exposed
Conference Paper Leaders and Followers – A New Metaheuristic to Avoid the Bia...
Another recently developed metaheuristic is Minimum Population Search (MPS). This metaheuristic is specifically designed to optimize real-valued, multi-modal and high dimensional functions.
Conference Paper Minimum Population Search – Lessons from building a heuristi...
Conference Paper Extending Minimum Population Search towards Large Scale Glob...
Conference Paper A Minimum Population Search Hybrid for Large Scale Global Optimization
"Sea Lion Optimization Algorithm" is one of current-state-of-the-art meta heuristic algorithms . More info can be found here:
N. K. T. El-Omari, "Sea Lion Optimization Algorithm for Solving the Maximum Flow Problem", International Journal of Computer Science and Network Security (IJCSNS), e-ISSN: 1738-7906, DOI: 10.22937/IJCSNS.2020.20.08.5, 20(8):30-68, 2020.
Or simply refer to the same paper at the following link:
Article Tuning metaheuristic algorithms using mixture design: Applic...
Article An efficient two-step damage identification method using sun...
Article A sunflower optimization (SFO) algorithm applied to damage i...
Lichtenberg Algorithm (LA)
Article Lichtenberg optimization algorithm applied to crack tip iden...
A powerful Lichtenberg optimization algorithm: a damage identification case study. Engineering Applications of Artificial Intelligence, 2020 (in press).
You can find the source code in Matlab/Scilab:
The comprehensive history of metaheuristic optimization algorithms, from the earliest period up to the latest strategies (particularly between 1975 and 2019), has been collected and reported in the introduction section of the annexed paper:
"Article Interactive autodidactic school: A new metaheuristic optimiz...
"
In this regard, taking a glance at Table 1 of this research paper could be constructive for your research.
Please find our recent work, which elaborates on the advantages of machine learning, including transfer learning and ensemble learning, for numerical optimization tasks.
Why should I use the Stochastic paint optimizer (SPO)?!
Mathematical programming is the best optimization tool, with many years of strong theoretical background. It has also been demonstrated that it can efficiently solve complex optimization problems on the scale of one million design variables. The methods are also very reliable! Besides, there is mathematical proof of the existence of the solution and the globality of the optimum.
Metaheuristics, on the other hand, will all fail on expensive objective functions with large numbers of design variables. Also, each single run gives a different solution, so you must average the designs! This shows exactly that the algorithm is completely unreliable! Besides, it has no theoretical background and cannot be claimed to be a scientific procedure; it is called pseudo-science! In addition, it needs more computational cost and is inefficient in solving real-world problems.
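For readers unfamiliar with the practice being criticised here: because every run of a stochastic optimizer can end at a different solution, results are normally reported as a mean and standard deviation over many independent runs. A minimal sketch of what that looks like; report_runs and the random-search stand-in are purely illustrative, not any particular published protocol:

```python
import numpy as np

def report_runs(optimizer, n_runs=30, seed=0):
    """Run a stochastic optimizer several times and summarise the results
    by mean, standard deviation, and best value found."""
    rng = np.random.default_rng(seed)
    results = np.array([optimizer(np.random.default_rng(rng.integers(1 << 31)))
                        for _ in range(n_runs)])
    return results.mean(), results.std(), results.min()

def random_search(rng, evals=1000, d=10):
    """Stand-in stochastic optimizer: random search on the 10-D sphere function."""
    xs = rng.uniform(-5, 5, (evals, d))
    return float(np.min(np.sum(xs**2, axis=1)))

print("mean ± std (best): %.3f ± %.3f (%.3f)" % report_runs(random_search))
```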
Looking for the term "novel metaheuristic" is only a waste of time. They all have a similar concept and only the names change. What about the "novel tired panda algorithm"!! You can change many of these names and publish many papers, but they would all be useless! I had some experience with metaheuristics (some of that work is even still under review!), but I have moved to mathematical programming and no longer work in this way.
Many supervisors are looking for nothing more than citations and impact factor! They are not developing real science!
The gaining-sharing knowledge-based optimization algorithm was developed in 2020. It is based on the concept of acquiring and sharing knowledge during the human life span.
You can check this algorithm here:
Mohamed, A.W., Hadi, A.A. & Mohamed, A.K. Gaining-sharing knowledge based algorithm for solving optimization problems: a novel nature-inspired algorithm. Int. J. Mach. Learn. & Cyber. 11, 1501–1529 (2020). https://doi.org/10.1007/s13042-019-01053-x
Moreover, binary versions of this algorithm have also been developed.
Plant Propagation Algorithm. Never mind the metaphor, it has a great paradigm.
It was invented, I think, in 2011, but we've recently discovered a lot about it, such as insensitivity to its parameters (a very good property!) and super-fast convergence through parameter control.
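The paradigm, in a nutshell: plants in good spots send out many short runners (local exploitation), plants in poor spots send out few long runners (exploration). A minimal Python sketch of that idea for a minimization problem; the constants (maximum of 5 runners, runner-length scaling) and the function name are illustrative, not the published pseudocode:

```python
import numpy as np

def plant_propagation(f, bounds, n_plants=20, max_runners=5, iters=200, rng=None):
    """Sketch of the Plant Propagation (strawberry) paradigm: fit plants produce
    many short runners, unfit plants produce few long runners."""
    rng = rng or np.random.default_rng()
    lo, hi = np.asarray(bounds[0], float), np.asarray(bounds[1], float)
    d = lo.size
    pop = rng.uniform(lo, hi, (n_plants, d))
    for _ in range(iters):
        fit = np.apply_along_axis(f, 1, pop)
        order = np.argsort(fit)
        pop, fit = pop[order], fit[order]
        # map fitness to [0, 1]: 1 = best plant, 0 = worst plant
        norm = (fit[-1] - fit) / (fit[-1] - fit[0] + 1e-12)
        offspring = [pop[0]]                      # always keep the current best
        for plant, nrm in zip(pop, norm):
            n_runners = max(1, int(np.ceil(max_runners * nrm * rng.random())))
            length = 2.0 * (1.0 - nrm)            # poor plants reach further
            for _ in range(n_runners):
                step = length * (rng.random(d) - 0.5) * (hi - lo)
                offspring.append(np.clip(plant + step, lo, hi))
        offspring = np.array(offspring)
        off_fit = np.apply_along_axis(f, 1, offspring)
        keep = np.argsort(off_fit)[:n_plants]     # survival of the fittest plants
        pop = offspring[keep]
    fit = np.apply_along_axis(f, 1, pop)
    return pop[np.argmin(fit)], float(fit.min())
```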
You can try our hybrid optimizer: CPSOGSA (Constriction Coefficient based Particle Swarm Optimization and Gravitational Search Algorithm) for solving real-world problems (Rather and Bala, 2019).
I also recommend checking the MTDE algorithm, implemented with a multi-trial vector (MTV) approach, which is a new and effective mechanism that can improve other metaheuristic algorithms as well. MTDE and its full source code are available here:
Article MTDE: An effective multi-trial vector-based differential evo...
You can also check the CCSA algorithm, implemented with a conscious neighborhood-based approach, which is an effective mechanism that can improve other metaheuristic algorithms as well. CCSA and its full source code are available here:
Many metaheuristics fit into the two main categories of Swarm Intelligence or Evolutionary Computation. A new, distinct category is based on Exploration-only / Exploitation-only. A first example is EZ-opt:
Conference Paper The EZopt Optimisation Framework
It is worth noting that this method uses a similar definition and measurement of exploration as Article A novel direct measure of exploration and exploitation based...
Md. Abdur Razzak Choudhury Dude, you misunderstood the question. The question is "What are the latest meta-heuristic algorithms proposed in recent years?", not "Tell me what you think about metaheuristics".
I think Kenneth Sörensen just published some kind of overview paper on trends in the field. I haven't read it yet, but I will; he usually writes pretty well (if somewhat mercilessly at times).
S. Sharma, R. Kapoor and S. Dhiman, "A Novel Hybrid Metaheuristic Based on Augmented Grey Wolf Optimizer and Cuckoo Search for Global Optimization," 2021 2nd International Conference on Secure Cyber Computing and Communications (ICSCCC), 2021, pp. 376-381, doi: 10.1109/ICSCCC51823.2021.9478142.
We share the source code of all the MHS (metaheuristic) algorithms we have developed; in this way we contribute to science and help researchers make new developments. Below you can find links to the source code of powerful, modern MHS algorithms published in top-quality journals in the last year.
dFDB-MRFO (Improved Manta Ray Foraging Optimizer with dFDB)
You can find the QANA (Quantum-based avian navigation optimizer algorithm) and its full source code at the following links.
Article QANA: Quantum-based avian navigation optimizer algorithm
Code Full Source codes QANA-Quantum-based avian navigation optimi...
I would like to recommend another recent metaheuristic algorithm, "African Vultures Optimization Algorithm: A New Nature-Inspired Metaheuristic Algorithm for Global Optimization Problems".
Article African Vultures Optimization Algorithm: A New Nature-Inspir...
Dear Nidhal, a similar metaheuristic algorithm, the "Lion Optimization Algorithm (LOA)", is also present in the literature.
Yazdani, M., & Jolai, F. (2016). Lion optimization algorithm (LOA): a nature-inspired metaheuristic algorithm. Journal of computational design and engineering, 3(1), 24-36.
Asthana, M., Gupta, K. D., & Kumar, A. (2020). Test Suite Optimization Using Lion Search Algorithm. In Ambient Communications and Computer Systems (pp. 77-90). Springer, Singapore.