You can try EvoloPy, an open-source nature-inspired optimization toolbox for global optimization in Python. It includes recent methods such as the Harris Hawks Optimizer (HHO) as well as the well-known GA, PSO, CS, BAT, FA, GWO, MVO, SSA, MFO, and WOA. It was developed by the EVOML research team, whose core members have more than 12K citations together in optimization and machine learning. For more information or any feedback, you can visit the following websites:
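To give a feel for how EvoloPy is driven, here is a minimal sketch following the usage pattern in its repository's README; the run() entry point, parameter names, and export flags are assumptions that may vary between versions:

```python
# Minimal sketch, assuming the run() driver shown in the EvoloPy README.
from optimizer import run  # EvoloPy's main driver module

# Choose metaheuristics and built-in benchmark functions to compare.
optimizer = ["PSO", "GWO", "HHO"]
objectivefunc = ["F3", "F4"]
NumOfRuns = 10  # independent runs per optimizer/function pairing

# General parameters shared by all selected optimizers.
params = {"PopulationSize": 30, "Iterations": 50}

# Which result files to export (CSV summaries, convergence data, ...).
export_flags = {
    "Export_avg": True,
    "Export_details": True,
    "Export_convergence": True,
    "Export_boxplot": True,
}

run(optimizer, objectivefunc, NumOfRuns, params, export_flags)
```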
There is a library called "hyperopt", which you can install with conda or pip. Here is a link to a very nice explanation of how to use it, in this case for hyper-parameter tuning; nonetheless, you can apply the same approach with a different objective (loss) function:
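For illustration, hyperopt's basic pattern (from its documentation) is to hand fmin an objective, a search space, and a search algorithm; for your problem you would only swap in your own loss function:

```python
from hyperopt import fmin, tpe, hp

# Toy objective: hyperopt minimizes this, so return a loss value.
def objective(x):
    return (x - 2) ** 2

best = fmin(
    fn=objective,                    # your loss / objective function
    space=hp.uniform("x", -10, 10),  # search space for the variable x
    algo=tpe.suggest,                # Tree-structured Parzen Estimator
    max_evals=100,                   # evaluation budget
)
print(best)  # best parameter values found, e.g. {'x': 2.0...}
```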
Good work; EvoloPy offers a solid learning foundation as an open-source nature-inspired optimization toolbox for global optimization in Python.
Good suggestions for GA libraries are available at https://www.google.com/amp/s/www.researchgate.net/post/Im_looking_for_a_Genetic_Programming_Library_in_Python_for_a_classification_problem_Would_you_please_suggest_me_some_packages_to_download/amp
You could try Opytimizer: https://github.com/gugarosa/opytimizer.
It is a complete library: every line of code is commented, every class, variable, and method has a docstring, and there are test cases for everything (with 100% code coverage). Additionally, we provide ready-to-go scripts and integrations with the most popular frameworks, such as PyTorch and Scikit-Learn, so you can design your own optimization task. Furthermore, we provide a History object that keeps your task's history, including agents' positions, fitness values, and any other values you might need.
Recently, we implemented Genetic Programming for function optimization, and we intend to further explore tree-based evolutionary algorithms.
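As a reference point, a minimal Opytimizer run looks roughly like the quick-start in the repository's README; the module paths and class names below follow a recent version and may differ in older releases:

```python
import numpy as np
from opytimizer import Opytimizer
from opytimizer.core import Function
from opytimizer.optimizers.swarm import PSO
from opytimizer.spaces import SearchSpace

# Objective to minimize: the sphere function.
def sphere(x):
    return np.sum(x ** 2)

# 20 agents searching over 2 variables bounded in [-10, 10].
space = SearchSpace(n_agents=20, n_variables=2,
                    lower_bound=[-10, -10], upper_bound=[10, 10])
optimizer = PSO()
function = Function(sphere)

# Bundle everything and run; results land in the History object.
opt = Opytimizer(space, optimizer, function)
opt.start(n_iterations=100)
```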
DEAP is a novel evolutionary computation framework for rapid prototyping and testing of ideas. It seeks to make algorithms explicit and data structures transparent.
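To show that explicit style in practice, here is the classic OneMax toy GA, condensed from DEAP's own tutorial; every operator is registered by hand on a Toolbox:

```python
import random
from deap import algorithms, base, creator, tools

# Maximize the number of 1s in a 20-bit string (the OneMax problem).
creator.create("FitnessMax", base.Fitness, weights=(1.0,))
creator.create("Individual", list, fitness=creator.FitnessMax)

toolbox = base.Toolbox()
toolbox.register("attr_bool", random.randint, 0, 1)
toolbox.register("individual", tools.initRepeat,
                 creator.Individual, toolbox.attr_bool, 20)
toolbox.register("population", tools.initRepeat, list, toolbox.individual)
toolbox.register("evaluate", lambda ind: (sum(ind),))  # fitness is a tuple
toolbox.register("mate", tools.cxTwoPoint)
toolbox.register("mutate", tools.mutFlipBit, indpb=0.05)
toolbox.register("select", tools.selTournament, tournsize=3)

pop = toolbox.population(n=50)
pop, _ = algorithms.eaSimple(pop, toolbox, cxpb=0.5, mutpb=0.2,
                             ngen=40, verbose=False)
print(tools.selBest(pop, k=1)[0])  # best individual found
```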