Does anyone know what free or commercial solvers are used to solve multi-objective nonlinear optimization problems, and are they based on MOEAs or do they follow another approach?
I recommend LINGO because it supports global search; most solvers can only output a locally optimal solution. See my papers, which include LINGO sample code, or try the free version of LINGO. Could you tell me the meaning of "MOEA"?
If you are working for a university, you can contact Lindo Systems (developers of Lingo) to request a free educational license. It is a good solver, as Dr. Shinmura suggests.
Thank you for your answer. MOEA stands for multi-objective evolutionary algorithms, like NSGA-II and others. For LINGO, to my knowledge you have to aggregate the two objectives into a single one; is that always the case, or can you treat both objectives simultaneously?
We have a multi-objective problem with a compromise between two conflicting objectives. When optimizing this kind of problem we do not get a unique solution but a set of solutions. We can, of course, reduce the multi-objective nature of the problem to a single objective, but then we risk losing valuable information in the aggregation phase. The catch is to avoid that and keep the multi-objective situation alive during the whole process of solving the optimization problem, in order to preserve the information provided by both objectives and hence their impact on the studied problem.
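To make the "set of solutions" idea concrete, here is a minimal Python sketch (my own illustration; the candidate points are made up): instead of collapsing the two objectives into one scalar, we keep every candidate that is not dominated by another one.

```python
# Minimal illustration with made-up data: keep the whole Pareto (non-dominated)
# set instead of aggregating the two objectives into a single scalar.

def dominates(a, b):
    """True if objective vector a dominates b (both objectives are minimized)."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_filter(points):
    """Return the non-dominated subset of a list of (f1, f2) objective vectors."""
    return [p for p in points if not any(dominates(q, p) for q in points if q != p)]

# Hypothetical trade-off between two conflicting objectives (both minimized).
candidates = [(1.0, 9.0), (2.0, 7.0), (3.0, 8.0), (4.0, 3.0), (6.0, 2.5), (5.0, 6.0)]
print(pareto_filter(candidates))  # -> [(1.0, 9.0), (2.0, 7.0), (4.0, 3.0), (6.0, 2.5)]
```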
Markowitz defined the portfolio model with a single objective because the return is handled as a constraint.
The soft-margin SVM is a two-objective model; nevertheless, it is treated the same way as the portfolio model. You can design your models freely; it is the modeler, not a solver such as LINGO, that decides the formulation.
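To make that comparison explicit (my own illustration, not taken from the papers mentioned above): the Markowitz model keeps one objective and moves the other into a constraint, while the soft-margin SVM aggregates its two objectives with a weight.

Markowitz: minimize the portfolio risk x'Σx subject to a target return μ'x ≥ R and Σ_i x_i = 1.
Soft-margin SVM: minimize (1/2)||w||² + C·Σ_i ξ_i, i.e. margin maximization and slack minimization combined into a single weighted objective.

In both cases the bi-objective trade-off is resolved at modeling time (through the return level R or the weight C), so a general-purpose single-objective solver can be applied.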
"we have a multi-objective problem with a compromise between two conflicting objectives. when optimizing this kind of problems we don't have a unique solution but a set of solutions We can, of course, reduce the multi-objective nature of the problem, into one single objective, but here we risk losing valuable information in the aggregation phase."
In order to solve this kind of problem and to determine efficient points, there are many different approaches, which resort to (parametric) substitute models with just one objective and/or additional constraints.
The approach using constraints with varying right-hand sides on all but one of the objectives is just ONE of many different approaches; there is a wealth of literature giving surveys.
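A minimal sketch of this varying-right-hand-side (epsilon-constraint) idea, assuming Python with SciPy and two made-up objective functions f1 and f2 (not the questioner's actual model):

```python
# Epsilon-constraint sketch (illustrative only): minimize f1 while f2 is bounded
# by a varying right-hand side eps; sweeping eps traces different efficient points.
import numpy as np
from scipy.optimize import minimize

def f1(x):  # made-up objective 1
    return (x[0] - 1.0) ** 2 + x[1] ** 2

def f2(x):  # made-up conflicting objective 2
    return x[0] ** 2 + (x[1] - 2.0) ** 2

front = []
for eps in np.linspace(0.5, 4.0, 8):  # varying right-hand sides
    con = {"type": "ineq", "fun": lambda x, e=eps: e - f2(x)}  # f2(x) <= eps
    res = minimize(f1, x0=np.zeros(2), method="SLSQP", constraints=[con])
    if res.success:
        front.append((round(f1(res.x), 3), round(f2(res.x), 3)))

print(front)  # approximate efficient points of the (f1, f2) trade-off
```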
So it seems, Mr. Shuichi Shinmura, the misunderstanding of multi-objective models is on your side.
Since your concern is a nonlinear multi-objective (especially bi-criteria) problem, its solvability of course heavily depends on the properties of the problem:
Is it a convex one?
If yes, there are many nonlinear solvers, even free ones like IPOPT, and the scalarization approach chosen should retain the convexity.
If not, finding a global solution even for a single-criterion problem requires at least boundedness of all the variables in order to apply global solvers like LINDO or BARON.
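As a small illustration of a convexity-preserving scalarization (my own Python/SciPy sketch with made-up convex objectives; the same idea applies with IPOPT or any other NLP solver): a convex combination w·f1 + (1-w)·f2 of convex objectives stays convex, so a local solver finds a global optimum of each scalarized problem, and sweeping the weight produces different efficient points.

```python
# Weighted-sum scalarization sketch (illustrative): both objectives are convex,
# so every convex combination is convex and each solve yields an efficient point.
import numpy as np
from scipy.optimize import minimize

def f1(x):  # made-up convex objective 1
    return (x[0] - 1.0) ** 2 + x[1] ** 2

def f2(x):  # made-up convex objective 2
    return x[0] ** 2 + (x[1] - 2.0) ** 2

efficient_points = []
for w in np.linspace(0.05, 0.95, 10):  # sweep the trade-off weight
    scalarized = lambda x, w=w: w * f1(x) + (1.0 - w) * f2(x)
    res = minimize(scalarized, x0=np.zeros(2), method="BFGS")
    efficient_points.append((round(f1(res.x), 3), round(f2(res.x), 3)))

print(efficient_points)
```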
I claim that if you know the algorithm behind the MOEA, you can define the nonlinear model in a general-purpose solver such as LINGO. So I can solve the soft-margin SVM and many kinds of portfolio models, which have two objectives, with LINGO. LINDO was an old solver that could solve only LP, QP and IP, not NLP, and LINDO Systems Inc. no longer supports this product.
I think there are five main algorithm classes: LP, QP, IP, NLP and stochastic programming. LINGO supports these algorithms. LINDO Systems Inc. offers LINGO, What'sBest (WB), which is an Excel add-in solver, and LINDO API, which is a C library for developers. These three solvers support the five algorithm classes. LINGO also supports model-building functions. I developed many discriminant functions with LINGO; see my papers. I also solved the gene-analysis problem by the Matroska feature-selection method developed with LINGO. For over 10 years, many researchers were struggling to analyze high-dimensional gene spaces. Last October, I discriminated six famous microarray datasets and described the structure of the gene spaces completely in 16 papers. You can build MP models with WB using Visual Basic and WB functions. LINDO API is a C library for developing MP models.
Capabilities such as "is the problem a convex one?" and similar questions concern the NLP functions, and LINGO, WB and LINDO API support these functions. You can confirm this in the free manuals.
I do not know which general-purpose solvers support global search. For this, you must check each solver's home page.
If you want a rough sense of LINGO's capabilities, download my paper on LINGO models of six MP-based linear discriminant functions, including the hard-margin SVM and the soft-margin SVM.
I also deal with a multi-objective constrained optimization problem, and I encourage you to implement your own optimization software. If you do your own implementation, you can improve an already proposed method and add a further differentiator to your research. By the way, it is possible to download an already implemented NSGA-II.
Thank you Leandro Tolomeu. I did use both NSGA-II and AMOSA to solve my problem; I just wanted to see if there is any solver I could use to compare and check the results given by the two methods mentioned above.
Thank you for your advice, I'll try to follow it.
I also thank Dr. Ralf Gollmer and Dr. Shuichi Shinmura for their helpful answers.
I do not know of a commercial solver for multi-objective NLP, though the traditional ones can be adapted to deal with scalarized problems. As said by Dr. Ralf Gollmer, if your problem is convex, scalarization will be just fine: a weighted sum (convex combination of the objectives) or the epsilon-constraint method will be perfect.
On the other hand, if you do not have convexity, then there are not many options:
- You can still be fine with scalarization. Epsilon-constraint with a global solver will do the job (as it ensures that every non-dominated point can be obtained), but tuning the method to get a well-spread set of solutions is not trivial, especially when dealing with more than 2 objectives. I am not a complete expert in such methods, but you can find plenty of surveys on scalarization methods.
- If your problems are not that big (i.e. few variables and objectives) and your objectives and constraints are differentiable, you can use interval Branch & Bound (which is global and guaranteed). I have a recent paper on this subject, but the implementation behind it is not yet available. I don't think there are other available codes for this kind of approach, as there are not many published papers on the subject.
- Maybe the most interesting approach for you: implementing metaheuristics such as evolutionary algorithms. The advantage is that, for most of these approaches, you do not need particular assumptions on your objectives and constraints. You will find plenty of available codes on the net; depending on which programming language you are more familiar with, here are some links (a small usage sketch follows at the end of this answer):
+ The CEC 2009 page contains some codes that might be interesting, in different languages (C++, Matlab): http://dces.essex.ac.uk/staff/zhang/moeacompetition09.htm . For example, you will find sample code for MOEA/D/DE.
+ Some frameworks: jMetal (http://jmetal.sourceforge.net/) in Java (also available in C++) provides implementations of well-known evolutionary algorithms. I personally tried only the Java version a few years ago, and it was quite convenient. ParadisEO (http://paradiseo.gforge.inria.fr/) is in C++; I have never used this one, but there is a lot of communication around it, so it may be easier to get into.
+ If evaluating your objectives is expensive and you have no constraints: NOMAD (in C++) is a direct-search-based algorithm that can be used for two-objective optimization (https://www.gerad.ca/nomad/Project/Home.html). I think it can handle more objectives, but I am not sure about that. In a similar spirit, there is Direct MultiSearch (http://www.mat.uc.pt/dms/) in Matlab.
Of course, this is not exhaustive but it is a good start.
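If Python is also an option (it is not in the list above), here is a minimal sketch using the NSGA-II implementation from the pymoo package; the problem is made up and the exact module/class names assume a recent pymoo version, so please check its documentation:

```python
# NSGA-II usage sketch (assumes a recent pymoo release; objectives are made up).
import numpy as np
from pymoo.core.problem import ElementwiseProblem
from pymoo.algorithms.moo.nsga2 import NSGA2
from pymoo.optimize import minimize

class ToyBiObjective(ElementwiseProblem):
    """Hypothetical bi-objective problem with two conflicting quadratic objectives."""
    def __init__(self):
        super().__init__(n_var=2, n_obj=2,
                         xl=np.array([-2.0, -2.0]), xu=np.array([2.0, 2.0]))

    def _evaluate(self, x, out, *args, **kwargs):
        f1 = (x[0] - 1.0) ** 2 + x[1] ** 2      # made-up objective 1
        f2 = x[0] ** 2 + (x[1] - 1.0) ** 2      # made-up conflicting objective 2
        out["F"] = [f1, f2]

res = minimize(ToyBiObjective(), NSGA2(pop_size=50), ("n_gen", 100),
               seed=1, verbose=False)
print(res.F)  # approximation of the Pareto front found by NSGA-II
```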
I recommend TridentOpt. It is a multi-agent genetic algorithm, it handles multi-objective and nonlinear problems, and it is an open-source project under an MIT license, so you can do anything you want with it. You can find it at this link.