Some of these considerations are directly related to the algorithm. For example, ACO is well suited to routing problems, and genetic algorithms generally work well on binary optimization. But that doesn't mean you can't apply them to other kinds of problems.
The question needs to be much more specific: What is the problem? What tools do you have at your disposal?
Is it explicit (as in all parts of the model can be spelt out with variables and functions), or is it in some way implicit (as in the problem having a simulation or similar that needs to be run in order to compute the objective function and/or some constraints)? Do you also know an estimate of a Lipschitz constant for the objective function? (See the sketch after these questions for how such a constant can be exploited.)
Is time essential, or is it OK to come back for the result after a tea break, or even to wait until tomorrow?
These are just a few of the necessary questions to ask.
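To make the Lipschitz question above concrete, here is a minimal sketch (assuming a one-dimensional objective on an interval [a, b] and a known Lipschitz constant L) of the classical Piyavskii-Shubert idea: the bound f(x) >= f(x_i) - L|x - x_i| gives a sawtooth underestimator, and each step evaluates f where that underestimator is smallest. The test function and constants are purely illustrative.

```python
import numpy as np

def piyavskii_shubert(f, a, b, L, iters=30):
    """Minimize a 1-D Lipschitz function f on [a, b] with constant L."""
    xs, fs = [a, b], [f(a), f(b)]
    for _ in range(iters):
        order = np.argsort(xs)
        xs = [xs[i] for i in order]
        fs = [fs[i] for i in order]
        # On each interval [x_i, x_{i+1}], the sawtooth lower bound attains
        # its minimum (f_i + f_{i+1})/2 - L*(x_{i+1} - x_i)/2 at
        # x* = (x_i + x_{i+1})/2 + (f_i - f_{i+1})/(2L).
        best_val, best_x = np.inf, None
        for i in range(len(xs) - 1):
            val = 0.5 * (fs[i] + fs[i + 1]) - 0.5 * L * (xs[i + 1] - xs[i])
            if val < best_val:
                best_val = val
                best_x = 0.5 * (xs[i] + xs[i + 1]) + (fs[i] - fs[i + 1]) / (2 * L)
        xs.append(best_x)
        fs.append(f(best_x))
    k = int(np.argmin(fs))
    return xs[k], fs[k]

# Illustrative use: f(x) = sin(3x) + 0.5x has Lipschitz constant <= 3.5 on [0, 4].
x_best, f_best = piyavskii_shubert(lambda x: np.sin(3 * x) + 0.5 * x, 0.0, 4.0, L=3.5)
print(x_best, f_best)
```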
I am a nurse scientist and am now trying to conduct an optimization analysis using Maple in order to test my theory (see attached). I am a novice in optimization analysis and need an expert's help. First, could I get recommendations for well-structured, basic-level articles from you? Nursing science has published hardly any optimization-based articles, so it is hard for me to find an exemplary article. Thank you all for your help in advance!
Staff planning is a sub-field with quite a lot of published papers: scheduling, selection of further education, the logistics of a hospital layout, as well as the planning of operations - optimally utilizing staff with the right expertise, taking into account the staff members' rights to breaks and holidays, etcetera. It's huge.
Michael Patriksson Yes, you are right. As far as I know, almost all optimization articles related to nursing are about nurse scheduling. But we nurse scientists and nurse managers do not read such articles, which are mainly published in industrial engineering and other non-nursing literature. By extension, we cannot understand those papers, because nursing is a statistics-dominated discipline (we learn essentially no mathematics from the BSN to the PhD). I am sad to see that research and practice operate separately.
Indeed, it's sad. So why don't you ask your manager to allow you to take courses at a local university in the fields you need to improve in? That's what I would do, and moreover it would be very fruitful for the management to have staff who know more advanced procedures and techniques.
Michael Patriksson Sincere thanks for your advice. I agree with you. I am now merging two datasets collected from across the USA. Once the data are prepared, I expect to collaborate with an expert in industrial engineering while learning the methods and procedures from her. First, I plan to complete my PhD as soon as possible (because my research topic is a hot issue in current nursing science). Thank you again! :-)
On a more general note: I am no longer surprised to see how several replies state "use metaheuristic A or B" - before we are even told about the shape, size, and properties of the problem! It has become quite clear to me that well over half the questions posed immediately result in a recommendation to use one of the myriad metaheuristics that flood RG. I think all of you who respond with a reflex answer should first ask about the properties of the model! *Not* every problem should be solved that way, OK? We know of better tools in a vast majority of cases.
Djamel: I think "best" is a synonym for "more performant", although the latter is vague - what does "more" refer to here?
To me there are plenty of ways to define "best" - and they are typically in conflict:
1. One goal can be to implement an algorithm that uses as little space as possible - this was very much in vogue in the 60s when computer memory sucked, and we still want to be able to solve small problems fast on inexpensive processing units.
2. Another goal is to be as accurate as possible, which might mean that we need to find an approximation of the original problem that we CAN solve, and utilize it cleverly. Certainly we have many such methods within the domain of global optimization, for example where the original model is replaced by an explicit one - for which we do have a methodology with good performance bounds.
Ok. The problem is that it is rather seldom that one methodology always - for *every* data set and size - beats every other method for the problem. Even the very best interior point solver for LP is occasionally beaten by the simplex method, even though the former has a much better worst-case complexity.
It's really a very hard question. Indeed, I think we are far from being able to claim that one optimization method surely beats the other(s) for all datasets and problem sizes, even within a specific problem class. Of course we know that there are effective and efficient algorithms, but it's really hard to make such a generalization, I think. Prof. Michael Patriksson has already pointed out this issue.
Secondly, I agree that nurse scheduling has dominated the related body of literature in industrial engineering (IE). However, I believe the industrial engineering viewpoint could provide incredible benefits if applied to nursing science. One can say the IE viewpoint is a requirement for every system and plan. So, I may be able to help your future studies in nurse management, Claire Su-Yeon Park.
Dr. Aykan Akincilar, thank you for your kindness. Dr. Mehmet Kabak, a vice president of the Department of Industrial Engineering at Gazi University in Turkey, where you got your PhD, is my favorite friend. He will help my work.
Dr. Aykan Akincilar, I have, and will have, a large, global dataset. If you are interested in my theory, keep in touch: Article Park's Optimized Nurse Staffing (Sweet Spot) Estimation Theo...
Article Optimizing Staffing, Quality, and Cost in Home Healthcare Nu...
(Mehmet reviewed my paper and theory. The paper has been published in a top-5% nursing journal and was also featured on the journal's cover last month.) We may create a more vibrant effect together: [email protected]
Thank you for kindly sharing. Actually, I'm interested in optimization problems in healthcare, which is one of the most popular research fields in optimization. I have sent you an e-mail. Hope to hear from you.
There is not, and never will be, a single universal method of optimization suitable for solving all tasks and problems.
Of course, there are methods that work better than others, although sometimes they are unreliable for specific problems.
Global optimization algorithms quite often require tuning, either by modifying the algorithm itself or by adding a local search method (a minimal hybrid sketch follows below).
The choice is conditioned by many factors, including the number of decision variables, the manner in which they may vary, and the size of the data to be processed - and by what the optimization criterion is for us. For example, the minimum production cost of a device may matter to us, or its energy efficiency; and the algorithm itself can be assessed in terms of efficiency, in which case the time it takes to return a solution will be very important.
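As one concrete example of pairing a global strategy with a local search, here is a minimal sketch using SciPy's basinhopping, which alternates random perturbations (global exploration) with a local minimizer started from each perturbed point; the Rastrigin test function and all parameters are illustrative.

```python
import numpy as np
from scipy.optimize import basinhopping

# Illustrative multimodal objective (Rastrigin in 2-D): many local minima.
def rastrigin(x):
    return 10 * len(x) + np.sum(x**2 - 10 * np.cos(2 * np.pi * x))

x0 = np.array([3.0, -2.5])
# basinhopping alternates random jumps (the "global" stage) with a
# local minimizer (here BFGS) run from each perturbed point.
result = basinhopping(rastrigin, x0,
                      minimizer_kwargs={"method": "BFGS"},
                      niter=200, stepsize=1.0, seed=42)
print(result.x, result.fun)
```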
If P is not equal to NP - which I would say the majority would bet on - then we can never guarantee success. We just have to live with that - and while doing so we should try to solve our problems with dedicated methodologies rather than with metaheuristics.
"For a such "general" question, I will say that the "best" method of optimization is the PROVED more performant and convergent one. "
don't lead anywhere - you are just repeating what you wrote above - and "more convergent" is non-sense: either it (referring to a particular methodology and an input) converges or not.
There is no generalized method that can be called the "best" one for optimization problems. The best method is problem-specific. One method can be the best for a particular class of problems, whereas the same method may not be good for another class. Classification of the optimization problem is necessary before we talk about the best possible method for it. For well-formulated problems, analytical methods will be the most suitable, whereas for very complex optimization problems, metaheuristic methods will be most efficient. Again, these two classes of methods can be problem-specific.
Mahamad: You will have to be rather more specific about what the term "well formulated" means, and also what "very complex" means, if we are to have a solver battle. :-)
For such a "general" question, I will say that the "best" method of optimization is the proved more performant and convergent one.
Your reply was:
Djamel: I think "best" is a synonym for "more performant", although the latter is vague - what does "more" refer to here? To me there are plenty of ways to define "best" - and they are typically in conflict: 1. One goal can be to implement an algorithm that uses as little space as possible - this was very much in vogue in the 60s when computer memory sucked, and we still want to be able to solve small problems fast on inexpensive processing units. 2. Another goal is to be as accurate as possible, which might mean that we need to find an approximation of the original problem that we CAN solve, and utilize it cleverly. Certainly we have many such methods within the domain of global optimization, for example where the original model is replaced by an explicit one - for which we do have a methodology with good performance bounds.
I wrote:
Michael: "more" here refers to a comparison use and means "compared to others"
Convergence refers to time and space complexity, while performance deals with accuracy.
Certainly, the balance between these two factors is not always very obvious.
To your reply:
Ok. The problem is that it is rather seldom that one methodology always - for *every* data set and size - beats every other method for the problem. Even the very best interior point solver for LP is occasionally beaten by the simplex method, even though the former has a much better worst-case complexity.
I wrote:
I apologize; my response came late.
The reply to your last one lies in the term "proved" in my first response:
"" For a such "general" question, I will say that the "best" method of optimization is the PROVED more performant and convergent one. ""
To your last reply:
Djamel: Tautologies like yours - "For such a 'general' question, I will say that the 'best' method of optimization is the PROVED more performant and convergent one" - don't lead anywhere; you are just repeating what you wrote above. And "more convergent" is nonsense: either it (a particular methodology on a given input) converges or it does not.
I will say:
I am not repeating what I wrote above; I just gave my response once again, put in double quotation marks, to point out the term "proved" (put in capital letters) - to say that the methodology has been proved performant and convergent.
I already clarified the term "more": " 'more' here is used for comparison and means 'compared to others' ".
The term "more" is used only with "performant" none can imagine that it can be used with "convergent", I supposed that was obvious!!! , and "is no sens" like you said.
Without specifying the objective function and the constraints, an unambiguous answer is not possible. I have used many methods. In some situations the Levenberg-Marquardt (L-M) method was better, and in other cases the geometric programming method or multi-step methods (e.g., combining the Monte Carlo method with the Nelder-Mead and FPD methods) were better. Regards,
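As one concrete reading of such a multi-step combination, here is a minimal sketch of Monte Carlo sampling of starting points followed by Nelder-Mead refinement of each; the test function and bounds are invented for illustration.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(0)

def objective(x):  # illustrative nonconvex test function (Himmelblau)
    return (x[0]**2 + x[1] - 11)**2 + (x[0] + x[1]**2 - 7)**2

lo, hi = -5.0, 5.0
best = None
# Monte Carlo stage: random starting points in the box [lo, hi]^2 ...
for _ in range(50):
    x0 = rng.uniform(lo, hi, size=2)
    # ... local stage: refine each sampled point with Nelder-Mead.
    res = minimize(objective, x0, method="Nelder-Mead")
    if best is None or res.fun < best.fun:
        best = res
print(best.x, best.fun)
```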
There is no such method. You can have a very good method that does not work in a particular case. You can also have a poor method that works well somewhere.
Optimization can also be experimental. In that case, one can use the Box-Wilson method or stochastic programming. However, one should always start with the objective function (explicit, implicit, or a computational algorithm) and the constraints; the optimization method comes in the next step. Regards,
The answer lies in whether you expect a relatively good solution or the best solution to the problem. I would say heuristic optimization works well for discrete functions. But if you are dealing with continuous nonlinear functions, the best way is to go for convex optimization. Even if the function is nonconvex, there are methods available to convexify it and find the global optimum in an iterative manner.
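For the convex route, here is a minimal sketch assuming the CVXPY modeling library is available: a nonnegative least-squares problem with random illustrative data, where convexity guarantees that the solver's answer is the global optimum.

```python
import numpy as np
import cvxpy as cp

rng = np.random.default_rng(1)
A, b = rng.standard_normal((20, 5)), rng.standard_normal(20)

x = cp.Variable(5)
# Convex objective + convex constraint set => any local optimum is global.
problem = cp.Problem(cp.Minimize(cp.sum_squares(A @ x - b)), [x >= 0])
problem.solve()
print(x.value, problem.value)
```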
Every area of research has its own optimization algorithms; it depends on the field. For example, swarm intelligence and backpropagation give approximate solutions, not the best solution.
If we narrow the problem down to the search for the minimum of a function of many variables, I put the Marquardt-Levenberg method in first place. It is a "two-in-one" method: gradient-based at first, then Hessian-based. In addition, despite the built-in sum-of-squared-differences (SSD) objective, after a slight change this method can be used to minimize any function.
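For concreteness, a minimal sketch using SciPy's least_squares, whose method="lm" is a Levenberg-Marquardt implementation blending gradient steps (far from the optimum) with Gauss-Newton steps (near it), matching the "two-in-one" description above; the exponential model and data are invented.

```python
import numpy as np
from scipy.optimize import least_squares

# Illustrative data: fit y = a * exp(b * t) by least squares.
t = np.linspace(0, 2, 30)
y = 2.0 * np.exp(1.3 * t) + 0.05 * np.random.default_rng(2).standard_normal(30)

def residuals(p):
    a, b = p
    return a * np.exp(b * t) - y  # LM minimizes the sum of squared residuals

# method="lm" calls the MINPACK Levenberg-Marquardt routine.
fit = least_squares(residuals, x0=[1.0, 1.0], method="lm")
print(fit.x)  # should recover roughly (2.0, 1.3)
```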
There are too many types of problems to list in a forum like this. You need to tell us what problem type you have - then it might be possible to help. Optimisation is a vast area, encompassing time-dependent, infinite-dimensional problems in the area of optimal control, for example. At the other end of the spectrum is linear programming, which is a very easy problem class. In between are very many other problems, some very easy and some very hard.
You definitely need to tell us what the problem setting is - there is no method that can solve ALL problems - you need to specify the type of problem you are dealing with.
Michael Patriksson After all, it is not about a specific scientific problem. The question was formulated in such a way as to allow as many answers as possible. The same applies to the very popular questions about "the secret of happiness", "natural pictures", "the RG score", "single-author papers", etc.
I think the most efficient methodology for optimizing a problem is: (i) solve the problem with exact methods; (ii) if you are not satisfied with the solution obtained in a reasonable time, solve the problem with problem-specific heuristics or metaheuristics. If the solution is still not as good as you expect, try your best and develop a new algorithm. This approach is likely more beneficial than focusing on a single optimization method. (A minimal sketch of this workflow follows below.)
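Here is a minimal sketch of that two-step workflow on a toy 0-1 knapsack, assuming SciPy's milp interface (HiGHS branch-and-bound) for step (i) and an invented greedy value/weight rule as the problem-specific fallback for step (ii); all data is illustrative.

```python
import numpy as np
from scipy.optimize import milp, LinearConstraint, Bounds

# Illustrative 0-1 knapsack: maximize total value under a weight budget.
values = np.array([10.0, 13.0, 4.0, 8.0, 7.0])
weights = np.array([5.0, 8.0, 2.0, 6.0, 4.0])
capacity = 14.0

# Step (i): exact method first, with a time limit so "reasonable time"
# is enforced. milp minimizes, so negate the values to maximize.
res = milp(c=-values,
           integrality=np.ones_like(values),          # all binary variables
           bounds=Bounds(0, 1),
           constraints=LinearConstraint(weights[None, :], ub=capacity),
           options={"time_limit": 10.0})

if res.success:
    x = np.round(res.x)
else:
    # Step (ii): fall back to a problem-specific heuristic (greedy by
    # value/weight ratio) if the exact solve fails or times out.
    x = np.zeros_like(values)
    load = 0.0
    for i in np.argsort(-values / weights):
        if load + weights[i] <= capacity:
            x[i] = 1.0
            load += weights[i]
print(x)
```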
Not every problem can even be attacked with metaheuristics; how would you solve an optimal control problem, or a bilevel optimisation problem, using that tool? This is bordering on nonsense without any more details.
By the same token, the question is simply too unspecific to answer.
H. Naderpour You need to be more specific - otherwise there can be no answer.
They can be applied to bilevel optimization problems. Please check the studies on structural optimization (topology, shape, and cross-sectional area optimization of structural trusses) by metaheuristics.
Well, I happen to have a few papers on topology optimisation, and none of them were based on metaheuristics but on nonlinear programming - in fact bilevel optimisation, which was highly successful because of favourable properties of the "inner" optimisation problem - or the "lower-level problem", as it is sometimes called. In every case we obtained global optima, since our approach is a global one. I believe there are industrial enterprises that still use one of our codes. Like clockwork.
Topology optimisation is a mature scientific domain, and there are many papers that describe how to optimise structures; I am sure you can borrow from plenty of papers in this field, with very good results. I wish you good luck and look forward to hearing how it goes. :-)
Rick Manner It's still wrong, whether it is in two dimensions or in thousands:
In your vocabulary, you are evidently looking for a maximum of the function. Well, if your feasible set is not full-dimensional (as in not having an interior), you need to come up with a way to find a direction that does not immediately throw you out of the feasible region, or you need to project the point you obtain onto the feasible set.
The most elementary method (the projection method) is like the steepest ascent method: you pick a rather short step in the direction of the gradient of the objective function, and then force the next iterate to be the closest point in the feasible region. When the feasible set is a polyhedron this is easy (by performing a Euclidean projection onto the feasible set), but for more general sets - such as those defined by nonlinear constraints - you have a numerical problem. Sequential Quadratic Programming (SQP) is the general method for such problems. The result of an execution of the SQP method depends on the convexity properties of the original problem, and for an efficient execution certain "fixes" are needed so that the directions you move in are always ascent directions. (A minimal sketch follows below.)
This is not at all trivial, especially if the problem is non-convex. It took many years to build a foundation for methods like these, and since nonlinear programming is vast, you WILL end up with numerical problems sooner or later, especially in high dimensions.
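As a concrete companion to the SQP remark above, here is a minimal sketch of handing a nonlinearly constrained problem to SciPy's SQP-type solver SLSQP; the objective and constraint are invented for illustration, and for a nonconvex problem the returned point is only a local (KKT) solution, as noted above.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative problem: maximize f(x, y) = -(x-1)^2 - (y-2)^2
# subject to the nonlinear constraint x^2 + y^2 <= 2.
# SciPy minimizes, so we negate the objective.
def neg_f(x):
    return (x[0] - 1)**2 + (x[1] - 2)**2

# SciPy's "ineq" convention is g(x) >= 0, so rewrite the constraint.
cons = [{"type": "ineq", "fun": lambda x: 2.0 - x[0]**2 - x[1]**2}]

res = minimize(neg_f, x0=np.array([0.0, 0.0]), method="SLSQP", constraints=cons)
print(res.x)  # a KKT point; guaranteed global only under convexity
```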
Yassine Idel Mahjoub Not at all, in fact: in linear optimisation we have at least two natural options - the simplex method, or any of the variants of interior point methods. While the simplex method is simple, and quite natural when you look at a two-dimensional problem, it is in general not the fastest - most interior-point methods are faster in practice. If the problem stems from a network flow problem, then the simplex method is quick too.
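To illustrate the two options in code, SciPy's linprog exposes both a dual simplex and an interior-point variant of the HiGHS solver; the toy data below is made up, and on an instance this small either method is instant.

```python
import numpy as np
from scipy.optimize import linprog

# Toy LP: minimize c^T x subject to A_ub x <= b_ub, x >= 0.
c = np.array([-1.0, -2.0])
A_ub = np.array([[1.0, 1.0], [1.0, 3.0]])
b_ub = np.array([4.0, 6.0])

# Same model, two algorithm families:
res_simplex = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs-ds")   # dual simplex
res_ipm = linprog(c, A_ub=A_ub, b_ub=b_ub, method="highs-ipm")      # interior point
print(res_simplex.x, res_ipm.x)  # both optimal; speed differs on large instances
```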
This depends on the nature of the problem to be optimized. I do not think there is a general optimization model or algorithm that can be applied to any problem. Even for (student) problem sheets, an optimization model could fit one problem and be useless for another.
That's what I said earlier: but if the problem is, for example, a linear optimisation problem, then we do have a tool that solves every instance (that has an optimal solution). Not all other problem classes have that luxury.
Michael Patriksson That's interesting. My comment was actually a simplified description of the steepest ascent method. Keep going uphill until you reach the top - it has always worked for me, with just a few minor modifications.
Well, when you have constraints in the problem, you do need to adjust the procedure somewhat. Here is an elementary description of the "gradient projection method" - known since the 1960s:
You use the steepest descent direction and adjust a step length in order to lower the value of the objective function, followed (possibly) by a Euclidean projection onto the feasible set (if the resulting point falls outside it). This procedure is used almost exclusively when the feasible set is a convex polyhedron, as such projections are easy to perform very fast.
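A minimal sketch of that procedure for the simplest case - a box-constrained problem, where the Euclidean projection is just a componentwise clip; the objective, box, and step length are invented for illustration.

```python
import numpy as np

def grad_f(x):  # gradient of the illustrative objective f(x) = ||x - t||^2 / 2
    target = np.array([2.0, -3.0])
    return x - target

lo, hi = np.array([0.0, -1.0]), np.array([1.0, 1.0])  # box: lo <= x <= hi

x = np.zeros(2)
step = 0.1
for _ in range(200):
    x = x - step * grad_f(x)   # steepest descent step ...
    x = np.clip(x, lo, hi)     # ... then Euclidean projection onto the box
print(x)  # converges to (1.0, -1.0), the closest feasible point to the target
```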
I became very familiar with these things as a postdoc at the University of Washington in Seattle, with R.T. Rockafellar; Jim Burke is also a professor at that institution.
I also find it rather nice that this procedure for problems WITH constraints is IDENTICAL to the more general procedure when there are NO constraints in the problem: without the constraints, the procedure becomes just the steepest descent method. I like these synergies very much.
Rick Manner In your excellent post you highlighted the most important issues in the search for the maximum: the starting point, the right direction of improvement, the need to modify the step near the maximum, local maxima, and the stopping criteria. In the presence of constraints, obviously, one does not necessarily have to "climb to the top of the hill", but may stay on its slope at one point or at the same level. But that is a completely different problem.
Michael Patriksson Well, most of the problems discussed on RG are not fully specified. Often one has to guess what the question is about. The questioner is not always interested in the answer, either.
Moving on to the search for extrema of functions with or without constraints, there is a lot of news here. But many of the commonly used methods were developed between the 1940s and 1970s. You propose to use the "gradient projection method", but it is easier to modify any method using a penalty function (a minimal sketch follows below).
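For illustration, here is a minimal sketch of that penalty-function idea: fold the constraint violation into the objective as a quadratic penalty and re-solve with an off-the-shelf unconstrained method while increasing the penalty weight; the toy problem and schedule are invented.

```python
import numpy as np
from scipy.optimize import minimize

# Illustrative problem: minimize f(x) = x1^2 + x2^2 subject to x1 + x2 = 1.
def penalized(x, mu):
    violation = x[0] + x[1] - 1.0
    return x[0]**2 + x[1]**2 + mu * violation**2  # quadratic penalty term

x = np.array([0.0, 0.0])
for mu in [1.0, 10.0, 100.0, 1000.0]:
    # Re-solve the unconstrained penalized problem with an increasing
    # penalty weight, warm-starting from the previous solution.
    x = minimize(lambda z: penalized(z, mu), x).x
print(x)  # approaches the true constrained solution (0.5, 0.5) as mu grows
```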
I'm pretty sure my post was not "perfect", but it is probably good enough for most cases. I usually back it up by keeping track of past results and reverting to interval halving when steepest ascent runs into problems.
You described what is most important in the search for the maximum in a very clear and illustrative way. You used the analogy of climbing to the top of a hill, step by step, looking from side to side and around. I liked it very much.
This is a very general question. The best optimization method depends on the problem at hand. If the problem has a special structure, you may benefit from decomposition, or from methods that handle side constraints on top of some other special method (like network simplex). If the problem is a hard integer programming problem, then you may benefit from generating specialized cuts and using a branch-and-cut type of solution method. Metaheuristics are usually the last resort, after testing and trying the algorithms that guarantee optimality.
In all honesty, no one can give you an exact answer to this question. So many factors are involved, such as problem type, number of constraints, etc. However, I would advise using the correct optimization method for the correct problem. Whenever possible, use exact methods to obtain solutions (preferably differential calculus); otherwise you can sacrifice solution quality for computational speed and use metaheuristics. Try doing a quick literature review of previous solution attempts for your problem. That should help you decide which method works best.