It is clear that a meta-heuristic optimization algorithm is needed, with the objective of maximizing the availability of medical supplies subject to constraints such as import restrictions. The population could consist of those who consume these supplies, such as pharmacies, the general public, medical bodies, etc. You can also give more weight to some of these variables.
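A minimal sketch of what such a setup could look like - a small real-coded genetic algorithm allocating a limited import quota across consumer groups, with a weighted availability objective. All demands, weights and the quota below are invented placeholders:

```python
import numpy as np

rng = np.random.default_rng(0)
QUOTA = 100.0                          # import restriction (total units)
demand = np.array([40.0, 80.0, 30.0])  # pharmacies, general public, medical bodies
weight = np.array([0.3, 0.3, 0.4])     # medical bodies weighted highest

def fitness(alloc):
    # availability = fraction of each group's demand that is covered
    return float(weight @ np.minimum(alloc / demand, 1.0))

def random_alloc():
    a = rng.random(3)
    return a / a.sum() * QUOTA         # respect the import quota

pop = [random_alloc() for _ in range(50)]
for _ in range(200):
    pop.sort(key=fitness, reverse=True)
    parents = pop[:25]                 # truncation selection
    children = []
    for _ in range(25):
        p, q = rng.choice(25, 2, replace=False)
        child = (parents[p] + parents[q]) / 2       # arithmetic crossover
        child += rng.normal(0, 2.0, 3)              # mutation
        child = np.clip(child, 0, None)
        if child.sum() > 0:
            child = child / child.sum() * QUOTA     # repair to the quota
        children.append(child)
    pop = parents + children

best = max(pop, key=fitness)
print(best, fitness(best))
```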
Resource allocation optimization in many shapes and forms (e.g., hospital bed allocation), production and distribution of everything you can think of, scheduling of nurses/doctors, waste disposal, ... there is no end!
In computer science and operations research, a genetic algorithm (GA) is a metaheuristic inspired by the process of natural selection that belongs to the larger class of evolutionary algorithms (EA). GA is widely treated as an optimization algorithm in the literature, as evidenced by the many problems solved with it.
According to Wikipedia: https://en.wikipedia.org/wiki/Genetic_algorithm
COVID-19 related processes are highly stochastic by nature. I would first of all think about multistage stochastic optimization. However, there is the problem of multiple criteria to be taken into account, and of missing information on probabilities.
Mohamad M. Awad It is definitely a weaker form of solution approach, as you should know. The problem you state will more than likely not lead you to an optimum, but somewhere else. That is the nature of metaheuristics, as you will see if you look it up in other scientific fora.
One sample does not make a theory. In the majority of cases mathematical optimization is to be preferred, as its methods are based on optimality principles going all the way back to Karush, Kuhn, and Tucker.
Many thanks for the question and the replies already posted. To me, the question is not clear. Perhaps the question is: what methods should be used to solve optimization problems related to the various situations arising from the spread of covid-19? This is what the replies have addressed. If so, a method (be it linear programming, quadratic programming, nonlinear programming, mixed-integer nonlinear programming, etc.) appropriate for the optimization problem should be chosen and used. This should be the approach for solving any optimization problem.
Metaheuristics are not derived from any principles of optimality; that is what I mean. To take a very simple example in unconstrained optimization, a criterion for a vector to be a solution candidate is that all partial derivatives are zero. If we have constraints, we instead check the Karush-Kuhn-Tucker (KKT) conditions to see whether the vector we find might be optimal.
Metaheuristics, on the other hand, are not built on any valid stopping criterion, in contrast to the two examples above from unconstrained and constrained optimization. Also, those two kinds of optimization methods are based on a descending sequence of objective values. I know of a few heuristics that can also provide, where possible, a descending sequence of objective values, such as simulated annealing, but that methodology cannot guarantee finding an optimum or even a stationary point.
What I am trying to say is that the world of mathematical optimization provides much more mathematically rigorous guarantees of finding a vector that has the right properties - such as satisfying the KKT conditions, or being a stationary point.
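To make the contrast concrete, here is a small sketch (the test function is an arbitrary smooth choice): a gradient-based method returns a point we can certify via the stationarity criterion, while a naive random search stops only because its iteration budget runs out - no certificate at all.

```python
import numpy as np
from scipy.optimize import minimize

f    = lambda x: (x[0] - 1) ** 2 + 10 * (x[1] + 2) ** 2
grad = lambda x: np.array([2 * (x[0] - 1), 20 * (x[1] + 2)])

# gradient-based: the returned point can be checked against ||grad|| ~ 0
res = minimize(f, x0=[5.0, 5.0], jac=grad, method="BFGS")
print("BFGS point:", res.x, "||grad|| =", np.linalg.norm(grad(res.x)))

# naive random search: the stopping rule is just a budget, nothing else
rng = np.random.default_rng(0)
best = np.array([5.0, 5.0])
for _ in range(1000):
    cand = best + rng.normal(0, 0.5, 2)
    if f(cand) < f(best):
        best = cand
print("random search point:", best, "||grad|| =", np.linalg.norm(grad(best)))
```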
I agree with Professor Rangaiah on the existing methods of optimization. But I feel that more specific information is required to reach the destination.
If the question concerns optimizing the conditions under which the coronavirus spreads and infects healthy people, I think metaheuristics or mixed-integer nonlinear programming are preferable.
Utilising stratified sampling well will be very important - as we cannot test everyone, we need to shrink the samples, and yet have enough to go by so that the outcome is representative. This requires a rather sophisticated scholar with real mathematical optimisation skills.
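For illustration, a small sketch of how a fixed testing budget might be split across strata - proportional allocation next to Neyman allocation (which weights by within-stratum variability); all population figures and variability guesses below are invented:

```python
import numpy as np

budget = 10_000                                          # tests available
stratum_size = np.array([500_000, 300_000, 200_000])     # e.g., by region/age
stratum_std  = np.array([0.02, 0.05, 0.01])              # guessed prevalence spread

# proportional allocation: n_h proportional to N_h
prop = budget * stratum_size / stratum_size.sum()

# Neyman allocation: n_h proportional to N_h * S_h
# (fewer tests where outcomes vary little)
neyman = budget * (stratum_size * stratum_std) / (stratum_size * stratum_std).sum()

print("proportional:", np.round(prop))
print("Neyman:      ", np.round(neyman))
```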
It is rather sad to read the posts that refer - very quickly - to what optimization routine we should use (always a metaheuristic, of course - what else??), when we have not even begun to hear/read what type of optimization problem we should construct! And how to provide the right data! And to look at the model's properties, in order to solve it in the best way possible (that is, the most efficient way).
I have said it before, and I say it again: first make up your mind what the optimization problem should be, in terms of variable definition, constraints that make sense, and a suitable objective function.
Only THEN - when an optimization model that we can analyse is actually in place (following an analysis of which variable declaration is the most suitable! - that is extremely important!!) - can we debate what methodology we should use, or develop, for its solution.
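For illustration only, here is a deliberately tiny sketch of that order of work - variables first, then constraints, then the objective - using PuLP as one possible modelling library (the numbers mean nothing; the point is the discipline, not the toy model):

```python
from pulp import LpProblem, LpMaximize, LpVariable, lpSum

model = LpProblem("toy_supply_model", LpMaximize)

# 1) variable definition: units shipped to two regions
x = [LpVariable(f"x{i}", lowBound=0) for i in range(2)]

# 2) constraints that make sense: a shared stock limit
model += lpSum(x) <= 100, "stock"

# 3) a suitable objective function: weighted coverage
model += 3 * x[0] + 2 * x[1]

model.solve()
print([v.value() for v in x])
```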
It is blatantly clear that there is something fundamentally amiss in the thinking of those who are keen on (only) meta-heuristics, and it is this: the model itself, and how best to represent it as an optimization problem, seem to be immaterial, as the only interest is to run it.
With that mindset, you will fail, every time. The model is crucial, in tandem with the solution tools that you have. Only when we have an optimisation model that makes sense can we fill it with the right data (also a rather complex undertaking!) and see how it should be solved.
But in the meta-heuristics community it appears that the modelling stage - deciding upon the appearance of variables and constraints - is something to gloss over. I find that hideous - so very wrong that it is mind-boggling.
Indeed, the spread would most likely best be described through a stochastic PDE of sorts, and the fight against it as a stochastic control problem of the SMPEC type.
Dear Michael Patriksson, I strongly agree with your answer about the need to pay more attention to the modeling step. I think this topic deserves deeper elaboration. I have a dream of a book ("The Art of Mathematical Modeling") that tackles this topic as thoroughly as Knuth did in his awesome "The Art of Computer Programming".
The failure to discuss the best way to set up a mathematical model before worrying about how to solve it is something that permeates the whole of RG, I would say.
Another is to have opinions that are never backed up by an analysis. In English I just now saw that the term is "guy guessing." (Somehow I think it is a cousin to mansplaining.)
In my humble opinion, the optimization method used depends on the form of the model adopted for the Covid-19 pandemic - the spread, the recovery and the death toll. The Covid-19 infection process is very complex and nonlinear, involving many factors across different length and time scales, which means there is no universal model for this kind of phenomenon. The uncertainties involved in such modeling are very high, including the large influence of noise in the measurements. If a model can account for 60% of the infection process, that is considered quite good given the uncertainties involved.

The optimization is used to find the model parameter values that give the best fit to the real data. But whether the model can give any useful physical insight depends on the model postulation and structure. Some model structures may give quite a good fit to the data but be unable to shed any useful insight into the process, beyond curve fitting. Other model structures are relatively poor at fitting the real data but may be able to give one or two physical hints about the complex process.
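As a sketch of the fitting step described above, one can estimate SIR-model parameters by least squares. The "observations" below are synthetic, generated from the model itself plus noise, so this only illustrates the mechanics, not a real covid-19 fit:

```python
import numpy as np
from scipy.integrate import odeint
from scipy.optimize import minimize

N = 1_000_000                        # assumed population size
t = np.arange(60)                    # 60 days of observations

def sir(y, t, beta, gamma):
    S, I, R = y
    dS = -beta * S * I / N
    dI = beta * S * I / N - gamma * I
    return dS, dI, gamma * I

# synthetic noisy case counts from "true" parameters (0.35, 0.12)
rng = np.random.default_rng(0)
y0 = (N - 10.0, 10.0, 0.0)
truth = odeint(sir, y0, t, args=(0.35, 0.12))[:, 1]
cases = truth * rng.lognormal(0.0, 0.05, t.size)

def loss(p):
    sol = odeint(sir, y0, t, args=tuple(p))
    return np.sum((sol[:, 1] - cases) ** 2)   # fit the I(t) trajectory

res = minimize(loss, x0=[0.5, 0.2], bounds=[(1e-6, 2), (1e-6, 1)],
               method="L-BFGS-B")
print("beta, gamma =", res.x, " R0 ~", res.x[0] / res.x[1])
```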
A few days ago there was a call for papers: covid-19 and systems and control theory! We are now facing a tremendous control problem under uncertainty, where the system consists of many connected subsystems, and where there are many controllers. Appropriate terms here are: feedback control, learning, uncertainty. But in order to be of any help, the chosen models should be robust!
Look up Stochastic Mathematical Program with Equilibrium Constraints (SMPEC).
There are a few papers based on this formalism, especially a two-part paper in the journal JOTA (Journal of Optimization Theory and Applications), written by me and Laura Wynter. If your problem fits and you have enough data, then you may create a model to run scenarios. We have used it in topology optimisation, as well as in medical applications.
Overcome? I doubt it; but control and analyse? Yes. There are many mathematical optimization methods to be considered in line with the model formulation: optimal control, sensitivity analysis, uncertainty or stochastic approaches, to mention but a few.
I am quite confident that data is available, or can be dug up, but I am not sure whether SMPEC is a worthy formalism through which we can efficiently investigate the control problem(s) that need to be addressed.
Vol. 57, No. 3, June 2008, 395–418, is an example of an SMPEC. That is essentially a two-stage stochastic programming problem and hence, in my opinion, not suitable in the context of controlling the covid-19 pandemic, because there one is looking for suitable feedback controls, where the time index runs over several months, and where one should model when and where to take appropriate measurements, and when and how to change lock-down parameters.
As I wrote before, this typically asks for people with a reasonable background in systems and control. To be continued...
Perhaps to maximise the distance between individuals, in order to minimise the spread of covid-19. Include, if you wish, time-dependent wind direction, and other data that you can come up with. But this is not a serious post, as I am sure you have already figured out.
First you have to build the model, which means you should identify the parameters or variables that characterize covid-19 itself, the constraints that affect the life of the virus, and finally the objective of your model, which is to minimize the lifetime of the virus. I think goal programming is the best approach for solving this problem.
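A minimal sketch of goal programming on invented targets - deviation variables absorb the misses from each goal, and we minimize their weighted sum (none of this is an actual covid-19 model):

```python
import numpy as np
from scipy.optimize import linprog

# decision variables x1, x2 plus deviation variables (d1-, d1+, d2-, d2+)
# goal 1: x1 + x2   ~ 100   (e.g., a coverage target)
# goal 2: 2*x1 + x2 ~ 150   (e.g., a budget target)
# encoded as: x1 + x2 + d1- - d1+ = 100,  2*x1 + x2 + d2- - d2+ = 150
A_eq = np.array([[1, 1, 1, -1, 0, 0],
                 [2, 1, 0, 0, 1, -1]], dtype=float)
b_eq = np.array([100.0, 150.0])

# objective: minimize the (equally weighted) total deviation from the goals
c = np.array([0, 0, 1, 1, 1, 1], dtype=float)

res = linprog(c, A_eq=A_eq, b_eq=b_eq)     # all variables >= 0 by default
print("x =", res.x[:2], "total deviation =", res.fun)
```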
The first thing needed to minimize its spread is social distancing, via awareness raised through education by states. The second is medical research; the third is digital representation and detection via computational tools, i.e., IT + AI.
It depends on whether you mean it at the macro level, i.e. to reduce the spread of the virus and lead to its eventual die-out, or at the micro level, i.e. to find mechanisms to prevent infection and the development of the virus. For the latter, optimization will not help much; the problem is too ill-defined for an optimization model to be used. For the former, you can formulate a large-scale optimization model that considers infection rates, availability of vaccines, cold-storage facility locations, population densities in different regions, quarantine and treatment facilities, and all related parameters, and optimize the distribution of the vaccines and other interventions to minimize illness occurrence and the spread of the virus.
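A heavily reduced sketch of such a macro-level model - a transportation-style LP shipping doses from cold-storage depots to regions at minimum cost, meeting each region's demand; every number below is made up:

```python
import numpy as np
from scipy.optimize import linprog

supply = np.array([50_000, 80_000])              # doses at 2 depots
demand = np.array([30_000, 40_000, 50_000])      # doses needed in 3 regions
cost = np.array([[4.0, 6.0, 9.0],                # cost per dose, depot -> region
                 [5.0, 3.0, 7.0]])

n_depots, n_regions = cost.shape
c = cost.ravel()                                  # x[i, j] flattened row-major

# each depot ships at most its stock: sum_j x[i, j] <= supply[i]
A_ub = np.zeros((n_depots, n_depots * n_regions))
for i in range(n_depots):
    A_ub[i, i * n_regions:(i + 1) * n_regions] = 1.0

# each region receives exactly its demand: sum_i x[i, j] = demand[j]
A_eq = np.zeros((n_regions, n_depots * n_regions))
for j in range(n_regions):
    A_eq[j, j::n_regions] = 1.0

res = linprog(c, A_ub=A_ub, b_ub=supply, A_eq=A_eq, b_eq=demand)
print(res.x.reshape(n_depots, n_regions))         # optimal shipment plan
```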
Apart from purely medical issues and epidemiology, there are many more traditional OR problems, connected with efficient resource use for organizing vaccination campaigns, storage and logistics, keeping in mind limited capacities and the finite shelf lives of vaccines, etc. These can be handled by standard optimization models and algorithms.
Speaking of mathematical methods that could be useful here, what comes to my mind is stochastic programming. Of course, all prediction methods can also be taken into account.
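As a sketch of what scenario-based stochastic programming looks like in its simplest two-stage form: choose a stock level now (first stage), then pay recourse costs for shortage or waste once uncertain demand is revealed. All scenario demands, probabilities and unit costs are invented:

```python
import numpy as np
from scipy.optimize import linprog

demand = np.array([800.0, 1000.0, 1400.0])  # demand in 3 scenarios
prob   = np.array([0.3, 0.5, 0.2])
buy, short, waste = 1.0, 5.0, 0.5           # unit costs

# variables: [x, s1, s2, s3, w1, w2, w3]
# minimize buy*x + sum_k prob[k] * (short*s_k + waste*w_k)
c = np.concatenate(([buy], prob * short, prob * waste))

# recourse links: x + s_k - w_k = demand_k for each scenario k
# (shortage s_k fills the gap, waste w_k absorbs the surplus)
A_eq = np.zeros((3, 7))
A_eq[:, 0] = 1.0
for k in range(3):
    A_eq[k, 1 + k] = 1.0      # s_k
    A_eq[k, 4 + k] = -1.0     # w_k

res = linprog(c, A_eq=A_eq, b_eq=demand)
print("order now:", res.x[0])               # hedges between the scenarios
```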
But as one can see, the mathematical methods did not help much, because millions have died - and unfortunately the pandemic continues, although it is milder.
Mathematical modelling is the ultimate solution, in particular for the classification of viruses, e.g., whether the virus is some variant of influenza or some kind or class of coronavirus. There are several ways of analysing such viruses, and all of them need statistical modelling or density-estimation techniques, e.g., the Gaussian, von Mises-Fisher, Langevin or Bingham distributions, etc.
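As a small sketch of the density-estimation idea, assuming one has numerical descriptors of virus samples (the toy data below stands in for two hypothetical classes), a Gaussian mixture can be fitted and used to score or classify samples:

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# toy 2-D "descriptor" data drawn from two groups, standing in for
# two hypothetical virus classes
X = np.vstack([rng.normal(0, 1, (200, 2)), rng.normal(4, 1, (200, 2))])

gm = GaussianMixture(n_components=2, random_state=0).fit(X)
print(gm.means_)                   # estimated class centres
print(gm.predict(X[:5]))           # hard class assignments
print(gm.score_samples(X[:5]))     # log-density under the fitted model
```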