For an industrial application (layout planning) I am currently trying to globally optimize a discontinuous function f. The objective function is defined on a bounded parameter space in R^N, where the dimension N of this space depends on an initialization parameter; N typically lies between 30 and 100.
The goal is to run this optimization a number of times (each time for a slightly different layout) and afterwards choose the best one.
Currently I use the MLSL algorithm provided by the NLopt library to compute the global minimum of the objective. Especially as N grows, the time each run needs to obtain a good result increases considerably. This is why I am looking for a way to speed up my computations.
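For context, MLSL is at heart a multistart scheme (starting points scattered over the box, a local search from each, with clustering to avoid redundant local runs). The following toy pure-Python sketch of plain multistart (not my production code; the objective, bounds, and parameters are made up for illustration) shows the structure whose cost grows with N:

```python
import random

def multistart_minimize(f, lb, ub, n_starts=50, iters=200, seed=0):
    # Toy multistart: random starting points plus a simple pattern
    # search from each.  MLSL refines this idea by clustering starts
    # to skip redundant local searches, but the cost structure
    # (many local searches, each over N coordinates) is the same.
    rng = random.Random(seed)
    n = len(lb)
    best_x, best_v = None, float("inf")
    for _ in range(n_starts):
        x = [rng.uniform(lb[i], ub[i]) for i in range(n)]
        step = [0.25 * (ub[i] - lb[i]) for i in range(n)]
        v = f(x)
        for _ in range(iters):
            improved = False
            for i in range(n):
                for d in (1.0, -1.0):
                    # try a coordinate move, clipped to the box
                    y = list(x)
                    y[i] = min(max(y[i] + d * step[i], lb[i]), ub[i])
                    vy = f(y)
                    if vy < v:
                        x, v, improved = y, vy, True
            if not improved:
                step = [0.5 * s for s in step]  # shrink the pattern
                if max(step) < 1e-9:
                    break
        if v < best_v:
            best_x, best_v = x, v
    return best_x, best_v

# Usage: a 2-D stand-in for f, a smooth bowl plus a step discontinuity.
def objective(x):
    smooth = sum(xi * xi for xi in x)
    jump = 1.0 if x[0] > 0.5 else 0.0
    return smooth + jump

xbest, vbest = multistart_minimize(objective, [-2.0, -2.0], [2.0, 2.0])
```

Each of the n_starts local searches costs many function evaluations, and that count scales with the dimension, which matches what I observe with MLSL as N increases.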
From the structure of the objective function I know that it is oscillatory, which slows down the convergence of typical global optimization algorithms. On the other hand, f is the sum of a differentiable function and an upper semi-continuous step function, so in particular f is upper semi-continuous and almost everywhere differentiable. The objective is also bounded, and since it is defined on a bounded set, it is integrable.
My question now is: does anyone here have experience optimizing such functions (or, more generally, noisy or black-box functions), and which algorithms worked best for you?
In particular, since I know these details about my function: would it be possible to use a subgradient method, or to first smooth f with, say, a smoothing kernel phi, i.e. g = f * phi (convolution), and then optimize g to obtain an approximate minimizer of f?
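To make the smoothing idea concrete, this is roughly what I have in mind: estimate g(x) = E[f(x + sigma*u)] with u ~ N(0, I) by Monte Carlo, together with a gradient estimate in the spirit of the two-point Gaussian-smoothing estimator of Nesterov and Spokoiny. The sketch below is an illustration with a toy 1-D objective, not a tested implementation, and all names and parameter values are my own:

```python
import random

def smoothed_value_and_grad(f, x, sigma=0.2, m=64, rng=None):
    # Monte Carlo estimate of g(x) = E[f(x + sigma*u)], u ~ N(0, I),
    # and of grad g via the two-point (Nesterov-Spokoiny) estimator
    #   grad g(x) ~ E[ (f(x + sigma*u) - f(x)) / sigma * u ].
    # g is differentiable even where f jumps, so gradient-based
    # local search can be applied to g instead of f.
    rng = rng or random.Random(0)
    n = len(x)
    fx = f(x)
    val = 0.0
    grad = [0.0] * n
    for _ in range(m):
        u = [rng.gauss(0.0, 1.0) for _ in range(n)]
        fu = f([x[i] + sigma * u[i] for i in range(n)])
        val += fu
        w = (fu - fx) / sigma
        for i in range(n):
            grad[i] += w * u[i]
    return val / m, [g / m for g in grad]

# Usage: gradient descent on the smoothed surrogate of a 1-D stand-in
# for f (quadratic plus a step discontinuity at 0.5).
def objective(x):
    return x[0] * x[0] + (1.0 if x[0] > 0.5 else 0.0)

rng = random.Random(1)
x = [1.5]
for _ in range(200):
    _, g = smoothed_value_and_grad(objective, x, sigma=0.2, rng=rng)
    x = [x[0] - 0.05 * g[0]]
```

Note that the minimizer of the smoothed g is in general slightly shifted away from that of f (here the step pushes it a little below 0), so I would expect to need a final local polish on f itself.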