Perhaps you are looking for something different from the "classical" statistical bias, something like Lipschitz stability. You may take a look at, e.g., the following:
The pioneering contribution of Robinson [1] put the focus on single-valued Lipschitzian behavior of optimal solutions.
Steftcho P. Dokov Thank you for the reply; the fact that my query is reaching experts is reassuring. I am going through the contents of the paper.
Steftcho P. Dokov Your recommendation to explore Lipschitz stability was quite a helpful insight. I also learnt that bi-Lipschitz functions happen to be injective.
To summarize, what I gathered is that if the objective function of a multivariate optimization problem is bi-Lipschitz, then there exists a bound (in a sense) on the total derivative of the objective function with respect to the input parameters.
What I am looking for is a little more than that: I want to enforce a constraint on a sort of ratio between the partial derivatives with respect to the input parameters, in the hope that, at any given point, the objective function does not have an unboundedly higher bias/sensitivity towards any one of the input parameters.
Any insight into this aspect would be of great help.
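For concreteness, one possible way to formalize "no unboundedly higher sensitivity towards any one input" is to bound the ratio of the largest to the smallest partial-derivative magnitude at each point. A minimal sketch, where the objective f, its gradient grad_f, and the bound R_max are all illustrative assumptions, not from any of the cited papers:

```python
import numpy as np

def sensitivity_ratio(grad, eps=1e-12):
    # Ratio of largest to smallest partial-derivative magnitude;
    # eps guards against division by zero.
    g = np.abs(grad)
    return g.max() / (g.min() + eps)

# Hypothetical objective with deliberately unequal sensitivities
def f(x):
    return x[0]**2 + 10.0 * x[1]**2

def grad_f(x):
    return np.array([2.0 * x[0], 20.0 * x[1]])

x = np.array([1.0, 1.0])
r = sensitivity_ratio(grad_f(x))   # here r = 20/2 = 10
R_max = 5.0                        # chosen bound (illustrative)
print(r <= R_max)                  # constraint violated at this point
```

A constraint of the form sensitivity_ratio(grad_f(x)) <= R_max could then be imposed pointwise (or over the feasible region) in the optimization, though whether a solver handles it well depends on the problem.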
"... the rate of change in risk (objective function) ... with respect to a small percentage change in the size of a portfolio allocation weight",
however, this is a very specific example from portfolio theory.
Similarly, the notion of elasticity from economics (see the attachment) might be another example of such a (bias) measure. The magnitude of the elasticity (a ratio) could be of any size, so a bound on the elasticity should be imposed as a constraint, I guess.
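To illustrate the elasticity idea numerically: the point elasticity of f with respect to x_i is (df/dx_i) * x_i / f(x), i.e. the percentage change in f per percentage change in x_i. A small sketch, assuming a Cobb-Douglas-style function (chosen only because its elasticities are exactly the exponents, so the result is easy to check):

```python
import numpy as np

def elasticity(f, x, i, h=1e-6):
    # Point elasticity of f w.r.t. x[i]: (df/dx_i) * x_i / f(x),
    # with the partial derivative approximated by central differences.
    xp = x.copy(); xp[i] += h
    xm = x.copy(); xm[i] -= h
    dfdxi = (f(xp) - f(xm)) / (2 * h)
    return dfdxi * x[i] / f(x)

# Hypothetical objective: for f = x0^0.3 * x1^0.7 the elasticities
# are 0.3 and 0.7 regardless of the evaluation point.
f = lambda x: x[0]**0.3 * x[1]**0.7
x = np.array([2.0, 5.0])
print(round(elasticity(f, x, 0), 3))  # ≈ 0.3
print(round(elasticity(f, x, 1), 3))  # ≈ 0.7
```

A bound such as |elasticity(f, x, i)| <= E_max for all i would then play the same role as the partial-derivative-ratio constraint, but in relative (percentage) terms.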