The performance of a simulated system is determined by several parameters and can only be measured by simulation. In this situation, how can one obtain an optimized design for the system?
Dear Liu Xiaolu, what do you mean by "optimized design for the system"? If you want to increase the precision of a surrogate model, I'd suggest performing parameter estimation, i.e. comparing the output of the surrogate model with that of the real system and adapting the parameters of the model accordingly.
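For example, in Python such a parameter estimation loop could look roughly like the sketch below. The surrogate function and the measurement data here are hypothetical placeholders, just to illustrate the fit-residuals-adapt cycle:

```python
import numpy as np
from scipy.optimize import least_squares

# Hypothetical surrogate: response as a function of inputs x and tunable parameters.
def surrogate(x, params):
    a, b = params
    return a * x + b * x**2

# Measurements of the real system at the same inputs (illustrative data).
x_data = np.linspace(0.0, 1.0, 20)
y_real = 1.5 * x_data + 0.8 * x_data**2 + np.random.normal(0.0, 0.02, x_data.size)

# Residuals between surrogate output and real-system output.
def residuals(params):
    return surrogate(x_data, params) - y_real

# Adapt the surrogate parameters to minimize the mismatch.
fit = least_squares(residuals, x0=[1.0, 1.0])
print("estimated parameters:", fit.x)
```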
Hello Liu Xiaolu, obviously you have developed a model (your surrogate model) of a real system, and this could / should serve as a basis for comparing your modelled, i.e. simulated, system against. Depending on your model, your approach, and the level of modelling detail, you may be able to make a direct comparison of absolute parameter values, as suggested by Radoslav Paulen, or at least compare relative changes or tendencies of similar parameters. You need to find out whether your model works reasonably well (evaluation) and does what the real system does with respect to your parameters of interest (justification). On the basis of a parameter estimation, as suggested by Radoslav, you can either just adapt the parameters or change / improve your model design (correcting / expanding).
Improving accuracy and compensating for uncertainty in surrogate modeling
Abstract:
In most engineering fields, numerical simulators are used to model complex phenomena and obtain high-fidelity analyses. Despite the growth of computing power, such simulators are limited by their computational cost, since a single simulation can take days. Surrogate modeling is a popular method for limiting this expense: it consists of replacing the expensive model with a simpler model (the surrogate) fitted to a few chosen simulations at a set of points called a design of experiments (DoE).
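As an illustration of the basic idea, the sketch below fits a cheap polynomial surrogate to a handful of runs of a stand-in simulator (the simulator function and the DoE are assumptions for illustration only):

```python
import numpy as np

# Stand-in for an expensive simulator (in practice, hours or days per call).
def expensive_simulator(x):
    return np.sin(3.0 * x) + 0.5 * x

# Design of experiments: the few points where the simulator is actually run.
rng = np.random.default_rng(0)
doe = np.sort(rng.uniform(0.0, 2.0, 8))
responses = expensive_simulator(doe)

# Cheap surrogate: cubic polynomial fitted by least squares to the DoE data.
coeffs = np.polyfit(doe, responses, deg=3)
surrogate = np.poly1d(coeffs)

# The surrogate is then evaluated everywhere instead of the simulator.
x_new = np.linspace(0.0, 2.0, 200)
y_pred = surrogate(x_new)
```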
By definition, a surrogate model contains uncertainties, since it is an approximation to an unknown function. A surrogate inherits uncertainty from two main sources: uncertainty in the data, and uncertainty due to the lack of data. First, the simulator is itself an approximation to a real phenomenon, and the confidence one can place in its responses depends on the quality of that approximation; the properties of the surrogate model strongly depend on the accuracy of the data. Second, in most applications the size of the DoE is severely limited by the computational cost, so the available information is insufficient to fit the surrogate model accurately.
One of the major challenges in surrogate modeling is controlling and compensating for these uncertainties. The present thesis proposes three ways to address these issues: (1) generating datasets that compensate for their uncertainty by erring on the conservative side of the analysis, (2) using statistical or prior information to obtain conservative predictions from the surrogate analysis, and (3) generating designs of experiments that minimize the uncertainty of the surrogate model.
The first part considers simulators based on Monte Carlo simulation (MCS), where the response of interest is a reliability measure of a system. By using resampling methods (bootstrapping), we compensate for the uncertainty in the analysis by erring on the conservative side, with reasonable impact on the accuracy of the response. An application to the optimization of a composite laminate under reliability constraints is proposed, the constraint being approximated by a polynomial surrogate fitted to conservative data.
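The bootstrapping idea can be sketched as follows; the limit-state model, failure threshold, and sample sizes are illustrative assumptions, not values from the thesis:

```python
import numpy as np

rng = np.random.default_rng(1)

# Monte Carlo samples of a limit-state response; failure when the response < 0.
mcs_samples = rng.normal(loc=2.0, scale=1.0, size=500)
p_fail_point = np.mean(mcs_samples < 0.0)

# Bootstrap: resample the MCS output to estimate the distribution
# of the failure-probability estimator.
n_boot = 2000
boot_estimates = np.empty(n_boot)
for i in range(n_boot):
    resample = rng.choice(mcs_samples, size=mcs_samples.size, replace=True)
    boot_estimates[i] = np.mean(resample < 0.0)

# Conservative estimate: an upper percentile of the bootstrap distribution,
# so the reported failure probability errs on the safe side.
p_fail_conservative = np.percentile(boot_estimates, 95)
print(p_fail_point, p_fail_conservative)
```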
The second part addresses the generation of conservative predictions using surrogate models. Three techniques are proposed: (1) biasing the surrogate by modifying the fitting process, (2) using prior information on the confidence in the data (safety factors and margins), and (3) using statistical information provided by the surrogate analysis. The analysis of the different techniques is supported by an analytical problem and a structural problem using finite element analysis.
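For instance, technique (3) can be sketched with a kriging (Gaussian process) surrogate whose prediction standard deviation is used to shift the prediction to the safe side; the kernel, the data, and the 90% confidence level below are illustrative choices, not the thesis's settings:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from scipy.stats import norm

# Small DoE with responses (illustrative).
X = np.array([[0.1], [0.4], [0.6], [0.9]])
y = np.sin(6.0 * X).ravel()

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.2), alpha=1e-8)
gp.fit(X, y)

# Conservative prediction: shift the kriging mean by a one-sided quantile of
# the prediction uncertainty (direction depends on what "safe" means here).
x_new = np.linspace(0.0, 1.0, 100).reshape(-1, 1)
mean, std = gp.predict(x_new, return_std=True)
conservative = mean + norm.ppf(0.90) * std
```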
In the third part, we propose an objective-based approach to surrogate modeling, based on the idea that uncertainty is better reduced where it is most useful than globally. An original criterion is proposed for choosing the design of experiments sequentially when the surrogate needs to be accurate at certain levels of the simulator response. The criterion is a trade-off between reducing the overall uncertainty of the surrogate and locally exploring the target regions. The effectiveness of the method is shown on a simple reliability analysis application.
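The flavor of such a sequential criterion can be sketched as below; this is a simplified stand-in, not the exact criterion from the thesis. The next DoE point maximizes the product of the local prediction uncertainty and a weight concentrating on a target response level T:

```python
import numpy as np
from scipy.stats import norm
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)

def simulator(x):
    return np.sin(6.0 * x).ravel()

T = 0.5                               # target response level of interest
X = rng.uniform(0.0, 1.0, (5, 1))     # initial DoE
y = simulator(X)

gp = GaussianProcessRegressor(kernel=RBF(0.2), alpha=1e-8)

for _ in range(10):                   # sequential enrichment of the DoE
    gp.fit(X, y)
    cand = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
    mean, std = gp.predict(cand, return_std=True)
    # Trade-off: high uncertainty (std) AND predicted response near T.
    weight = norm.pdf((T - mean) / np.maximum(std, 1e-12))
    criterion = std * weight
    x_next = cand[np.argmax(criterion)].reshape(1, -1)
    X = np.vstack([X, x_next])
    y = np.append(y, simulator(x_next))
```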
The last part considers simulators whose fidelity depends on tunable factors that control the complexity of the model (such as MCS-based simulators). For each simulation run, the user has to choose a trade-off between computational cost and response precision. Given a global computational budget for the DoE, two questions arise: (1) is it better to run a few accurate simulations or a large number of inaccurate ones, and (2) can the surrogate be improved by tuning the fidelity of each run individually? Answers are proposed for both questions. For polynomial regression, the problem connects to the well-explored theory of design optimality. For kriging, both numerical and analytical results are proposed; in particular, asymptotic results are given as the number of simulation runs tends to infinity.
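Question (1) can be explored numerically with a sketch like the one below, assuming (as an illustration only) a simulator whose noise variance is inversely proportional to the per-run cost and a fixed total budget; a kriging model is fitted to each allocation and the prediction errors are compared:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(3)
truth = lambda x: np.sin(6.0 * x).ravel()

def run_simulator(x, cost):
    # Assumed fidelity model: noise variance ~ 1 / cost per run.
    return truth(x) + rng.normal(0.0, np.sqrt(0.1 / cost), x.shape[0])

budget = 32.0
x_test = np.linspace(0.0, 1.0, 200).reshape(-1, 1)

for n_runs in (4, 16, 32):            # few accurate runs vs many noisy ones
    cost_per_run = budget / n_runs
    X = np.linspace(0.0, 1.0, n_runs).reshape(-1, 1)
    y = run_simulator(X, cost_per_run)
    # Kriging with a nugget matching the (known) noise variance of each run.
    gp = GaussianProcessRegressor(kernel=RBF(0.2), alpha=0.1 / cost_per_run)
    gp.fit(X, y)
    err = np.sqrt(np.mean((gp.predict(x_test) - truth(x_test)) ** 2))
    print(f"{n_runs:2d} runs, cost {cost_per_run:.1f} each -> RMSE {err:.3f}")
```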