I would like to emulate a FE model of a stochastic, heterogeneous, anisotropic material subjected to loading. Due to the high computational cost of this model, I would like to replace the FE model with a metamodel. Can anyone give me some suggestions?
You should definitely try out Kriging and polynomial chaos expansions as surrogate models.
You can find surrogate modelling tools in the software UQLab (www.uqlab.com), which is Matlab-based, free to use for academics, and gathers state-of-the-art algorithms for this purpose (but also global sensitivity analysis, reliability analysis, etc.).
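To illustrate the Kriging idea, here is a minimal sketch in Python using scikit-learn's Gaussian process regressor (UQLab itself is Matlab-based). The function `fe_model` is a hypothetical cheap stand-in for the expensive FE run, so the example is self-contained:

```python
# Kriging (Gaussian process) surrogate sketch, assuming the expensive
# FE model can be wrapped as a plain function of the input parameters.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

def fe_model(x):
    # Hypothetical stand-in for an expensive FE run, so the
    # sketch is runnable; replace with a call to your solver.
    return np.sin(3.0 * x) + 0.5 * x**2

# A small design of experiments: a handful of "FE runs"
X_train = np.linspace(0.0, 2.0, 8).reshape(-1, 1)
y_train = fe_model(X_train).ravel()

kernel = ConstantKernel(1.0) * RBF(length_scale=0.5)
gp = GaussianProcessRegressor(kernel=kernel, n_restarts_optimizer=5)
gp.fit(X_train, y_train)

# The surrogate now predicts, with an uncertainty estimate,
# at inputs where the FE model was never run
X_new = np.array([[0.75]])
y_pred, y_std = gp.predict(X_new, return_std=True)
```

The predictive standard deviation `y_std` is what makes Kriging attractive here: it tells you where the surrogate is unsure and where additional FE runs would be most informative.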
Another alternative is to model your material in the FE analysis as a heterogeneous orthotropic material, the closest approximation to anisotropy.
I have done that for bone. Please refer to the following for more information:
Geraldes D. M., and Phillips A. T. M. (2014), A comparative study of orthotropic and isotropic bone adaptation in the femur, International Journal for Numerical Methods in Biomedical Engineering, 30, pages 873–889, doi: 10.1002/cnm.2633
If you are looking for simplified models based on the mechanics, then the previous answers may be useful.
Yet I guess you are looking for a technique to replace your finite element model with a response surface (i.e. a meta-model) which is then fast to run, so that you can change some of the parameters of your FE analysis (e.g. material constants describing the anisotropy, loading, etc.) and obtain the response without running the FE model again and again. Am I correct?
If so, the most efficient methods today are polynomial chaos expansions and Gaussian process modelling (also known as Kriging). Their efficiency depends on the number of parameters that are varied, how linear the output quantities of interest are with respect to these parameters, the number of FE model runs you can afford for building the meta-model, etc.
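As a companion to the Kriging suggestion, here is a minimal polynomial chaos expansion sketch for a single standard-Gaussian input parameter, fitted by least-squares regression. The function `model` is a hypothetical stand-in for the FE quantity of interest:

```python
# Polynomial chaos expansion (PCE) sketch: expand a scalar output
# onto probabilists' Hermite polynomials of a Gaussian input.
import numpy as np
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(0)

def model(xi):
    # Hypothetical stand-in for the FE quantity of interest
    return np.exp(0.3 * xi) + 0.1 * xi**2

# Sample the standard-normal input and evaluate the "FE model"
xi = rng.standard_normal(200)
y = model(xi)

# Regress the output onto He_0 .. He_4 (probabilists' Hermite basis)
degree = 4
Psi = hermevander(xi, degree)                    # 200 x 5 design matrix
coeffs, *_ = np.linalg.lstsq(Psi, y, rcond=None)

# By orthogonality of the basis, the zeroth coefficient is an
# estimate of the mean of y; higher coefficients carry the variance.
mean_estimate = coeffs[0]
```

Once the coefficients are known, evaluating the expansion is trivially cheap, and moments and Sobol' sensitivity indices follow directly from the coefficients, which is part of why PCE pairs so well with global sensitivity analysis.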
If you give us more details, I could be more specific.
It really depends on the nature of your material, and the data you have. Material homogenization can be very general.
Some people use PGD (proper generalized decomposition) in space to reduce the anisotropy problem. You may also want to look at implementations of surrogate models using statistical regression algorithms (common and popular in structural health monitoring applications), trained on experimental data or numerical samples.
As suggested by the above authors, you can use Design of Experiments (DoE) theory and so-called surrogate models (Response Surface Models), with or without Kriging, where you interpolate between computed results as a way to speed up optimization. This works quite well for many situations, but not all; one must then shift to a global optimizer such as Genetic Algorithms or similar.
DoE is very powerful, in particular when used to screen the design space for dominant variables using, e.g., Taguchi-type tests.
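The screening step can be sketched with a two-level full-factorial design and a main-effects ranking (in the spirit of Taguchi-type tests); the `response` function below is a hypothetical stand-in for the FE model, built so that one variable dominates:

```python
# Two-level full-factorial screening: rank factors by main effect.
from itertools import product

def response(a, b, c):
    # Hypothetical stand-in for the FE model: 'a' dominates,
    # 'b' matters a little, 'c' is nearly inert.
    return 5.0 * a + 1.0 * b + 0.1 * c

levels = [-1, 1]
runs = [(a, b, c, response(a, b, c))
        for a, b, c in product(levels, repeat=3)]   # 2^3 = 8 runs

def main_effect(idx):
    # Mean response at the high level minus mean at the low level
    hi = [r[3] for r in runs if r[idx] == 1]
    lo = [r[3] for r in runs if r[idx] == -1]
    return sum(hi) / len(hi) - sum(lo) / len(lo)

effects = {name: main_effect(i) for i, name in enumerate("abc")}
# 'a' emerges as the dominant variable and would be kept for the
# subsequent response-surface or optimization study.
```

With many parameters, a fractional-factorial or orthogonal-array design replaces the full factorial so the number of FE runs stays affordable, which is exactly the screening role described above.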
I used to do this using Optimus, which is a nice and easy-to-use system:
http://www.noesissolutions.com/
The neat thing is first to automate the analysis sequence, then to select the method of execution (table, optimization, DoE, etc.) and to execute in parallel when possible.
Here is one such example where DoE was used to arrive at dB/kg curves, where each point on the dB/kg curve is an optimum (as there was a trade-off between noise and weight).