Multivariate Richardson extrapolation: can anyone elaborate on the status, related algorithms, and programs? If yes, are the corresponding algorithms and programs available?
Oh! You might first want to review what the Bulirsch-Stoer method is: Numerical Recipes (Vetterling, Flannery, Teukolsky), chapter 16.4, serves as an introduction. Then go to Cuyt, A., Verdonk, B.: Different techniques for the construction of multivariate rational interpolants. In Cuyt, A., ed.: Nonlinear Numerical Methods and Rational Approximation (Wilrijk, 1987), Reidel, Dordrecht (1988), 167–190, or even the paper quoting it: "Multivariate Rational Interpolation of Scattered Data" by Stefan Becuwe, Annie Cuyt, and Brigitte Verdonk; it is a preprint, which I attach for your benefit. As for programs, it depends on the context of "multivariate": in the language R, implementations are available for partial differential equations (you use the method of lines and then perhaps Bulirsch-Stoer, which is a flavour of Richardson extrapolation). You can read about all of this in "Solving Differential Equations in R", Karline Soetaert, Jeff Cash, Francesca Mazzia, Springer, 2012.
Absolutely. Richardson's method approximates an unknown quantity in one variable, and one may extend his approach to multiple variables. The idea of Richardson's method is to take several approximations of a quantity and combine them into a single, more accurate approximation; there is no reason this could not be done for multiple variables. One caveat: Richardson's method is not quite a rational approximation in the true sense, but rather a clever algebraic device that combines several poor approximations of a quantity into one accurate result, thereby reducing the overall error.
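A tiny illustration of that combining step in R (my own sketch, not taken from any of the references): two central-difference estimates of f'(x) with steps h and h/2 both carry an error expansion in even powers of h, so the combination (4*D(h/2) - D(h))/3 cancels the leading h^2 term.

# central-difference estimate of f'(x) with step h
d_central <- function(f, x, h) (f(x + h) - f(x - h)) / (2 * h)

# one Richardson step: combine the h and h/2 estimates so that the
# leading O(h^2) error term cancels
richardson_step <- function(f, x, h) {
  (4 * d_central(f, x, h / 2) - d_central(f, x, h)) / 3
}

f <- sin
x <- 1
h <- 0.1
abs(d_central(f, x, h)       - cos(x))  # error of the raw estimate
abs(richardson_step(f, x, h) - cos(x))  # noticeably smaller error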
To be a bit more specific: in one variable, Richardson extrapolation is related to divided differences. So I would be interested to find out whether anyone has used generalizations of divided differences in several variables to solve the algorithmic problem of Richardson extrapolation.
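To make that connection concrete (again only an illustrative sketch): in one variable the extrapolation can be organized as Neville-type polynomial interpolation of the computed values A(h_1), A(h_2), ... evaluated at h = 0, which is where the interpolation/divided-difference machinery enters. A bare-bones R version of that tableau:

# Neville-style Richardson tableau: extrapolate the values A[i] = A(h[i])
# to h = 0, assuming an error expansion in even powers of h (as for
# central differences or the midpoint rule).
richardson_tableau <- function(h, A) {
  n <- length(A)
  tab <- matrix(NA_real_, n, n)
  tab[, 1] <- A
  for (k in 2:n) {
    for (i in k:n) {
      r <- (h[i - k + 1] / h[i])^2
      tab[i, k] <- tab[i, k - 1] + (tab[i, k - 1] - tab[i - 1, k - 1]) / (r - 1)
    }
  }
  tab[n, n]  # most accurate extrapolated value
}

# example: central-difference estimates of d/dx sin(x) at x = 1
h <- 0.4 / 2^(0:3)
A <- sapply(h, function(hh) (sin(1 + hh) - sin(1 - hh)) / (2 * hh))
c(raw = A[4], extrapolated = richardson_tableau(h, A), exact = cos(1))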
To be more precise, Richardson extrapolation is based on a discretization method and uses the asymptotic expansion of the discretization error. In fact, it eliminates the lowest-order term (or several of the lowest-order terms) of the error expansion by combining multiple realizations of the discretization.
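In symbols, the standard form of that elimination (not tied to any particular reference): if the discretization A(h) of the exact quantity A admits the asymptotic expansion

A(h) = A + c_1 h^p + c_2 h^q + \cdots, \qquad q > p,

then combining two realizations with steps h and h/s as

A_1(h) = \frac{s^p \, A(h/s) - A(h)}{s^p - 1} = A + O(h^q)

removes the leading error term, and repeating the combination removes the next order.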
This all can be done equally well in one or more dimensions.
If you understand the principle of RE and the discretization method used, to construct the RE in the particular case is a straightforward exercise.
Thank you for the answers; they point in some directions that may be helpful. However, I have a particular application in mind that would benefit from multidimensional analogues of divided differences. So far I have found only
"Frozen divided difference scheme for solving systems of nonlinear equations", Journal of Computational and Applied Mathematics 235 (2011) 1739–1743.
There, the first divided difference is defined via an integral of the first derivative of the function, which gives rise to a representation of the first divided difference as a matrix whose entries are certain one-dimensional divided differences.
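For readers without the paper at hand, a hedged sketch of what such a construction typically looks like (this is the standard divided-difference operator for systems, in the Ortega-Rheinboldt tradition, not necessarily the exact form used in that paper): for F: R^n -> R^n and points u, v one sets

[u, v; F] = \int_0^1 F'(v + t(u - v)) \, dt,

which is an n x n matrix satisfying [u, v; F](u - v) = F(u) - F(v); a commonly used componentwise variant, whose entries are one-dimensional first divided differences of the components f_i with respect to their j-th argument, is

[u, v; F]_{ij} = \frac{f_i(u_1, \dots, u_j, v_{j+1}, \dots, v_n) - f_i(u_1, \dots, u_{j-1}, v_j, \dots, v_n)}{u_j - v_j}.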
You are dealing with vector extrapolation methods. I suggest, e.g., the book C. Brezinski, M. Redivo-Zaglia, Extrapolation Methods: Theory and Practice, North-Holland, Amsterdam (1991), and the review paper D.A. Smith, W.F. Ford, A. Sidi, Extrapolation methods for vector sequences, SIAM Review 29, 199–233 (1987) (with corrections in SIAM Review 30, 623–624 (1988)).
Dear Ivan, I will have a look at the sparse grid technique; many thanks for the pointer.
Dear Stefano, I looked in the book by Brezinski and Redivo-Zaglia but could not find exactly what I was looking for. Of course you are right that vector extrapolation is the relevant setting. Thank you for the pointer, too.