Seminar in Numerical Analysis: Martin Eigel (WIAS Berlin)
Weighted least squares methods have been examined thoroughly to obtain quasi-optimal convergence results for a chosen (polynomial) basis of a linear space. The analysis focuses on the construction of optimal sampling measures and the derivation of a sufficient sample complexity for stable reconstructions. When considering holomorphic functions such as solutions of common parametric PDEs, the anisotropic sparsity they exhibit can be exploited to achieve improved results adapted to the problem at hand. In particular, the sparsity of the data carries over to sparsity of the solution in terms of its polynomial chaos coefficients. When nonlinear model classes are used, it turns out that the known results cannot be applied directly. To obtain comparable a priori rates, we introduce a new weighted version of Stechkin's lemma. This enables us to derive optimal complexity results for a model class of low-rank tensor trains. We also show that the solution sparsity results in sparse component tensors and sketch how this can be realised in practical algorithms. A nice application is the reconstruction of Galerkin solutions for parametric PDEs. With this, a provably convergent a posteriori adaptive algorithm can be derived for linear model PDEs with non-affine coefficients.
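As a minimal illustration of the weighted least squares setting described above (not the speaker's method), the sketch below reconstructs a smooth function in a Legendre basis from random samples. The Chebyshev (arcsine) density is used as a stand-in for a near-optimal sampling measure, and each sample is weighted by the density ratio dμ/dν; the target function, degree, and sample size are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 8    # maximal polynomial degree (basis dimension n + 1)
m = 200  # number of random samples

# Draw samples from the Chebyshev (arcsine) density on [-1, 1],
# a standard near-optimal sampling measure for Legendre bases.
x = np.cos(np.pi * rng.random(m))

# Weights w(x) = dmu/dnu: ratio of the uniform density (1/2) to the
# Chebyshev density 1 / (pi * sqrt(1 - x^2)).
w = 0.5 * np.pi * np.sqrt(1.0 - x**2)

f = np.exp  # illustrative smooth (holomorphic) target function

# Weighted least squares fit in the Legendre basis.
V = np.polynomial.legendre.legvander(x, n)
sw = np.sqrt(w)
coef, *_ = np.linalg.lstsq(sw[:, None] * V, sw * f(x), rcond=None)

# Check the reconstruction error on a fine grid.
t = np.linspace(-1.0, 1.0, 1001)
err = np.max(np.abs(np.polynomial.legendre.legval(t, coef) - f(t)))
print(err)  # small for a smooth target, reflecting stable reconstruction
```

With m well above the basis dimension, the weighted Gram matrix concentrates near the identity and the fit is stable, which is the mechanism behind the sample complexity bounds mentioned in the abstract.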
For further information about the seminar, please visit this webpage.