
Adcock B. On Efficient Algorithms for Computing Near-Best Polynomial Approx.2024



To start this P2P download, you have to install a BitTorrent client like qBittorrent

Category: Other
Total size: 1.58 MB
Added: 1 week ago (2025-07-24 09:15:01)

Share ratio: 42 seeders, 0 leechers
Info Hash: 306C56B93516A8B7E73E490F7E08C2CFC17199B9
Last updated: 11 hours ago (2025-07-31 01:45:54)

Description:

Textbook in PDF format.

Sparse polynomial approximation has become an indispensable technique for approximating smooth, high- or infinite-dimensional functions from limited samples. This is a key task in computational science and engineering, e.g., surrogate modeling in uncertainty quantification, where the underlying function is the solution map of a parametric or stochastic differential equation (DE). Yet sparse polynomial approximation lacks a complete theory. On the one hand, there is a well-developed theory of best s-term polynomial approximation, which asserts exponential or algebraic rates of convergence for holomorphic functions. On the other hand, there are increasingly mature methods such as (weighted) ℓ1-minimization for computing such approximations. While the sample complexity of these methods has been analyzed through compressed sensing theory, whether they achieve the rates of the best s-term approximation is not fully understood. Furthermore, these methods are not algorithms per se, since they involve exact minimizers of nonlinear optimization problems.

This work closes these gaps. Specifically, we consider the following question: are there robust, efficient algorithms for computing sparse polynomial approximations to finite- or infinite-dimensional, holomorphic and Hilbert-valued functions from limited samples that achieve the same rates as the best s-term approximation? We answer this affirmatively by introducing algorithms with exponential or algebraic convergence rates that are also robust to sampling, algorithmic and physical discretization errors. We tackle both scalar- and Hilbert-valued functions, the latter being particularly relevant to parametric or stochastic DEs. Our results involve several significant developments of existing techniques, including a novel restarted primal-dual iteration for solving weighted ℓ1-minimization problems in Hilbert spaces. Our theory is supplemented by numerical experiments demonstrating the practical efficacy of these algorithms.

Keywords: high-dimensional approximation, polynomial approximation, best s-term approximation, compressed sensing, parametric PDEs.
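
As a rough illustration of the weighted ℓ1-minimization idea mentioned in the description, the Python sketch below recovers a sparse one-dimensional Legendre expansion from random samples. It uses a generic proximal-gradient (ISTA) solver for a weighted LASSO formulation; this is a stand-in illustration only, not the restarted primal-dual iteration developed in the book, and all sizes, weights and parameter values (N, m, s, w, lam) are illustrative assumptions.

import numpy as np
from numpy.polynomial import legendre

rng = np.random.default_rng(0)

# Illustrative sizes (assumptions, not taken from the book)
N, m, s = 64, 32, 5          # basis size, number of samples, sparsity

# Measurement matrix: Legendre polynomials evaluated at random points in [-1, 1]
x = rng.uniform(-1.0, 1.0, m)
A = np.column_stack([legendre.legval(x, np.eye(N)[j]) for j in range(N)])

# Synthetic sparse coefficient vector and noisy samples of the function
c_true = np.zeros(N)
c_true[rng.choice(N, s, replace=False)] = rng.standard_normal(s)
y = A @ c_true + 1e-3 * rng.standard_normal(m)

# Weighted LASSO:  min_z  0.5*||A z - y||_2^2 + lam * sum_j w_j |z_j|,
# solved by ISTA (proximal gradient) with weighted soft-thresholding.
w = np.ones(N)                         # uniform weights; structured weights could go here
lam = 1e-3
L = np.linalg.norm(A, 2) ** 2          # Lipschitz constant of the smooth part

z = np.zeros(N)
for _ in range(5000):
    u = z - (A.T @ (A @ z - y)) / L                             # gradient step
    z = np.sign(u) * np.maximum(np.abs(u) - lam * w / L, 0.0)   # weighted soft-threshold

print("relative coefficient error:", np.linalg.norm(z - c_true) / np.linalg.norm(c_true))

With many more candidate basis functions than samples (N > m), the sparsity-promoting penalty is what makes recovery possible at all, which is the compressed-sensing viewpoint the description refers to.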