Stable extrapolation of analytic functions
L Demanet, A Townsend - Foundations of Computational Mathematics, 2019 - Springer
This paper examines the problem of extrapolating an analytic function to $x > 1$ given $N+1$ perturbed samples on an equally spaced grid on $[-1, 1]$. For a function $f$ on $[-1, 1]$ that is analytic in a Bernstein ellipse with parameter $\rho > 1$, and for a uniform perturbation level $\varepsilon$ on the function samples, we construct an asymptotically best extrapolant $e(x)$ as a least squares polynomial approximant of an explicitly determined degree $M^*$. We show that the extrapolant $e(x)$ converges to $f(x)$ pointwise on the interval $I_\rho = [1, (\rho + \rho^{-1})/2)$ as $\varepsilon \to 0$, at a rate given by an $x$-dependent fractional power of $\varepsilon$. More precisely, for each $x \in I_\rho$ we have
$$ |f(x) - e(x)| = O\!\left( \varepsilon^{-\log r(x)/\log \rho} \right), \qquad r(x) = \frac{x + \sqrt{x^2 - 1}}{\rho}, $$
up to log factors (since $r(x) < 1$ on $I_\rho$, the exponent is a positive power of $\varepsilon$), provided that the oversampling condition $M^* \le \tfrac{1}{2}\sqrt{N}$ is satisfied, which is known from approximation theory to be necessary. In short, extrapolation enjoys a weak form of stability, up to a fraction of the characteristic smoothness length. The number of function samples does not bear on the size of the extrapolation error, provided it obeys the oversampling condition. We also show that no other linear or nonlinear procedure can construct an asymptotically more accurate extrapolant from equally spaced samples than $e(x)$. The proofs involve original results on the stability of polynomial approximation in the Chebyshev basis from equally spaced samples, which are expected to be of independent interest.
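The procedure described above can be sketched numerically: sample a function analytic in a Bernstein ellipse on an equispaced grid, add a uniform perturbation, fit a least-squares polynomial in the Chebyshev basis whose degree respects the oversampling condition, and evaluate it beyond $[-1,1]$. The test function, sample count, and perturbation level below are illustrative choices, not taken from the paper.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

# Illustrative test function: f(x) = 1/(1.5 - x) has a pole at x = 1.5,
# so it is analytic in the Bernstein ellipse with rho = 1.5 + sqrt(1.5^2 - 1).
f = lambda x: 1.0 / (1.5 - x)
rho = 1.5 + np.sqrt(1.5**2 - 1)

N = 400                                  # N + 1 equispaced samples on [-1, 1]
eps = 1e-8                               # uniform perturbation level (assumed)
x = np.linspace(-1.0, 1.0, N + 1)
rng = np.random.default_rng(0)
samples = f(x) + eps * rng.uniform(-1.0, 1.0, x.shape)

# Oversampling condition from the abstract: degree at most (1/2) * sqrt(N).
M = int(0.5 * np.sqrt(N))

# Least-squares polynomial fit in the Chebyshev basis.
coeffs = C.chebfit(x, samples, M)

# Evaluate the extrapolant at a point x0 > 1 inside I_rho = [1, (rho + 1/rho)/2).
x0 = 1.05
e_x0 = C.chebval(x0, coeffs)
err = abs(f(x0) - e_x0)
print(f"extrapolation error at x0 = {x0}: {err:.3e}")
```

The extrapolation error at `x0` is far larger than the interior approximation error, but still small: this is the "weak stability" the abstract quantifies, with the accuracy loss governed by $r(x)$ rather than by $N$.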