Abstract:
Kernel ridge regression is an important nonparametric method for estimating smooth functions. We introduce a new set of conditions under which the actual rates of convergence of the kernel ridge regression estimator under both the L2 norm and the norm of the reproducing kernel Hilbert space exceed the standard minimax rates. An application of this theory leads to a new understanding of the Kennedy-O'Hagan approach [J. R. Stat. Soc. Ser. B. Stat. Methodol., 63 (2001), pp. 425-464] for calibrating model parameters of computer simulation. We prove that, under certain conditions, the Kennedy-O'Hagan calibration estimator with a known covariance function converges to the minimizer of the norm of the residual function in the reproducing kernel Hilbert space. © 2020 Society for Industrial and Applied Mathematics and American Statistical Association
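Illustrative sketch (not from the paper): the following Python example shows the two objects discussed in the abstract, a kernel ridge regression fit and a calibration of a toy one-parameter computer model by minimizing the size of the residual. A squared-exponential kernel, a synthetic data set, an empirical L2 criterion (a stand-in for the RKHS-norm criterion analyzed in the paper), and all function names (sq_exp_kernel, krr_fit, simulator, residual_criterion) are assumptions made here for illustration only.

# Minimal sketch, assuming a squared-exponential kernel and a toy simulator;
# the empirical L2 residual criterion below only approximates the RKHS-norm
# minimizer discussed in the abstract.
import numpy as np
from scipy.optimize import minimize_scalar

def sq_exp_kernel(x, y, lengthscale=0.2):
    # Squared-exponential covariance k(x, y) = exp(-|x - y|^2 / (2 l^2)).
    d = x[:, None] - y[None, :]
    return np.exp(-0.5 * (d / lengthscale) ** 2)

def krr_fit(x_train, y_train, lam=1e-3, lengthscale=0.2):
    # Kernel ridge regression: f_hat(x) = k(x, X) (K + n*lam*I)^{-1} y.
    n = len(x_train)
    K = sq_exp_kernel(x_train, x_train, lengthscale)
    alpha = np.linalg.solve(K + n * lam * np.eye(n), y_train)
    return lambda x_new: sq_exp_kernel(x_new, x_train, lengthscale) @ alpha

# Toy "physical" observations and a toy computer model with one parameter theta.
rng = np.random.default_rng(0)
x_obs = np.linspace(0.0, 1.0, 30)
y_obs = np.sin(2 * np.pi * x_obs) + 0.3 * x_obs + 0.05 * rng.standard_normal(30)

def simulator(x, theta):
    # Misspecified unless theta = 0.3.
    return np.sin(2 * np.pi * x) + theta * x

def residual_criterion(theta):
    # Fit KRR to the residual y - simulator(x, theta) and measure its size
    # by an empirical L2 norm over a dense grid.
    f_hat = krr_fit(x_obs, y_obs - simulator(x_obs, theta))
    grid = np.linspace(0.0, 1.0, 200)
    return np.mean(f_hat(grid) ** 2)

theta_hat = minimize_scalar(residual_criterion, bounds=(-1.0, 1.0), method="bounded").x
print(f"calibrated theta: {theta_hat:.3f}")

On this toy example the calibrated theta is close to 0.3, the value that removes the simulator's misspecification; the paper's result concerns the analogous minimizer of the residual's RKHS norm under the Kennedy-O'Hagan formulation with a known covariance function.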
Source: SIAM/ASA Journal on Uncertainty Quantification
Year: 2020
Issue: 4
Volume: 8
Page: 1522-1547
2.000 (JCR@2022)
JCR Journal Grade: 2
Cited Count:
SCOPUS Cited Count: 16
ESI Highly Cited Papers on the List: 0