Abstract:
Kernel ridge regression is an important nonparametric method for estimating smooth functions. We introduce a new set of conditions under which the actual rates of convergence of the kernel ridge regression estimator, under both the L_2 norm and the norm of the reproducing kernel Hilbert space, are faster than the standard minimax rates. An application of this theory leads to a new understanding of the Kennedy-O'Hagan approach [J. R. Stat. Soc. Ser. B Stat. Methodol., 63 (2001), pp. 425-464] for calibrating the model parameters of computer simulations. We prove that, under certain conditions, the Kennedy-O'Hagan calibration estimator with a known covariance function converges to the minimizer of the norm of the residual function in the reproducing kernel Hilbert space.
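For readers unfamiliar with the estimator the abstract refers to, the following is a minimal sketch of kernel ridge regression in closed form: the fitted function is f_hat(x) = sum_i alpha_i k(x, x_i), where alpha solves (K + n*lambda*I) alpha = y. The Gaussian kernel, the toy data, and all parameter values below are illustrative choices, not taken from the paper (which works with Matern-type kernels).

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale=0.2):
    """Gaussian (RBF) kernel matrix for 1-D inputs; an illustrative
    stand-in for the Matern-type kernels analyzed in the paper."""
    d2 = (X1[:, None] - X2[None, :]) ** 2
    return np.exp(-d2 / (2.0 * lengthscale ** 2))

def kernel_ridge_fit(X, y, lam=1e-3, lengthscale=0.2):
    """Solve (K + n*lam*I) alpha = y for the KRR coefficients."""
    n = len(X)
    K = rbf_kernel(X, X, lengthscale)
    return np.linalg.solve(K + n * lam * np.eye(n), y)

def kernel_ridge_predict(X_train, alpha, X_new, lengthscale=0.2):
    """Evaluate f_hat(x) = sum_i alpha_i k(x, x_i)."""
    return rbf_kernel(X_new, X_train, lengthscale) @ alpha

# Noisy observations of a smooth function on [0, 1]
rng = np.random.default_rng(0)
X = np.linspace(0.0, 1.0, 50)
y = np.sin(2 * np.pi * X) + 0.1 * rng.standard_normal(50)

alpha = kernel_ridge_fit(X, y)
f_hat = kernel_ridge_predict(X, alpha, X)
rmse = float(np.sqrt(np.mean((f_hat - np.sin(2 * np.pi * X)) ** 2)))
print(rmse)  # empirical L_2 error of the estimator on this toy problem
```

The convergence rates studied in the paper describe how this L_2 error (and the RKHS-norm error) decay as the sample size grows under smoothness conditions on the true function.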
Source:
SIAM/ASA Journal on Uncertainty Quantification
Year: 2020
Issue: 4
Volume: 8
Page: 1522-1547