Published December 1, 2022 | public
Journal Article

Learning "best" kernels from data in Gaussian process regression. With application to aerodynamics

Abstract

This paper introduces algorithms to select/design kernels in Gaussian process regression/kriging surrogate modeling techniques. We adopt the setting of kernel method solutions in ad hoc functional spaces, namely Reproducing Kernel Hilbert Spaces (RKHS), to solve the problem of approximating a regular target function given observations of it, i.e., supervised learning. A first class of algorithms is Kernel Flows, which was introduced in the context of classification in machine learning. It can be seen as a cross-validation procedure whereby a "best" kernel is selected such that the loss of accuracy incurred by removing some part of the dataset (typically half of it) is minimized. A second class of algorithms is called spectral kernel ridge regression, and aims at selecting a "best" kernel such that the norm of the function to be approximated is minimal in the associated RKHS. Within the framework of Mercer's theorem, we obtain an explicit construction of that "best" kernel in terms of the main features of the target function. Both approaches to learning kernels from data are illustrated by numerical examples on synthetic test functions, and on a classical test case in turbulence modeling validation for transonic flows about a two-dimensional airfoil.
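The following is a minimal sketch of the cross-validation idea behind the Kernel Flows approach described above, assuming a squared-exponential kernel family with a single length-scale parameter and a simple grid search; the toy target, sample sizes, and selection loop are illustrative only and do not reproduce the authors' implementation. The quantity y^T K^{-1} y computed here is the squared RKHS norm of the kernel interpolant, which is also the quantity that the spectral kernel ridge regression approach seeks to make small.

```python
import numpy as np

def rbf_kernel(X1, X2, lengthscale):
    """Squared-exponential kernel; this parametric family is an assumption for illustration."""
    d2 = np.sum((X1[:, None, :] - X2[None, :, :]) ** 2, axis=-1)
    return np.exp(-0.5 * d2 / lengthscale ** 2)

def rkhs_norm_sq(X, y, lengthscale, jitter=1e-8):
    """Squared RKHS norm y^T K^{-1} y of the kernel interpolant of the data (X, y)."""
    K = rbf_kernel(X, X, lengthscale) + jitter * np.eye(len(X))
    return float(y @ np.linalg.solve(K, y))

def kernel_flow_loss(X, y, lengthscale, rng):
    """Kernel Flows-style loss: relative accuracy lost when half of the dataset is removed."""
    n = len(X)
    idx = rng.choice(n, size=n // 2, replace=False)
    full = rkhs_norm_sq(X, y, lengthscale)
    half = rkhs_norm_sq(X[idx], y[idx], lengthscale)
    return 1.0 - half / full

# Toy usage: select the length-scale that minimizes the averaged loss on a 1-D synthetic target.
rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(40, 1))
y = np.sin(4 * np.pi * X[:, 0])
grid = np.linspace(0.05, 1.0, 20)
losses = [np.mean([kernel_flow_loss(X, y, l, rng) for _ in range(50)]) for l in grid]
best = grid[int(np.argmin(losses))]
print(f"selected length-scale ~ {best:.2f}")
```

In practice one would optimize the kernel parameters by (stochastic) gradient descent on this loss rather than by grid search; the sketch only illustrates the selection criterion itself.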

Additional details

Created: August 22, 2023
Modified: October 24, 2023