Published November 2021 | Accepted Version + Published
Journal Article (Open Access)

On universal approximation and error bounds for Fourier Neural Operators

Abstract

Fourier neural operators (FNOs) have recently been proposed as an effective framework for learning operators that map between infinite-dimensional spaces. We prove that FNOs are universal, in the sense that they can approximate any continuous operator to any desired accuracy. Moreover, we suggest a mechanism by which FNOs can efficiently approximate operators associated with PDEs. Explicit error bounds show that the size of the FNO needed to approximate operators associated with a Darcy-type elliptic PDE and with the incompressible Navier-Stokes equations of fluid dynamics grows only sub-(log)-linearly in the reciprocal of the error. Thus, FNOs are shown to efficiently approximate operators arising in a large class of PDEs.
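To make the architecture behind these results concrete, the following is a minimal sketch of a single Fourier layer, the building block of an FNO: transform the input function (sampled on a grid) to Fourier space, apply learned complex weights to a truncated set of low-frequency modes, transform back, add a pointwise linear term, and apply a nonlinearity. All names, shapes, and the choice of ReLU (the paper's analysis uses smoother activations) are illustrative assumptions, not the authors' reference implementation.

```python
import numpy as np

def fourier_layer(v, R, W, k_max):
    """One illustrative FNO layer.

    v:     (n_grid, d) real array, the input function sampled on a 1D grid
    R:     (k_max, d, d) complex array, learned spectral weights per retained mode
    W:     (d, d) real array, learned pointwise linear weights
    k_max: number of low Fourier modes kept (higher modes are zeroed out)
    """
    v_hat = np.fft.rfft(v, axis=0)            # forward FFT along the grid axis
    out_hat = np.zeros_like(v_hat)
    for k in range(k_max):                    # multiply only the lowest modes
        out_hat[k] = R[k] @ v_hat[k]          # mode truncation = spectral convolution
    spectral = np.fft.irfft(out_hat, n=v.shape[0], axis=0)  # back to physical space
    return np.maximum(spectral + v @ W.T, 0.0)  # nonlinearity (ReLU here, for simplicity)

# Hypothetical usage with random weights, just to exercise the shapes:
rng = np.random.default_rng(0)
n, d, k_max = 64, 4, 8
v = rng.standard_normal((n, d))
R = rng.standard_normal((k_max, d, d)) + 1j * rng.standard_normal((k_max, d, d))
W = rng.standard_normal((d, d))
u = fourier_layer(v, R, W, k_max)
print(u.shape)  # the layer maps (64, 4) -> (64, 4)
```

A full FNO stacks several such layers between a lifting and a projection map; the error bounds in the paper concern how many modes and layers are needed as the target accuracy shrinks.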

Additional Information

© 2021 Nikola Kovachki and Samuel Lanthaler and Siddhartha Mishra. License: CC-BY 4.0, see https://creativecommons.org/licenses/by/4.0/. Attribution requirements are provided at http://jmlr.org/papers/v22/21-0806.html. Submitted 7/21; Revised 10/21; Published 11/21.

Attached Files

Published - 21-0806.pdf

Accepted Version - 2107.07562.pdf

Files (1.4 MB total)

2107.07562.pdf
md5:2a1598801fc2ed807f0367ea114f24e0 (723.4 kB)
md5:d10c5f42bcb426dbd108a80c0f0a68c3 (708.2 kB)

Additional details

Created:
August 20, 2023
Modified:
October 23, 2023