Published February 25, 2022 | Submitted
Report | Open Access

LyaNet: A Lyapunov Framework for Training Neural ODEs

Abstract

We propose a method for training ordinary differential equations by using a control-theoretic Lyapunov condition for stability. Our approach, called LyaNet, is based on a novel Lyapunov loss formulation that encourages the inference dynamics to converge quickly to the correct prediction. Theoretically, we show that minimizing the Lyapunov loss guarantees exponential convergence to the correct solution and enables a novel robustness guarantee. We also provide practical algorithms, including one that avoids the cost of backpropagating through a solver or using the adjoint method. Relative to standard Neural ODE training, we empirically find that LyaNet can offer improved prediction performance, faster convergence of inference dynamics, and improved adversarial robustness. Our code is available at https://github.com/ivandariojr/LyapunovLearning.
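To make the abstract's central idea concrete, here is a minimal PyTorch sketch of a Lyapunov-style violation loss: it penalizes sampled states where the potential V fails to decay exponentially along the learned dynamics, which is what removes the need to backpropagate through an ODE solver. This is an illustrative sketch only, not the authors' implementation; the Dynamics module, the choice of cross-entropy as the potential V, and the names kappa and n_samples are all assumptions. See the linked repository for the actual method.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class Dynamics(nn.Module):
    """Toy inference dynamics f_theta(x, z, t) over a classifier state z."""
    def __init__(self, in_dim, state_dim):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(in_dim + state_dim + 1, 64), nn.Tanh(),
            nn.Linear(64, state_dim),
        )

    def forward(self, x, z, t):
        return self.net(torch.cat([x, z, t], dim=-1))

def lyapunov_loss(f, x, y, state_dim, kappa=2.0, n_samples=8):
    """Penalize violations of the exponential-decay condition
    dV/dt + kappa * V <= 0 at randomly sampled (t, z) pairs,
    avoiding any backpropagation through an ODE solver."""
    batch = x.shape[0]
    total = x.new_zeros(())
    for _ in range(n_samples):
        t = torch.rand(batch, 1)                        # random time in [0, 1]
        z = torch.randn(batch, state_dim, requires_grad=True)
        # V(z): cross-entropy of the state read directly as class logits
        # (an assumed, illustrative choice of Lyapunov potential).
        v = F.cross_entropy(z, y, reduction="none")
        grad_v = torch.autograd.grad(v.sum(), z, create_graph=True)[0]
        zdot = f(x, z, t)                               # f_theta(x, z, t)
        vdot = (grad_v * zdot).sum(dim=-1)              # dV/dt along the dynamics
        total = total + F.relu(vdot + kappa * v).mean() # hinge on violations
    return total / n_samples

# Usage sketch: a batch of 16 inputs with 32 features and 10 classes.
f = Dynamics(in_dim=32, state_dim=10)
x, y = torch.randn(16, 32), torch.randint(0, 10, (16,))
loss = lyapunov_loss(f, x, y, state_dim=10)
loss.backward()
```

Because the condition is enforced at sampled points rather than along solved trajectories, the loss can be minimized with ordinary stochastic gradient descent, consistent with the abstract's claim of avoiding solver backpropagation and the adjoint method.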

Attached Files

Submitted - 2202.02526.pdf (1.7 MB)
md5:9de39f874818039f7bb2dfaef650b60e

Additional details

Created: August 20, 2023
Modified: October 23, 2023