Published April 1993
Book Section - Chapter (Open Access)

Global descent replaces gradient descent to avoid local minima problem in learning with artificial neural networks

Abstract

One of the fundamental limitations of artificial neural network learning by gradient descent is its susceptibility to local minima during training. A new approach to learning is presented in which the gradient descent rule in the backpropagation learning algorithm is replaced with a novel global descent formalism. This methodology is based on a global optimization scheme, acronymed TRUST (terminal repeller unconstrained subenergy tunneling), which formulates optimization in terms of the flow of a special deterministic dynamical system. The ability of the new dynamical system to overcome local minima is tested on common benchmark examples and a pattern recognition example. The results demonstrate that the new method does indeed escape encountered local minima, and thus finds the global minimum solution to the specific problems.
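As a rough illustration of the two-phase idea described in the abstract (descend to a local minimum, then escape it with a repeller-driven flow), the following Python sketch alternates plain gradient descent with a simplified terminal-repeller phase on a toy one-dimensional benchmark. The test function `f`, the repeller strength `rho`, the step sizes, and the 1/3 power are illustrative assumptions; this is not the paper's exact TRUST dynamics, which also involve a subenergy tunneling transformation of the cost surface.

```python
def f(x):
    """Toy benchmark: two minima, the global one near x = -1.47."""
    return x**4 - 4*x**2 + x

def df(x):
    """Derivative of f."""
    return 4*x**3 - 8*x + 1

def gradient_descent(x, lr=0.01, steps=2000):
    """Plain gradient descent: stops at whatever minimum it falls into."""
    for _ in range(steps):
        x -= lr * df(x)
    return x

def trust_like_minimize(x0, rho=1.0, lr=0.01, max_repel=20000):
    """Two-phase sketch: descend to a local minimum, then use a
    terminal-repeller-style flow to tunnel into a lower basin and
    descend again. rho, lr, and the 1/3 power are illustrative
    choices, not the paper's exact formulation."""
    x_star = gradient_descent(x0)       # phase 1: reach a local minimum
    best = x_star
    for direction in (+1.0, -1.0):      # repel away from x_star both ways
        x = x_star + direction * 1e-3
        for _ in range(max_repel):
            if f(x) < f(best) - 1e-9:   # entered a lower basin: descend
                best = gradient_descent(x)
                break
            # Fractional power mimics a "terminal" repeller, which
            # escapes the equilibrium at x_star in finite time.
            x += lr * rho * direction * abs(x - x_star) ** (1/3)
            if abs(x) > 10.0:           # left the search region
                break
    return best

# Gradient descent alone from x0 = 2 is trapped in the shallow minimum
# near x = 1.35; the repeller phase tunnels through to the global
# minimum near x = -1.47.
x_gd = gradient_descent(2.0)
x_trust = trust_like_minimize(2.0)
```

The contrast between `x_gd` and `x_trust` mirrors the comparison the abstract describes: both runs start from the same point, but only the repeller-augmented descent reaches the global minimum.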

Additional Information

© 1993 IEEE.

Attached Files

Published - 00298667.pdf (588.5 kB)

Additional details

Created: August 20, 2023
Modified: October 20, 2023