Published 1988
Book Section - Chapter

Constrained Differential Optimization

Abstract

Many optimization models of neural networks need constraints to restrict the space of outputs to a subspace which satisfies external criteria. Optimizations using energy methods yield "forces" which act upon the state of the neural network. The penalty method, in which quadratic energy constraints are added to an existing optimization energy, has become popular recently, but is not guaranteed to satisfy the constraint conditions when there are other forces on the neural model or when there are multiple constraints. In this paper, we present the basic differential multiplier method (BDMM), which satisfies constraints exactly; we create forces which gradually apply the constraints over time, using "neurons" that estimate Lagrange multipliers. The basic differential multiplier method is a differential version of the method of multipliers from Numerical Analysis. We prove that the differential equations locally converge to a constrained minimum. Examples of applications of the differential method of multipliers include enforcing permutation codewords in the analog decoding problem and enforcing valid tours in the traveling salesman problem.
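The abstract describes the BDMM as coupled dynamics: gradient descent on the state variables under the objective plus constraint forces, and gradient ascent on a "neuron" estimating the Lagrange multiplier, so the constraint is gradually enforced over time. The following is a minimal sketch of that idea; the forward-Euler integration, step size, and the toy problem (minimize x^2 + y^2 subject to x + y = 1) are illustrative assumptions, not details taken from the paper.

import numpy as np

def bdmm(f_grad, g, g_grad, x0, lam0=0.0, lr=0.01, steps=20000):
    # Integrate the BDMM dynamics with simple forward-Euler steps (assumed here):
    #   x'   = -df/dx - lam * dg/dx   (descent on the objective plus constraint force)
    #   lam' = +g(x)                  (ascent: the multiplier neuron grows until g(x) = 0)
    x, lam = np.asarray(x0, dtype=float), float(lam0)
    for _ in range(steps):
        x = x - lr * (f_grad(x) + lam * g_grad(x))
        lam = lam + lr * g(x)
    return x, lam

# Toy problem (assumed for illustration): minimize x^2 + y^2 subject to x + y - 1 = 0.
f_grad = lambda x: 2.0 * x
g      = lambda x: x[0] + x[1] - 1.0
g_grad = lambda x: np.array([1.0, 1.0])

x_star, lam_star = bdmm(f_grad, g, g_grad, x0=[0.0, 0.0])
print(x_star, lam_star)  # approaches [0.5, 0.5]; the constraint is satisfied in the limit

Unlike the penalty method, the multiplier is not fixed in advance: it keeps integrating the constraint violation, so the constraint is met exactly at a fixed point rather than only approximately.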

Additional Information

© American Institute of Physics 1988. This paper was supported by an AT&T Bell Laboratories fellowship (JCP).

Attached Files

Published - 4-constrained-differential-optimization.pdf (2.0 MB)


Additional details

Created: August 19, 2023
Modified: October 25, 2023