Published September 1, 1988
Journal Article
A generalized convergence theorem for neural networks
Creators
- Bruck, Jehoshua
- Goodman, Joseph W.
Abstract
A neural network model is presented in which each neuron performs a threshold logic function. The model always converges to a stable state when operating in a serial mode and to a cycle of length at most 2 when operating in a fully parallel mode. This property is the basis for the potential applications of the model, such as associative memory devices and combinatorial optimization. The two convergence theorems (for serial and fully parallel modes of operation) are reviewed, and a general convergence theorem is presented that unifies the two known cases. New relations between the neural network model and the problem of finding a minimum cut in a graph are obtained.
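To make the two modes of operation concrete, here is a minimal NumPy sketch (not taken from the correspondence itself): a Hopfield-style network of threshold-logic neurons with a serial update loop and a fully parallel update step. The particular weight matrix, zero thresholds, and ±1 state encoding are assumptions chosen for illustration; the weights are kept symmetric, as the convergence theorems require.

```python
import numpy as np

def serial_run(W, t, x, max_sweeps=1000, rng=None):
    """Serial mode: update one neuron at a time with a threshold rule.
    With a symmetric weight matrix W, the serial-mode convergence
    theorem guarantees the state settles into a stable point."""
    rng = np.random.default_rng(0) if rng is None else rng
    x = np.array(x, dtype=int)
    n = x.size
    for _ in range(max_sweeps):
        changed = False
        for i in rng.permutation(n):
            new = 1 if W[i] @ x >= t[i] else -1  # threshold logic function
            if new != x[i]:
                x[i], changed = new, True
        if not changed:  # no neuron wants to flip: stable state reached
            return x
    return x

def parallel_step(W, t, x):
    """Fully parallel mode: all neurons update simultaneously.
    Iterating this map leads to a cycle of length at most 2."""
    return np.where(W @ x >= t, 1, -1)

# Tiny example with an assumed symmetric weight matrix and zero thresholds.
W = np.array([[ 0.0, 1.0, -1.0],
              [ 1.0, 0.0,  1.0],
              [-1.0, 1.0,  0.0]])
t = np.zeros(3)
print(serial_run(W, t, [1, -1, 1]))
```

In this sketch the serial loop stops when a full sweep changes no neuron, which is exactly the stable-state condition; the parallel step is shown only as a single synchronous map, since in that mode the guarantee is a limit cycle of length at most 2 rather than a fixed point.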
Additional Information
© Copyright 1988 IEEE. Reprinted with permission. Manuscript received June 1987; revised November 1987. This work was supported in part by the Rothschild Foundation and by the U.S. Air Force Office of Scientific Research. This correspondence was presented in part at the IEEE First International Conference on Neural Networks, San Diego, CA, June 1987.
Files (502.6 kB)

| Name | Size |
|---|---|
| BRUieeetit88.pdf (md5:6a2364369a513fed3dce8fb27f2fcedc) | 502.6 kB |
Additional details
- Eprint ID
- 5705
- Resolver ID
- CaltechAUTHORS:BRUieeetit88
- Created
- 2006-10-29 (from EPrint's datestamp field)
- Updated
- 2021-11-08 (from EPrint's last_modified field)