Published April 1, 1987
Journal Article
Neural Computation by Concentrating Information in Time
Creators
- Tank, D. W.
- Hopfield, J. J.
Abstract
An analog model neural network that can solve a general problem of recognizing patterns in a time-dependent signal is presented. The networks use a patterned set of delays to collectively focus stimulus sequence information to a neural state at a future time. The computational capabilities of the circuit are demonstrated on tasks somewhat similar to those necessary for the recognition of words in a continuous stream of speech. The network architecture can be understood from consideration of an energy function that is being minimized as the circuit computes. Neurobiological mechanisms are known for the generation of appropriate delays.
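The delay mechanism described in the abstract can be illustrated with a minimal sketch (all names and parameters here are illustrative, not taken from the paper): each element of a target sequence is assigned a delay chosen so that evidence for every element converges on the same future time step, where it sums into a recognition peak.

```python
import numpy as np

def recognition_trace(stream, target):
    """Sum delayed detection signals so that a completed target sequence
    concentrates all of its evidence at the time of its final symbol.
    Hypothetical sketch of the 'patterned set of delays' idea."""
    T, L = len(stream), len(target)
    trace = np.zeros(T)
    for i, symbol in enumerate(target):
        delay = L - 1 - i  # earlier symbols get longer delays
        for t in range(T):
            # a detection at time t contributes at the future time t + delay
            if stream[t] == symbol and t + delay < T:
                trace[t + delay] += 1.0
    return trace

stream = list("xxcatxx")
trace = recognition_trace(stream, list("cat"))
print(int(trace.argmax()))  # -> 4, the step at which "cat" completes
```

Here the peak at the completion time plays the role of the focused "neural state at a future time"; the actual network implements this collectively in analog circuitry governed by an energy function, not by explicit scanning.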
Additional Information
© 1987 by the National Academy of Sciences. Contributed by J. J. Hopfield, November 25, 1986. We thank S. E. Levinson, O. Ghitza, and F. Jelinek for helpful discussions on automatic speech recognition systems. The work at California Institute of Technology was supported by National Science Foundation Grant PCM-8406049. The publication costs of this article were defrayed in part by page charge payment. This article must therefore be hereby marked "advertisement" in accordance with 18 U.S.C. §1734 solely to indicate this fact.
Attached Files
Published - TANpnas87.pdf
Files (1.1 MB)

Name | Size
---|---
TANpnas87.pdf (md5:cad9253af5cb442c6ad4f2aae5edee1a) | 1.1 MB
Additional details
- PMCID: PMC304548
- Eprint ID: 10363
- Resolver ID: CaltechAUTHORS:TANpnas87
- NSF: PCM-8406049
- Created: 2008-05-01 (from EPrint's datestamp field)
- Updated: 2021-11-08 (from EPrint's last_modified field)