Published 1991
Journal Article (Open Access)

Contrastive learning and neural oscillations

Abstract

The concept of Contrastive Learning (CL) is developed as a family of possible learning algorithms for neural networks. CL is an extension of Deterministic Boltzmann Machines to more general dynamical systems. During learning, the network oscillates between two phases: one with a teacher signal and one without. The weights are updated using a learning rule that corresponds to gradient descent on a contrast function measuring the discrepancy between the free network and the network with a teacher signal. The CL approach provides a general, unified framework for developing new learning algorithms, and it shows that many different types of clamping and teacher signals are possible. Several examples are given, and an analysis of the landscape of the contrast function is proposed, with some relevant predictions for the CL curves. An approach that may be suitable for collective analog implementations is described. Simulation results and possible extensions are briefly discussed, together with a new conjecture regarding the function of certain oscillations in the brain. In the appendix, we also examine two extensions of contrastive learning to time-dependent trajectories.
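The two-phase scheme described above lends itself to a compact sketch. Below is a minimal, illustrative implementation of contrastive (Hebbian) learning on a small symmetric network in the spirit of deterministic Boltzmann machines: the network settles once with a teacher signal (outputs clamped) and once freely, and the weights move along the difference of the two phases' co-activation statistics. The mean-field settling dynamics, the XOR task, the layer sizes, and the learning rate are all assumptions for illustration, not the paper's exact construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_hid, n_out = 2, 4, 1          # illustrative sizes (assumption)
n = n_in + n_hid + n_out
inp = np.arange(n_in)                 # indices of input units
out = np.arange(n_in + n_hid, n)      # indices of output units

W = 0.1 * rng.standard_normal((n, n))
W = (W + W.T) / 2.0                   # symmetric connections
np.fill_diagonal(W, 0.0)              # no self-connections

def settle(W, clamp_idx, clamp_val, steps=60):
    """Relax mean-field unit activations to a fixed point while the
    units in clamp_idx are held at clamp_val."""
    s = np.zeros(n)
    s[clamp_idx] = clamp_val
    free = np.setdiff1d(np.arange(n), clamp_idx)
    for _ in range(steps):
        s[free] = np.tanh(W[free] @ s)
    return s

# Toy task (assumption): XOR with +/-1 coding.
X = np.array([[-1, -1], [-1, 1], [1, -1], [1, 1]], dtype=float)
Y = np.array([[-1], [1], [1], [-1]], dtype=float)
eta = 0.1

for epoch in range(200):
    for x, y in zip(X, Y):
        # Teacher phase: inputs and outputs clamped.
        s_plus = settle(W, np.r_[inp, out], np.r_[x, y])
        # Free phase: inputs clamped, outputs run free.
        s_minus = settle(W, inp, x)
        # Contrastive update: descend the contrast between the
        # co-activation statistics of the two phases.
        dW = eta * (np.outer(s_plus, s_plus) - np.outer(s_minus, s_minus))
        np.fill_diagonal(dW, 0.0)
        W += dW                       # dW is symmetric, so W stays symmetric
```

The alternation between the two settling calls inside the inner loop is the "oscillation" the abstract refers to: each training step contrasts a clamped equilibrium against a free one.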

Additional Information

© 1991 MIT Press. Received 6 February 1991; accepted 14 June 1991. We would like to thank the referee for several useful comments. This work is supported by ONR Contract NAS7-100/918 and a McDonnell-Pew grant to P.B., and AFOSR Grant ISSA-91-0017 to F.P.

Attached Files

Published - BALnc91a.pdf (1.0 MB; md5:b500135f168889ad62cb27da33a0080e)

Additional details

Created: August 19, 2023
Modified: October 17, 2023