Published December 2019 | Supplemental Material + Submitted + Published
Book Section - Chapter | Open Access

Neural networks grown and self-organized by noise

Abstract

Living neural networks emerge through a process of growth and self-organization that begins with a single cell and results in a brain, an organized and functional computational device. Artificial neural networks, however, rely on human-designed, hand-programmed architectures for their remarkable performance. Can we develop artificial computational devices that can grow and self-organize without human intervention? In this paper, we propose a biologically inspired developmental algorithm that can 'grow' a functional, layered neural network from a single initial cell. The algorithm organizes inter-layer connections to construct retinotopic pooling layers. Our approach is inspired by the mechanisms employed by the early visual system to wire the retina to the lateral geniculate nucleus (LGN), days before animals open their eyes. The key ingredients for robust self-organization are an emergent spontaneous spatiotemporal activity wave in the first layer and a local learning rule in the second layer that 'learns' the underlying activity pattern in the first layer. The algorithm is adaptable to a wide range of input-layer geometries and robust to malfunctioning units in the first layer, and so can be used to grow and self-organize pooling architectures of different pool sizes and shapes. The algorithm provides a primitive procedure for constructing layered neural networks through growth and self-organization. We also demonstrate that networks grown from a single unit perform as well as hand-crafted networks on MNIST. Broadly, our work shows that biologically inspired developmental algorithms can be applied to autonomously grow functional 'brains' in silico.
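The abstract names two ingredients: spontaneous, localized activity in the first layer and a local learning rule in the second layer that captures that activity pattern. The Python sketch below is only a rough illustration of that idea, not the authors' implementation: the grid size, the Gaussian "wave snapshot" model of spontaneous activity, the winner-take-all Hebbian update, and all names and parameters are assumptions chosen to keep the example short.

import numpy as np

# Minimal sketch of "growth by noise": spontaneous localized activity in a
# first layer plus a local, competitive learning rule in a second layer.
# Everything below (wave model, learning rule, constants) is illustrative,
# not the paper's actual algorithm.

rng = np.random.default_rng(0)

GRID = 16       # first layer: GRID x GRID "retinal" units
N_POOLS = 16    # second layer: number of pooling units
STEPS = 4000    # number of spontaneous-activity events
LR = 0.05       # Hebbian learning rate

def activity_wave(center, sigma=1.5):
    # A localized activity bump at `center`, standing in for a snapshot of a
    # spontaneous wave sweeping over the first layer.
    y, x = np.mgrid[0:GRID, 0:GRID]
    bump = np.exp(-((x - center[0])**2 + (y - center[1])**2) / (2 * sigma**2))
    return bump.ravel()

# Second-layer weights: one row per pooling unit, one column per first-layer unit.
W = rng.random((N_POOLS, GRID * GRID))
W /= W.sum(axis=1, keepdims=True)

for _ in range(STEPS):
    # Spontaneous activity appears at a random location (the "noise").
    a = activity_wave(rng.uniform(0, GRID, size=2))
    # Winner-take-all competition: only the most responsive pooling unit learns.
    winner = np.argmax(W @ a)
    # Local Hebbian update followed by normalization keeps weights bounded.
    W[winner] += LR * a
    W[winner] /= W[winner].sum()

# After training, each pooling unit's strongest afferents form a compact,
# roughly retinotopic pool over the first layer.
pool_centers = [np.unravel_index(np.argmax(w), (GRID, GRID)) for w in W]
print(pool_centers)

Running the sketch prints one (row, column) center per pooling unit; the point is only that a simple local rule driven by noisy, localized activity is enough to carve the first layer into spatial pools, which is the qualitative behavior the abstract describes.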

Additional Information

© 2019 Neural Information Processing Systems Foundation, Inc. We would like to thank Markus Meister, Carlos Lois, Erik Winfree, and Naama Barkai for their invaluable contributions in shaping the early stages of this work. We also thank Alex Farhang, Jerry Wang, Tony Zhang, Matt Rosenberg, David Brown, Ben Hosheit, Varun Wadia, Gautam Goel, Adrianne Zhong, and Nasim Rahaman for their constructive feedback and key edits that have helped shape this paper.

Attached Files

Published - 8465-neural-networks-grown-and-self-organized-by-noise.pdf

Submitted - 1906.01039.pdf

Supplemental Material - 8465-neural-networks-grown-and-self-organized-by-noise-supplemental.zip


Additional details

Created: August 19, 2023
Modified: March 5, 2024