Published January 20, 2021 | Submitted
Report Open

Architecture Agnostic Neural Networks

Abstract

In this paper, we explore an alternate method for synthesizing neural network architectures, inspired by the brain's stochastic synaptic pruning. During a person's lifetime, numerous distinct neuronal architectures are responsible for performing the same tasks. This indicates that biological neural networks are, to some degree, architecture agnostic. However, artificial networks rely on their fine-tuned weights and hand-crafted architectures for their remarkable performance. This contrast raises the question: Can we build artificial architecture agnostic neural networks? To ground this study we utilize sparse, binary neural networks that parallel the brain's circuits. Within this sparse, binary paradigm we sample many binary architectures to create families of architecture agnostic neural networks not trained via backpropagation. These high-performing network families share the same sparsity and distribution of binary weights, and succeed in both static and dynamic tasks. In summary, we create an architecture manifold search procedure to discover families of architecture agnostic neural networks.
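As a rough illustration of the core idea, the following is a minimal sketch of sampling a "family" of sparse, binary weight matrices that share the same sparsity level and binary weight distribution. This is an assumption-laden toy in NumPy; the function name, the ±1 weight scheme, and the layer sizes are illustrative and are not the authors' implementation or search procedure.

```python
import numpy as np

def sample_binary_architecture(n_in, n_out, sparsity, rng):
    """Sample one sparse, binary weight matrix: each connection is
    present with probability (1 - sparsity) and carries weight +1 or -1.
    (Illustrative only -- not the paper's actual sampling procedure.)"""
    mask = rng.random((n_in, n_out)) > sparsity           # which synapses exist
    signs = rng.choice([-1.0, 1.0], size=(n_in, n_out))   # binary weights
    return mask * signs

rng = np.random.default_rng(0)
# A "family" of architectures sharing the same sparsity and weight distribution.
family = [sample_binary_architecture(64, 10, sparsity=0.9, rng=rng)
          for _ in range(5)]

for W in family:
    density = np.mean(W != 0)
    print(f"density ~ {density:.2f}, nonzero weights: "
          f"{sorted(set(np.unique(W)) - {0.0})}")
```

Each sampled matrix is a distinct "architecture" (a distinct connectivity pattern), yet all members of the family share the same expected density and the same binary weight distribution — the property the abstract attributes to architecture agnostic network families.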

Additional Information

Attribution 4.0 International (CC BY 4.0).

Attached Files

Submitted - 2011.02712.pdf

Files

2011.02712.pdf (1.3 MB)
md5:fe7abbd6029e702f5a32b75cc2429d7e

Additional details

Created:
August 20, 2023
Modified:
October 23, 2023