Published October 2017 | Published + Submitted
Journal Article | Open Access

Why and when can deep - but not shallow - networks avoid the curse of dimensionality: A review

Abstract

The paper reviews and extends an emerging body of theoretical results on deep learning, including the conditions under which deep networks can be exponentially better than shallow ones. A class of deep convolutional networks represents an important special case of these conditions, though weight sharing is not the main reason for their exponential advantage. Implications of a few key theorems are discussed, together with new results, open problems, and conjectures.
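To make the reviewed setting concrete, the following is a minimal sketch (not code from the paper) of the kind of hierarchically compositional target function for which the reviewed theorems give deep networks their exponential advantage. The constituent function h, the binary-tree depth, and the input dimension d = 8 are illustrative assumptions.

```python
# Minimal illustrative sketch: a hierarchically compositional function on
# d = 8 variables, structured as a binary tree of two-variable constituents.
# The choice of h and the tree shape are assumptions for illustration only.
import numpy as np

def h(a, b):
    # A generic smooth constituent function of two variables,
    # standing in for the paper's abstract constituent functions.
    return np.tanh(a + 2.0 * b)

def compositional_f(x):
    """f(x1,...,x8) = h(h(h(x1,x2), h(x3,x4)), h(h(x5,x6), h(x7,x8)))."""
    level1 = [h(x[i], x[i + 1]) for i in range(0, 8, 2)]            # 4 nodes
    level2 = [h(level1[i], level1[i + 1]) for i in range(0, 4, 2)]  # 2 nodes
    return h(level2[0], level2[1])                                  # root node

rng = np.random.default_rng(0)
x = rng.uniform(size=8)
print(compositional_f(x))
```

A deep network whose connectivity mirrors this tree only has to approximate local two-variable functions at each node, so its size grows modestly with the input dimension; a generic shallow network that ignores the compositional structure can require a number of units exponential in the dimension, which is the sense in which the abstract says deep networks avoid the curse of dimensionality.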

Additional Information

© 2017 The Author(s). This article is distributed under the terms of the Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.

Manuscript received November 3, 2016; accepted December 12, 2016; published online March 14, 2017. Recommended by Associate Editor Hong Qiao. Special Issue on Human Inspired Computing.

This work was supported by the Center for Brains, Minds and Machines (CBMM), NSF STC award CCF (No. 1231216), and ARO (No. W911NF-15-1-0385). The authors thank O. Shamir for useful emails that prompted us to clarify our results in the context of lower bounds.

Files (4.3 MB)

Published - 10.1007_2Fs11633-017-1054-2.pdf
1.8 MB, md5:adbb310a3a41746c796b9901ae138402

Submitted - 1611.00740
2.6 MB, md5:ff55635c83d58dfe55f04f09078ddf0d

Additional details

Created: August 21, 2023
Modified: October 25, 2023