Published May 8, 2022
Journal Article | Open Access

Investigating Generalization by Controlling Normalized Margin

Abstract

Weight norm ‖w‖ and margin γ enter learning theory through the normalized margin γ/‖w‖. Since standard neural-net optimizers do not control the normalized margin, it is hard to test whether this quantity causally relates to generalization. This paper designs a series of experimental studies that explicitly control the normalized margin and thereby tackle two central questions. First: does normalized margin always have a causal effect on generalization? The answer is no: networks can be produced in which the normalized margin has seemingly no relationship with generalization, counter to the theory of Bartlett et al. (2017). Second: does normalized margin ever have a causal effect on generalization? The answer is yes: in a standard training setup, test performance closely tracks the normalized margin. The paper suggests a Gaussian process model as a promising explanation for this behavior.
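The quantity γ/‖w‖ from the abstract can be illustrated with a minimal sketch. The code below (an assumption for illustration, not the paper's exact setup) computes the dataset margin γ of a simple linear classifier, i.e. the smallest gap between the true-class score and the best wrong-class score, and divides by the Frobenius norm of the weights. The key property is that the normalized margin is invariant to rescaling the weights, which is why it, rather than the raw margin, appears in generalization bounds such as Bartlett et al. (2017).

```python
import numpy as np

def normalized_margin(W, X, y):
    """Illustrative normalized margin gamma / ||W|| for a linear classifier.

    W: (num_classes, dim) weight matrix
    X: (n, dim) inputs; y: (n,) integer labels
    """
    scores = X @ W.T                           # (n, num_classes) class scores
    n = len(y)
    correct = scores[np.arange(n), y]          # score of the true class
    masked = scores.copy()
    masked[np.arange(n), y] = -np.inf          # exclude the true class
    runner_up = masked.max(axis=1)             # best wrong-class score
    gamma = (correct - runner_up).min()        # margin over the dataset
    return gamma / np.linalg.norm(W)           # normalize by Frobenius norm

# Scaling W by c > 0 scales gamma by c as well, so the
# normalized margin is unchanged:
W = np.array([[2.0, 0.0], [0.0, 1.0]])
X = np.array([[1.0, 0.0], [0.0, 1.0]])
y = np.array([0, 1])
m1 = normalized_margin(W, X, y)
m2 = normalized_margin(3.0 * W, X, y)
assert np.isclose(m1, m2)
```

This scale invariance is exactly why a standard optimizer, which freely rescales ‖w‖ during training, does not control the normalized margin directly, motivating the paper's interventional experiments.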

Additional Information

© 2022 by the authors. The authors are grateful to the anonymous reviewers for their helpful comments. This material is based upon work supported by the National Science Foundation Graduate Research Fellowship under Grant No. DGE-1745301. This material was also supported by the following grants: NSF #1918865; ONR N00014-21-1-2483.

Attached Files

Published - farhang22a.pdf

Submitted - 2205.03940.pdf

Supplemental Material - farhang22a-supp.zip

Files (6.0 MB)

md5:b742c3af405d0a885061c6f5f8dae6a0 (462.0 kB)
md5:bccbac478e99c25be2f86373997fecc1 (5.0 MB)
md5:8b9d3330c045a7a37931e537c3c9f22d (570.6 kB)

Additional details

Created: August 20, 2023
Modified: October 24, 2023