Published 2007
Book Section - Chapter (Open Access)

Optimizing 0/1 Loss for Perceptrons by Random Coordinate Descent

Abstract

The 0/1 loss is an important cost function for perceptrons. Nevertheless, it cannot be easily minimized by most existing perceptron learning algorithms. In this paper, we propose a family of random coordinate descent algorithms to directly minimize the 0/1 loss for perceptrons, and prove their convergence. Our algorithms are computationally efficient, and usually achieve the lowest 0/1 loss among the algorithms compared. Such advantages make them favorable for nonseparable real-world problems. Experiments show that our algorithms are especially useful for ensemble learning, and can achieve the lowest test error on many complex data sets when coupled with AdaBoost.
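
The abstract describes directly minimizing the 0/1 loss by descending along randomly chosen coordinates of the perceptron weight vector. The sketch below is a minimal illustration of that general idea, not the authors' exact procedure: it perturbs one randomly chosen weight coordinate at a time and keeps a change only if it lowers the training 0/1 loss. The function names, the naive candidate-step search, and all parameters are illustrative assumptions.

```python
import numpy as np

def zero_one_loss(w, X, y):
    """Fraction of examples misclassified by the perceptron sign(X @ w)."""
    return np.mean(np.sign(X @ w) != y)

def random_coordinate_descent(X, y, n_iters=1000, n_steps=50, seed=0):
    """Illustrative sketch (not the paper's algorithm): greedily update one
    random coordinate at a time, accepting only changes that lower the 0/1 loss."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    w = np.zeros(d)
    w[0] = 1.0  # arbitrary nonzero start so sign() is informative
    best = zero_one_loss(w, X, y)
    for _ in range(n_iters):
        j = rng.integers(d)                          # random coordinate
        steps = rng.normal(scale=1.0, size=n_steps)  # candidate perturbations
        for s in steps:
            w_try = w.copy()
            w_try[j] += s
            loss = zero_one_loss(w_try, X, y)
            if loss < best:                          # keep only improvements
                w, best = w_try, loss
    return w, best

# Toy usage on a nonseparable 2-D problem (bias folded in as a constant feature).
rng = np.random.default_rng(1)
X = np.hstack([rng.standard_normal((200, 2)), np.ones((200, 1))])
y = np.sign(X[:, 0] + 0.5 * rng.standard_normal(200))
w, train_loss = random_coordinate_descent(X, y)
```

Because the 0/1 loss is piecewise constant along any search direction, an exact line search over the finitely many breakpoints is possible; the random candidate steps above are only a crude stand-in for such a search.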

Additional Information

© 2007 IEEE. We wish to thank Yaser Abu-Mostafa and Amrit Pratap for many valuable discussions. This work was mainly supported by the Caltech SISL Graduate Fellowship.

Attached Files

Published - Li2007p85212007_Ieee_International_Joint_Conference_On_Neural_Networks_Vols_1-6.pdf


Additional details

Created: August 19, 2023
Modified: January 13, 2024