Optimizing 0/1 Loss for Perceptrons by Random Coordinate Descent
- Creators
- Li, Ling
- Lin, Hsuan-Tien
Abstract
The 0/1 loss is an important cost function for perceptrons. Nevertheless, it cannot be easily minimized by most existing perceptron learning algorithms. In this paper, we propose a family of random coordinate descent algorithms that directly minimize the 0/1 loss for perceptrons, and we prove their convergence. Our algorithms are computationally efficient, and usually achieve the lowest 0/1 loss among the algorithms compared. Such advantages make them favorable for nonseparable real-world problems. Experiments show that our algorithms are especially useful for ensemble learning, and can achieve the lowest test error on many complex data sets when coupled with AdaBoost.
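As a rough sketch of the core idea (not the authors' exact procedure): along any search direction d, the 0/1 loss of w + a·d is piecewise constant in the step size a, changing only where some example's score crosses zero, so the best step along d can be found exactly by testing one step per interval between crossings. The Python sketch below uses hypothetical names (`zero_one_loss`, `random_coordinate_descent`) and fully random Gaussian directions; the paper's variants differ in how directions are generated and how the line search is implemented.

```python
import numpy as np

def zero_one_loss(w, X, y):
    """Fraction of examples misclassified by the perceptron sign(X @ w)."""
    return float(np.mean(np.sign(X @ w) != y))

def random_coordinate_descent(X, y, n_iters=200, seed=0):
    """Minimize training 0/1 loss by exact line search along random directions."""
    rng = np.random.default_rng(seed)
    n, dim = X.shape
    w = rng.standard_normal(dim)              # arbitrary nonzero starting weights
    best = zero_one_loss(w, X, y)
    for _ in range(n_iters):
        d = rng.standard_normal(dim)          # random search direction
        proj = X @ d
        ok = np.abs(proj) > 1e-12             # skip examples unaffected by d
        if not np.any(ok):
            continue
        # Step sizes where example i's score (x_i . (w + a*d)) crosses zero.
        crossings = np.sort(-(X[ok] @ w) / proj[ok])
        # The loss of w + a*d is piecewise constant in a and can only change
        # at a crossing, so testing one step per interval is exhaustive.
        cands = np.concatenate((
            [crossings[0] - 1.0],
            (crossings[:-1] + crossings[1:]) / 2.0,
            [crossings[-1] + 1.0],
        ))
        losses = [zero_one_loss(w + a * d, X, y) for a in cands]
        i = int(np.argmin(losses))
        if losses[i] < best:                  # accept only strict improvements
            best = losses[i]
            w = w + cands[i] * d
    return w, best

if __name__ == "__main__":
    # Toy usage: labels in {-1, +1}; append a constant column so w learns a bias.
    rng = np.random.default_rng(1)
    X = np.hstack([rng.standard_normal((100, 2)), np.ones((100, 1))])
    y = np.sign(X[:, 0] + 0.5 * X[:, 1] + 0.2)
    w, err = random_coordinate_descent(X, y)
    print(err)
```

Each line search here costs O(n) per candidate step; a sorted single pass over the crossings would be faster, but the brute-force version above keeps the sketch short.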
Additional Information
© 2007 IEEE. We wish to thank Yaser Abu-Mostafa and Amrit Pratap for many valuable discussions. This work was mainly supported by the Caltech SISL Graduate Fellowship.
Files
- Published - Li2007p85212007_Ieee_International_Joint_Conference_On_Neural_Networks_Vols_1-6.pdf (1.1 MB, md5:2826a2872d63ade49c3b1929c7e0a063)
Additional details
- Eprint ID
- 20237
- Resolver ID
- CaltechAUTHORS:20100930-143308739
- Funders
- Caltech SISL Graduate Fellowship
- Created
- 2010-11-11 (from EPrints datestamp field)
- Updated
- 2021-11-08 (from EPrints last_modified field)
- Series Name
- IEEE International Joint Conference on Neural Networks (IJCNN)
- Other Numbering System Name
- INSPEC Accession Number
- Other Numbering System Identifier
- 9797869