CGBoost: Conjugate Gradient in Function Space
- Creators
- Li, Ling
- Abu-Mostafa, Yaser S.
- Pratap, Amrit
Abstract
The superior out-of-sample performance of AdaBoost has been attributed to the fact that it minimizes a cost function based on margin, since it can be viewed as a special case of AnyBoost, an abstract gradient descent algorithm. In this paper, we provide a more sophisticated abstract boosting algorithm, CGBoost, based on conjugate gradient in function space. When the AdaBoost exponential cost function is optimized, CGBoost generally yields much lower cost and training error but higher test error, which implies that the exponential cost is vulnerable to overfitting. With the optimization power of CGBoost, we can adopt more "regularized" cost functions that have better out-of-sample performance but are difficult to optimize. Our experiments demonstrate that CGBoost generally outperforms AnyBoost in cost reduction. With suitable cost functions, CGBoost can have better out-of-sample performance.
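The abstract only outlines the method, so the following is a minimal sketch of the idea it describes: replacing AnyBoost's steepest-descent direction with a conjugate-gradient direction in function space, here on the AdaBoost exponential cost. The stump learner, the Polak-Ribière-style β, and the grid line search are illustrative assumptions, not necessarily the paper's exact choices.

```python
import numpy as np

def fit_stump(X, d):
    """Return the stump h(x) = s * sign(x_j - theta), h in {-1, +1}, that
    maximizes the inner product sum_i d_i h(x_i) (AnyBoost's fitting criterion)."""
    best_score, best = -np.inf, None
    for j in range(X.shape[1]):
        for theta in np.unique(X[:, j]):
            h = np.where(X[:, j] > theta, 1.0, -1.0)
            score = float(d @ h)
            for s in (1.0, -1.0):
                if s * score > best_score:
                    best_score, best = s * score, (j, theta, s)
    j, theta, s = best
    return lambda Z: s * np.where(Z[:, j] > theta, 1.0, -1.0)

def cgboost_exp(X, y, rounds=50):
    """Conjugate-gradient boosting on the exponential cost sum_i exp(-y_i F(x_i)),
    with y in {-1, +1}. A sketch, not the paper's exact algorithm."""
    n = len(y)
    F = np.zeros(n)                     # ensemble outputs on the training set
    g_prev = d_prev = None
    for _ in range(rounds):
        g = y * np.exp(-y * F)          # negative functional gradient of the cost
        if g_prev is None:
            d = g                       # first round: plain gradient direction
        else:
            # Polak-Ribiere-style conjugation of successive directions (assumed form)
            beta = max(0.0, float(g @ (g - g_prev)) / float(g_prev @ g_prev))
            d = g + beta * d_prev
        h = fit_stump(X, d)             # weak hypothesis approximating direction d
        hx = h(X)
        # crude grid line search for the step size along hx
        alphas = np.linspace(0.0, 2.0, 201)
        costs = [np.exp(-y * (F + a * hx)).sum() for a in alphas]
        F = F + alphas[int(np.argmin(costs))] * hx
        g_prev, d_prev = g, d
    return F
```

Forcing β = 0 in every round recovers plain gradient descent in function space (AnyBoost), which is the baseline the abstract compares against.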
Files
Name | Size
---|---
md5:52620c5d1c929a71e676c7f9d75a10ba | 465.6 kB
Additional details
- Eprint ID
- 27069
- Resolver ID
- CaltechCSTR:2003.007
- Created
- 2003-08-29 (from EPrint's datestamp field)
- Updated
- 2019-10-03 (from EPrint's last_modified field)
- Caltech groups
- Computer Science Technical Reports