Abstract
Using methods of statistical physics, we investigate the generalization performance of support vector machines (SVMs), which have recently been introduced as a general alternative to neural networks. For nonlinear classification rules, the generalization error saturates on a plateau when the number of examples is too small to properly estimate the coefficients of the nonlinear part. When trained on simple rules, we find that SVMs overfit only weakly. The performance of SVMs is strongly enhanced when the distribution of the inputs has a gap in feature space.
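The plateau effect described above can be observed empirically. Below is a minimal, hypothetical sketch (not the paper's statistical-physics calculation) that trains an SVM with a nonlinear RBF kernel on increasing numbers of examples from an illustrative quadratic "teacher" rule and measures the test error; the teacher rule, kernel, and parameters are assumptions chosen for illustration only.

```python
# Hypothetical illustration, not from the paper: measure the generalization
# error of an RBF-kernel SVM as the number of training examples grows. For
# small training sets the error is expected to remain on a plateau, since the
# nonlinear part of the rule cannot yet be estimated reliably.
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)

def teacher(X):
    # Illustrative nonlinear teacher rule: sign of a quadratic form.
    return np.sign(X[:, 0] ** 2 - X[:, 1] ** 2 + 0.1)

# Large held-out test set to estimate the generalization error.
X_test = rng.standard_normal((5000, 2))
y_test = teacher(X_test)

for n in [10, 30, 100, 300, 1000]:
    X_train = rng.standard_normal((n, 2))
    y_train = teacher(X_train)
    clf = SVC(kernel="rbf", C=10.0).fit(X_train, y_train)
    err = np.mean(clf.predict(X_test) != y_test)
    print(f"n = {n:5d}  generalization error ~ {err:.3f}")
```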
| Original language | English |
| --- | --- |
| Pages (from-to) | 2975-2978 |
| Number of pages | 4 |
| Journal | Physical Review Letters |
| Volume | 82 |
| Issue number | 14 |
| DOIs | |
| Publication status | Published - 5 Apr 1999 |
Bibliographical note
Copyright of the American Physical Society

Keywords
- statistical physics
- support vector machines
- neural networks
- nonlinear classification
- generalization error