Abstract
We derive a mean field algorithm for binary classification with Gaussian processes that is based on the TAP approach originally proposed in the statistical physics of disordered systems. The theory also yields an approximate leave-one-out estimator for the generalization error, which is computed at no extra computational cost. We show that from the TAP approach it is possible to derive both a simpler 'naive' mean field theory and support vector machines (SVM) as limiting cases. For both mean field algorithms and support vector machines, simulation results for three small benchmark data sets are presented. They show (1) that one may get state-of-the-art performance by using the leave-one-out estimator for model selection, and (2) that the built-in leave-one-out estimators are extremely precise when compared to the exact leave-one-out estimate. The latter result is taken as strong support for the internal consistency of the mean field approach.
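The paper's TAP-based leave-one-out estimator is not reproduced here, but the flavor of a "free" leave-one-out estimate can be illustrated with the closed-form leave-one-out formula for Gaussian process regression on ±1 labels (a least-squares classifier): with `alpha = (K + noise*I)^{-1} y`, the leave-one-out residual for point `i` is `alpha_i / (K + noise*I)^{-1}_{ii}`, requiring no retraining. The kernel, data, and hyperparameters below are hypothetical toy choices, and the sketch verifies the shortcut against brute-force retraining:

```python
import numpy as np

def rbf_kernel(X, Z, lengthscale=1.0):
    """Squared-exponential kernel matrix between row sets X and Z."""
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-0.5 * d2 / lengthscale**2)

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 2))
y = np.sign(X[:, 0] + 0.3 * rng.normal(size=40))  # hypothetical toy labels in {-1, +1}

noise = 0.1  # assumed noise/regularization level
A = rbf_kernel(X, X) + noise * np.eye(len(X))
A_inv = np.linalg.inv(A)
alpha = A_inv @ y

# Closed-form leave-one-out predictions: f_loo_i = y_i - alpha_i / (A^{-1})_{ii}.
# Each entry equals the prediction at x_i of a model trained without point i.
f_loo = y - alpha / np.diag(A_inv)

# Brute-force check: actually retrain with each point removed.
f_brute = np.empty_like(f_loo)
for i in range(len(X)):
    mask = np.arange(len(X)) != i
    A_i = rbf_kernel(X[mask], X[mask]) + noise * np.eye(mask.sum())
    a_i = np.linalg.solve(A_i, y[mask])
    f_brute[i] = rbf_kernel(X[i : i + 1], X[mask]) @ a_i

assert np.allclose(f_loo, f_brute)
loo_error = np.mean(np.sign(f_loo) != y)  # leave-one-out misclassification rate
```

Because `f_loo` costs only one matrix inversion, `loo_error` can be evaluated across a grid of hyperparameters (lengthscale, noise) for model selection, which is the use the abstract advocates for the mean field estimators.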
| Original language | English |
|---|---|
| Pages (from-to) | 2655-2684 |
| Number of pages | 30 |
| Journal | Neural Computation |
| Volume | 12 |
| Issue number | 11 |
| DOIs | |
| Publication status | Published - Nov 2000 |
Keywords
- mean field algorithm
- binary classification
- Gaussian processes
- TAP approach
- Statistical Physics
- disordered systems
- estimator
- generalization error
- computational cost
- support vector machines (SVM)