Output details
11 - Computer Science and Informatics
University of East Anglia
Efficient approximate leave-one-out cross-validation for kernel logistic regression
Careful tuning of the regularisation and kernel parameters is vital to maximising the generalisation performance of kernel learning methods. We provide a computationally inexpensive method for tuning kernel logistic regression models that is demonstrated to be competitive with Bayesian approaches. The method has been implemented in Cawley's publicly downloadable Generalised Kernel Machine toolbox for MATLAB. The efficient model selection procedure proposed in this paper makes kernel logistic regression a practical alternative to the support vector machine in applications where estimates of the a-posteriori probability of class membership are required, rather than a purely discriminative classification.
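The general idea behind efficient approximate leave-one-out cross-validation for kernel logistic regression can be sketched as follows. This is a hedged illustration, not the toolbox implementation (which is in MATLAB): it assumes the model is fitted by iteratively re-weighted least squares (IRLS), and that the final IRLS iteration is treated as a weighted ridge regression so that leave-one-out residuals follow from the diagonal of the hat matrix in closed form. The function names `klr_aloo` and `rbf_kernel` and all parameter choices are hypothetical.

```python
import numpy as np

def rbf_kernel(X, Y, gamma):
    """RBF kernel matrix K[i, j] = exp(-gamma * ||x_i - y_j||^2)."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def klr_aloo(X, y, gamma, lam, n_iter=25):
    """Fit kernel logistic regression by IRLS and return an approximate
    leave-one-out log-loss (a sketch: the last IRLS step is treated as a
    weighted ridge regression, giving closed-form held-out residuals)."""
    n = len(y)
    K = rbf_kernel(X, X, gamma)
    alpha = np.zeros(n)
    for _ in range(n_iter):
        f = K @ alpha
        p = 1.0 / (1.0 + np.exp(-f))
        w = np.clip(p * (1 - p), 1e-8, None)   # IRLS weights
        z = f + (y - p) / w                     # working responses
        A = K + lam * np.diag(1.0 / w)
        alpha = np.linalg.solve(A, z)           # weighted ridge solve
    f = K @ alpha
    h = np.diag(K @ np.linalg.inv(A))           # leverages of the final step
    f_loo = z - (z - f) / (1.0 - h)             # approximate held-out scores
    p_loo = 1.0 / (1.0 + np.exp(-f_loo))
    eps = 1e-12
    return -np.mean(y * np.log(p_loo + eps)
                    + (1 - y) * np.log(1 - p_loo + eps))
```

Because the leverages come from quantities already computed during training, the approximate leave-one-out loss costs little beyond a single fit, so it can be minimised over the regularisation parameter `lam` and kernel parameter `gamma` by grid search or gradient-based optimisation.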