Output details

11 - Computer Science and Informatics

University of East Anglia

Output 43 of 70 in the submission
Article title

On over-fitting in model selection and subsequent selection bias in performance evaluation

Type
D - Journal article
DOI
-
Title of journal
Journal of Machine Learning Research
Article number
-
Volume number
11
Issue number
-
First page of article
2079
ISSN of journal
1532-4435
Year of publication
2010
URL
-
Number of additional authors
1
Additional information

We demonstrate that over-fitting in model selection, a problem largely neglected in machine learning, significantly degrades the generalisation performance of kernel methods (including Gaussian Processes), and that some widely used performance evaluation protocols exhibit significant optimistic biases as a result. This suggests that improvements in generalisation performance are more easily gained by developing better model selection procedures than by devising new learning methods. Counter-intuitively, more recent work reveals this is also a significant issue even where only a small number of hyper-parameters are involved. Invited plenary talks were given at IDEAL-2011 and IDA-2012.
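The selection bias described above can be illustrated with a minimal synthetic sketch (this is an illustrative toy, not the paper's experiments): if many candidate models are compared on the same validation data used to select the winner, the winner's validation score is optimistically biased, even when every candidate is no better than chance.

```python
import numpy as np

rng = np.random.default_rng(0)
n, n_models = 100, 200

# Random binary labels, and "predictions" from many candidate models
# that are all, by construction, pure coin flips (chance-level).
y = rng.integers(0, 2, size=n)
preds = rng.integers(0, 2, size=(n_models, n))

# Validation accuracy of each candidate on the SAME data used for selection.
val_acc = (preds == y).mean(axis=1)
best = int(val_acc.argmax())

# The selected model evaluated on fresh data: back to chance level.
y_test = rng.integers(0, 2, size=10_000)
test_pred = rng.integers(0, 2, size=10_000)  # still a coin flip
test_acc = (test_pred == y_test).mean()

print(f"selected model's validation accuracy: {val_acc[best]:.2f}")  # optimistic
print(f"same model on fresh data:             {test_acc:.2f}")       # near 0.50
```

The gap between the two printed numbers is the optimistic bias: the maximum over many chance-level scores is well above 0.5 purely by selection. Nesting the selection step inside an outer evaluation loop (nested cross-validation) is one standard way to remove this bias.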

Interdisciplinary
-
Cross-referral requested
-
Research group
None
Citation count
22
Proposed double-weighted
No
Double-weighted statement
-
Reserve for a double-weighted output
No
Non-English
No
English abstract
-