
Output details

11 - Computer Science and Informatics

University College London

Output title

Kernel choice and classifiability for RKHS embeddings of probability distributions

Type
E - Conference contribution
DOI
-
Name of conference/published proceedings
Advances in Neural Information Processing Systems 22
Volume number
-
Issue number
-
First page of article
1750
ISSN of proceedings
-
Year of publication
2009
Number of additional authors
4
Additional information

Two-sample testing can be accomplished in high dimensions and on non-Euclidean spaces (distributions over documents or graphs) by representing these distributions in a high-dimensional reproducing kernel Hilbert space (feature space). There are infinitely many possible feature spaces, however, and one needs to choose the one that best distinguishes the distributions. This paper shows that a good kernel for hypothesis testing optimizes the classifiability of the samples, in the process linking for the first time relatively new results on kernel hypothesis testing with classical results on classification. NIPS 2009 talk (awarded to 2% of submissions); runner-up for the best student paper award.
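To illustrate the kind of test the abstract describes, the following is a minimal sketch of a kernel two-sample statistic (an unbiased squared maximum mean discrepancy estimate), where the choice of kernel plays exactly the role discussed above. The Gaussian kernel, its bandwidth `sigma`, and the synthetic data are illustrative assumptions, not details taken from the paper:

```python
import numpy as np

def gaussian_kernel(X, Y, sigma=1.0):
    # Pairwise RBF kernel: k(x, y) = exp(-||x - y||^2 / (2 sigma^2))
    d2 = np.sum(X**2, 1)[:, None] + np.sum(Y**2, 1)[None, :] - 2.0 * X @ Y.T
    return np.exp(-d2 / (2.0 * sigma**2))

def mmd2_unbiased(X, Y, sigma=1.0):
    # Unbiased estimate of the squared MMD between the distributions
    # generating samples X and Y, for the chosen kernel.
    m, n = len(X), len(Y)
    Kxx = gaussian_kernel(X, X, sigma)
    Kyy = gaussian_kernel(Y, Y, sigma)
    Kxy = gaussian_kernel(X, Y, sigma)
    term_x = (Kxx.sum() - np.trace(Kxx)) / (m * (m - 1))  # exclude diagonal
    term_y = (Kyy.sum() - np.trace(Kyy)) / (n * (n - 1))
    return term_x + term_y - 2.0 * Kxy.mean()

rng = np.random.default_rng(0)
X = rng.normal(0.0, 1.0, size=(200, 2))
Y = rng.normal(0.0, 1.0, size=(200, 2))   # same distribution as X
Z = rng.normal(3.0, 1.0, size=(200, 2))   # shifted distribution
same = mmd2_unbiased(X, Y)   # near zero under the null
diff = mmd2_unbiased(X, Z)   # clearly positive under the alternative
```

A poorly chosen bandwidth (e.g. far larger than the scale separating the distributions) shrinks `diff` toward `same`, which is the practical face of the kernel-choice problem the paper addresses.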

Interdisciplinary
-
Cross-referral requested
-
Research group
None
Citation count
12
Proposed double-weighted
No
Double-weighted statement
-
Reserve for a double-weighted output
No
Non-English
No
English abstract
-