Supervised Feature Selection via Dependence Estimation

2007

Conference Paper

We introduce a framework for filtering features that employs the Hilbert-Schmidt Independence Criterion (HSIC) as a measure of dependence between the features and the labels. The key idea is that good features should maximise such dependence. Feature selection for various supervised learning problems (including classification and regression) is unified under this framework, and the solutions can be approximated using a backward-elimination algorithm. We demonstrate the usefulness of our method on both artificial and real world datasets.
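The abstract's key components (an empirical HSIC estimate and greedy backward elimination of features) can be sketched as follows. This is an illustrative reconstruction, not the authors' reference implementation; the kernel choices (RBF on features, linear on labels), the bandwidth, and the function names are assumptions made for the example.

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    # Gaussian RBF kernel matrix from pairwise squared distances (assumed bandwidth).
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma ** 2))

def hsic(K, L):
    # Biased empirical HSIC estimate: tr(K H L H) / (n - 1)^2,
    # with H the centering matrix I - (1/n) 11^T.
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

def backward_elimination(X, y, n_keep):
    # Greedily remove the feature whose removal leaves the remaining
    # features maximally dependent (by HSIC) on the labels.
    Y = y.reshape(-1, 1).astype(float)
    L = Y @ Y.T  # linear kernel on the labels (assumed choice)
    features = list(range(X.shape[1]))
    while len(features) > n_keep:
        scores = []
        for f in features:
            rest = [g for g in features if g != f]
            scores.append(hsic(rbf_kernel(X[:, rest]), L))
        features.pop(int(np.argmax(scores)))
    return features
```

On toy data where only one feature determines the label, the procedure should retain that feature; kernel and bandwidth choices would need tuning on real problems.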

Author(s): Song, L. and Smola, AJ. and Gretton, A. and Borgwardt, KM. and Bedo, J.
Book Title: Proceedings of the 24th Annual International Conference on Machine Learning (ICML 2007)
Pages: 823-830
Year: 2007
Month: June
Editors: Ghahramani, Z.
Publisher: ACM Press

Department(s): Empirical Inference
Bibtex Type: Conference Paper (inproceedings)

DOI: 10.1145/1273496.1273600
Event Name: Twenty-Fourth Annual International Conference on Machine Learning (ICML 2007)
Event Place: Corvallis, OR, USA

Address: New York, NY, USA
ISBN: 978-1-59593-793-3
Language: en
Organization: Max-Planck-Gesellschaft
School: Biologische Kybernetik

BibTex

@inproceedings{4462,
  title = {Supervised Feature Selection via Dependence Estimation},
  author = {Song, L. and Smola, AJ. and Gretton, A. and Borgwardt, KM. and Bedo, J.},
  booktitle = {Proceedings of the 24th Annual International Conference on Machine Learning (ICML 2007)},
  pages = {823--830},
  editor = {Ghahramani, Z.},
  publisher = {ACM Press},
  organization = {Max-Planck-Gesellschaft},
  school = {Biologische Kybernetik},
  address = {New York, NY, USA},
  month = jun,
  year = {2007},
  doi = {10.1145/1273496.1273600}
}