An algebraic characterization of the optimum of regularized kernel methods

The representer theorem for kernel methods states that the solution of the associated variational problem can be expressed as a linear combination of a finite number of kernel functions. However, for non-smooth loss functions, the analytic characterization of the coefficients poses nontrivial problems. Standard approaches resort to constrained optimization reformulations which, in general, lack a closed-form solution. Herein, by a proper change of variable, it is shown that, for any convex loss function, the coefficients satisfy a system of algebraic equations in fixed-point form, which may be obtained directly from the primal formulation. The algebraic characterization is specialized to regression and classification methods, and the fixed-point equations are explicitly characterized for many loss functions of practical interest. The consequences of the main result are then investigated along two directions. First, the existence of an unconstrained smooth reformulation of the original non-smooth problem is proven. Second, in the context of SURE (Stein’s Unbiased Risk Estimation), a general formula for the degrees of freedom of kernel regression methods is derived.
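For orientation, the objects mentioned in the abstract can be written out explicitly. The sketch below uses standard Tikhonov notation (kernel K, coefficients c_i, regularization parameter lambda); the exact scaling of the fixed-point map is an assumption of this sketch and may differ from the paper's own conventions.

\[
\min_{f \in \mathcal{H}} \; \sum_{i=1}^{n} L\bigl(y_i, f(x_i)\bigr) + \lambda \lVert f \rVert_{\mathcal{H}}^{2},
\qquad \text{with minimizer} \qquad
f^{*}(x) = \sum_{i=1}^{n} c_i \, K(x, x_i).
\]

For a convex loss \(L\), the coefficients then satisfy fixed-point inclusions of the form

\[
c_i \in -\frac{1}{2\lambda}\, \partial L\!\Bigl(y_i,\; \sum_{j=1}^{n} K(x_i, x_j)\, c_j\Bigr),
\qquad i = 1, \dots, n,
\]

where \(\partial L\) denotes the subdifferential of \(L\) with respect to its second argument; when \(L\) is differentiable, the inclusions reduce to an ordinary system of algebraic equations.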

Author(s): Dinuzzo, F. and De Nicolao, G.
Journal: Machine Learning
Volume: 74
Number (issue): 3
Pages: 315-345
Year: 2009
Month: March

Department(s): Empirical Inference
BibTeX Type: Article (article)

DOI: 10.1007/s10994-008-5095-1

BibTeX

@article{DinuzzoD2009,
  title = {An algebraic characterization of the optimum of regularized kernel methods},
  author = {Dinuzzo, F. and De Nicolao, G.},
  journal = {Machine Learning},
  volume = {74},
  number = {3},
  pages = {315--345},
  month = mar,
  year = {2009},
  doi = {10.1007/s10994-008-5095-1}
}