Learning with Consistency between Inductive Functions and Kernels: Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems

Haixuan Yang, Irwin King, Michael R. Lyu

Research output: Contribution to conference › Paper › peer-review


Abstract

Regularized Least Squares (RLS) algorithms avoid over-fitting and express their solutions as kernel expansions. However, we observe that current RLS algorithms cannot provide a satisfactory interpretation even of the penalty imposed on a constant function. Based on the intuition that a good kernel-based inductive function should be consistent with both the data and the kernel, a novel learning scheme is proposed. The advantages of this scheme lie in its corresponding Representer Theorem, its strong ability to interpret which kinds of functions should not be penalized, and its promising accuracy improvements shown in a number of experiments. Furthermore, we provide a detailed technical description of heat kernels, which serves as an example for readers who wish to apply similar techniques to other kernels. Our work provides a preliminary step in a new direction: exploring the varying consistency between inductive functions and kernels under various distributions.
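For context, the standard kernel RLS formulation that the abstract contrasts against can be sketched as follows. This is the textbook objective (kernel ridge regression) and its Representer Theorem solution, not the consistency-based scheme the paper proposes; the symbols K, λ, and α follow the usual conventions rather than the paper's own notation.

```latex
% Standard RLS over an RKHS \mathcal{H}_K with kernel K (textbook form, not the paper's scheme):
\min_{f \in \mathcal{H}_K} \; \frac{1}{n}\sum_{i=1}^{n}\bigl(y_i - f(x_i)\bigr)^2
  + \lambda \,\|f\|_{\mathcal{H}_K}^{2}

% By the Representer Theorem, the minimizer is a kernel expansion over the training points,
% with coefficients obtained from the Gram matrix \mathbf{K}_{ij} = K(x_i, x_j):
f^{*}(x) = \sum_{i=1}^{n} \alpha_i \, K(x, x_i),
\qquad \boldsymbol{\alpha} = (\mathbf{K} + n\lambda \mathbf{I})^{-1}\mathbf{y}

% Note: a constant function f(x) = c generally has a nonzero RKHS norm for common kernels
% (e.g. Gaussian or heat kernels), so it is penalized by the regularizer; this is the
% interpretability gap the abstract points to.
```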
Original language: English
Pages: 1849-1856
Number of pages: 8
Publication status: Published - 2008
Event: Advances in Neural Information Processing Systems 21, Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems - Vancouver, Canada
Duration: 8 Dec 2008 – 11 Dec 2008

Conference

Conference: Advances in Neural Information Processing Systems 21, Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems
Country/Territory: Canada
City: Vancouver
Period: 8/12/08 – 11/12/08

Keywords

  • Regularized Least Squares
  • Kernels
