Learning with Consistency between Inductive Functions and Kernels: Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems. / Yang, Haixuan; King, Irwin; Lyu, Michael R.

2008. 1849-1856. Paper presented at Advances in Neural Information Processing Systems 21, Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems, Vancouver, Canada.

Research output: Contribution to conference › Paper › peer-review

Published

Harvard

Yang, H, King, I & Lyu, MR 2008, 'Learning with Consistency between Inductive Functions and Kernels: Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems', Paper presented at Advances in Neural Information Processing Systems 21, Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems, Vancouver, Canada, 8/12/08 - 11/12/08 pp. 1849-1856.

APA

Yang, H., King, I., & Lyu, M. R. (2008). Learning with Consistency between Inductive Functions and Kernels: Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems. 1849-1856. Paper presented at Advances in Neural Information Processing Systems 21, Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems, Vancouver, Canada.

Vancouver

Yang H, King I, Lyu MR. Learning with Consistency between Inductive Functions and Kernels: Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems. 2008. Paper presented at Advances in Neural Information Processing Systems 21, Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems, Vancouver, Canada.

Author

Yang, Haixuan ; King, Irwin ; Lyu, Michael R. / Learning with Consistency between Inductive Functions and Kernels: Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems. 2008. Paper presented at Advances in Neural Information Processing Systems 21, Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems, Vancouver, Canada. 8 p.

BibTeX

@conference{6ecc2cd97c4c42f58817ad059a6645d2,
title = "Learning with Consistency between Inductive Functions and Kernels: Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems",
abstract = "Regularized Least Squares (RLS) algorithms have the ability to avoid over-fitting problems and to express solutions as kernel expansions. However, we observe that the current RLS algorithms cannot provide a satisfactory interpretation even on the penalty of a constant function. Based on the intuition that a good kernel-based inductive function should be consistent with both the data and the kernel, a novel learning scheme is proposed. The advantages of this scheme lie in its corresponding Representer Theorem, its strong interpretation ability about what kind of functions should not be penalized, and its promising accuracy improvements shown in a number of experiments. Furthermore, we provide a detailed technical description about heat kernels, which serves as an example for the readers to apply similar techniques for other kernels. Our work provides a preliminary step in a new direction to explore the varying consistency between inductive functions and kernels under various distributions.",
keywords = "Regularized Least Square, Kernels",
author = "Haixuan Yang and Irwin King and Lyu, {Michael R.}",
year = "2008",
language = "English",
pages = "1849--1856",
note = "Advances in Neural Information Processing Systems 21, Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems ; Conference date: 08-12-2008 Through 11-12-2008",
}

RIS

TY - CONF

T1 - Learning with Consistency between Inductive Functions and Kernels

T2 - Advances in Neural Information Processing Systems 21, Proceedings of the Twenty-Second Annual Conference on Neural Information Processing Systems

AU - Yang, Haixuan

AU - King, Irwin

AU - Lyu, Michael R.

PY - 2008

Y1 - 2008

N2 - Regularized Least Squares (RLS) algorithms have the ability to avoid over-fitting problems and to express solutions as kernel expansions. However, we observe that the current RLS algorithms cannot provide a satisfactory interpretation even on the penalty of a constant function. Based on the intuition that a good kernel-based inductive function should be consistent with both the data and the kernel, a novel learning scheme is proposed. The advantages of this scheme lie in its corresponding Representer Theorem, its strong interpretation ability about what kind of functions should not be penalized, and its promising accuracy improvements shown in a number of experiments. Furthermore, we provide a detailed technical description about heat kernels, which serves as an example for the readers to apply similar techniques for other kernels. Our work provides a preliminary step in a new direction to explore the varying consistency between inductive functions and kernels under various distributions.

AB - Regularized Least Squares (RLS) algorithms have the ability to avoid over-fitting problems and to express solutions as kernel expansions. However, we observe that the current RLS algorithms cannot provide a satisfactory interpretation even on the penalty of a constant function. Based on the intuition that a good kernel-based inductive function should be consistent with both the data and the kernel, a novel learning scheme is proposed. The advantages of this scheme lie in its corresponding Representer Theorem, its strong interpretation ability about what kind of functions should not be penalized, and its promising accuracy improvements shown in a number of experiments. Furthermore, we provide a detailed technical description about heat kernels, which serves as an example for the readers to apply similar techniques for other kernels. Our work provides a preliminary step in a new direction to explore the varying consistency between inductive functions and kernels under various distributions.

KW - Regularized Least Square

KW - Kernels

M3 - Paper

SP - 1849

EP - 1856

Y2 - 8 December 2008 through 11 December 2008

ER -
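
The abstract above refers to the classical kernel RLS setup: by the Representer Theorem, the minimizer of the regularized empirical risk is a kernel expansion over the training points. As a point of reference only, here is a minimal sketch of that baseline with a Gaussian (heat-style) kernel on Euclidean data. The function names, bandwidth t, and regularization lam are illustrative assumptions; the paper's consistency-based modification of the penalty is not reproduced here.

import numpy as np

def gaussian_kernel(A, B, t=0.5):
    # K(a, b) = exp(-||a - b||^2 / (4t)): the Euclidean heat-kernel form
    # (up to normalization); t is a hypothetical bandwidth choice.
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-sq / (4.0 * t))

def rls_fit(X, y, lam=1e-2, t=0.5):
    # Classical RLS: minimize (1/n) sum_i (f(x_i) - y_i)^2 + lam ||f||_K^2.
    # The Representer Theorem gives f(x) = sum_i alpha_i K(x, x_i), with
    # coefficients solving (K + lam * n * I) alpha = y.
    n = X.shape[0]
    K = gaussian_kernel(X, X, t)
    return np.linalg.solve(K + lam * n * np.eye(n), y)

def rls_predict(X_train, alpha, X_new, t=0.5):
    # Evaluate the kernel expansion f(x) = sum_i alpha_i K(x, x_i).
    return gaussian_kernel(X_new, X_train, t) @ alpha

# Toy usage. Note that even a constant target is penalized here, since the
# RKHS norm of its kernel expansion is positive -- the interpretability gap
# on constant functions that the paper's consistency-based scheme addresses.
rng = np.random.default_rng(0)
X = rng.uniform(-1, 1, size=(50, 1))
y = np.sin(3 * X[:, 0]) + 0.1 * rng.standard_normal(50)
alpha = rls_fit(X, y)
print(rls_predict(X, alpha, np.array([[0.0], [0.5]])))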