Semi-supervised domain adaptation via Fredholm integral based kernel methods. / Wang, Wei; Wang, Hao; Zhang, Chen; Gao, Yang.

In: Pattern Recognition, Vol. 85, 01.2019, p. 185-197.

Research output: Contribution to journal › Article

Published

Author

Wang, Wei; Wang, Hao; Zhang, Chen; Gao, Yang. / Semi-supervised domain adaptation via Fredholm integral based kernel methods. In: Pattern Recognition. 2019; Vol. 85, pp. 185-197.

BibTeX

@article{97676a77a02e4cf49c6a13c51d56d489,
title = "Semi-supervised domain adaptation via Fredholm integral based kernel methods",
abstract = "With the emergence of domain adaptation in the semi-supervised setting, handling noisy and complex data during classifier adaptation has become increasingly important. We believe the large amount of unlabeled data in the target domain, which is typically used only for distribution alignment, is a rich source of information for this challenge. In this paper, we propose a novel Transfer Fredholm Multiple Kernel Learning (TFMKL) framework to suppress noise in complex data distributions. First, by exploiting unlabeled target data, TFMKL learns a cross-domain predictive model through a Fredholm integral based kernel prediction framework that is proven to be effective in noise suppression. Second, TFMKL explicitly extends the use of unlabeled target samples to both adaptive classifier building and distribution alignment. Third, multiple kernels are explored to induce an optimal learning space. TFMKL is thus distinguished by simultaneously allowing for noise resiliency, facilitating knowledge transfer, and analyzing complex data characteristics. Furthermore, an effective optimization procedure based on the reduced gradient method is presented, guaranteeing rapid convergence. We emphasize the adaptability of TFMKL to different domain adaptation tasks, since it extends to different predictive models; in particular, two models based on squared loss and hinge loss respectively are proposed within the TFMKL framework. Comprehensive empirical studies on benchmark data sets verify the effectiveness and noise resiliency of the proposed methods.",
author = "Wei Wang and Hao Wang and Chen Zhang and Yang Gao",
year = "2019",
month = jan,
doi = "10.1016/j.patcog.2018.07.035",
language = "English",
volume = "85",
pages = "185--197",
journal = "Pattern Recognition",
issn = "0031-3203",
publisher = "Elsevier Limited",
}
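
The abstract describes a Fredholm integral based kernel prediction framework built from unlabeled target data. As a rough illustration only, the sketch below implements the generic discretized Fredholm kernel K_F(x, y) = (1/u^2) * sum_{i,j} k(x, z_i) k_H(z_i, z_j) k(z_j, y) over unlabeled samples z_i, as used in the Fredholm-kernel literature; the RBF kernel choices, bandwidths, and function names here are assumptions for illustration, not the paper's exact TFMKL formulation.

import numpy as np

def rbf(X, Z, gamma):
    # Pairwise RBF (Gaussian) kernel matrix between rows of X and rows of Z.
    sq_dists = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * sq_dists)

def fredholm_kernel(X, Y, U, gamma_outer=1.0, gamma_inner=1.0):
    # Discretized Fredholm kernel:
    #   K_F(x, y) = (1/u^2) * k(x, U) @ k_H(U, U) @ k(U, y),
    # where U holds the unlabeled (e.g. target-domain) samples. Averaging
    # over U is what gives the construction its noise-suppressing effect.
    u = U.shape[0]
    K_xU = rbf(X, U, gamma_outer)   # outer kernel: inputs -> unlabeled set
    K_UU = rbf(U, U, gamma_inner)   # inner RKHS kernel on the unlabeled set
    K_Uy = rbf(U, Y, gamma_outer)   # outer kernel: unlabeled set -> inputs
    return (K_xU @ K_UU @ K_Uy) / (u ** 2)

# Example: a 5x5 Fredholm kernel matrix using 20 unlabeled target samples.
rng = np.random.default_rng(0)
X = rng.normal(size=(5, 3))
U = rng.normal(size=(20, 3))
K_F = fredholm_kernel(X, X, U)
print(K_F.shape)  # (5, 5)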

RIS

TY  - JOUR
T1  - Semi-supervised domain adaptation via Fredholm integral based kernel methods
AU  - Wang, Wei
AU  - Wang, Hao
AU  - Zhang, Chen
AU  - Gao, Yang
PY  - 2019/1
Y1  - 2019/1
AB  - With the emergence of domain adaptation in the semi-supervised setting, handling noisy and complex data during classifier adaptation has become increasingly important. We believe the large amount of unlabeled data in the target domain, which is typically used only for distribution alignment, is a rich source of information for this challenge. In this paper, we propose a novel Transfer Fredholm Multiple Kernel Learning (TFMKL) framework to suppress noise in complex data distributions. First, by exploiting unlabeled target data, TFMKL learns a cross-domain predictive model through a Fredholm integral based kernel prediction framework that is proven to be effective in noise suppression. Second, TFMKL explicitly extends the use of unlabeled target samples to both adaptive classifier building and distribution alignment. Third, multiple kernels are explored to induce an optimal learning space. TFMKL is thus distinguished by simultaneously allowing for noise resiliency, facilitating knowledge transfer, and analyzing complex data characteristics. Furthermore, an effective optimization procedure based on the reduced gradient method is presented, guaranteeing rapid convergence. We emphasize the adaptability of TFMKL to different domain adaptation tasks, since it extends to different predictive models; in particular, two models based on squared loss and hinge loss respectively are proposed within the TFMKL framework. Comprehensive empirical studies on benchmark data sets verify the effectiveness and noise resiliency of the proposed methods.
DO  - 10.1016/j.patcog.2018.07.035
M3  - Article
VL  - 85
SP  - 185
EP  - 197
JO  - Pattern Recognition
JF  - Pattern Recognition
SN  - 0031-3203
ER  -
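
The abstract also mentions learning with multiple kernels via a reduced-gradient optimization procedure. Below is a heavily simplified sketch, assuming a SimpleMKL-style convex combination of base kernels with weights constrained to the probability simplex; the update rule, step size, and helper names are illustrative assumptions, not TFMKL's actual procedure.

import numpy as np

def combine_kernels(kernel_mats, d):
    # Convex combination K = sum_m d[m] * kernel_mats[m], with d on the simplex.
    return np.tensordot(d, kernel_mats, axes=1)

def reduced_gradient_step(d, grad, lr=0.1):
    # One reduced-gradient step on the simplex (SimpleMKL-flavored sketch):
    # the largest-weight kernel serves as the reference coordinate so the
    # update direction sums to zero and the weights remain a convex combination.
    mu = int(np.argmax(d))
    red = grad - grad[mu]                     # reduced gradient for m != mu
    red[mu] = -(red.sum() - red[mu])          # reference component balances the rest
    d_new = np.clip(d - lr * red, 0.0, None)  # crude feasibility fix (real
    return d_new / d_new.sum()                # methods use a line search)

# Example: step the weights of three base kernels given a hypothetical gradient.
d = np.array([0.5, 0.3, 0.2])
grad = np.array([0.1, -0.4, 0.3])   # hypothetical d(objective)/d(weights)
print(reduced_gradient_step(d, grad))  # weights still sum to 1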