Semi-supervised domain adaptation via Fredholm integral based kernel methods

Wei Wang, Hao Wang, Zhaoxiang Zhang, Chen Zhang, Yang Gao

Research output: Contribution to journal › Article › peer-review


With the emergence of domain adaptation in the semi-supervised setting, handling noisy and complex data during classifier adaptation has become increasingly important. We argue that the large amount of unlabeled data in the target domain, which is typically used only for distribution alignment, is in fact a rich source of information for this challenge. In this paper, we propose a novel Transfer Fredholm Multiple Kernel Learning (TFMKL) framework to suppress noise in complex data distributions. First, by exploiting unlabeled target data, TFMKL learns a cross-domain predictive model through a Fredholm integral based kernel prediction framework that is proven effective for noise suppression. Second, TFMKL explicitly extends the use of unlabeled target samples beyond distribution alignment into adaptive classifier building. Third, multiple kernels are combined to induce an optimal learning space. Accordingly, TFMKL is distinguished by simultaneously providing noise resiliency, facilitating knowledge transfer, and capturing complex data characteristics. Furthermore, an effective optimization procedure based on the reduced gradient is presented, guaranteeing rapid convergence. We emphasize the adaptability of TFMKL to different domain adaptation tasks through its extension to different predictive models; in particular, two models, based on the squared loss and the hinge loss respectively, are proposed within the TFMKL framework. Comprehensive empirical studies on benchmark data sets verify the effectiveness and noise resiliency of the proposed methods.
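To illustrate the core idea of a Fredholm integral based kernel, the sketch below builds a Fredholm-type kernel by averaging a Gaussian base kernel through unlabeled target samples, which smooths out sample noise, and then forms a convex combination of such kernels over several bandwidths in the spirit of multiple kernel learning. This is a minimal illustrative sketch under our own assumptions (function names, the Gaussian base kernel, and the specific bandwidths are hypothetical), not the paper's exact TFMKL formulation.

```python
import numpy as np

def gaussian_kernel(X, Z, gamma=1.0):
    # Pairwise Gaussian (RBF) base kernel between rows of X and rows of Z.
    d2 = (X**2).sum(1)[:, None] + (Z**2).sum(1)[None, :] - 2.0 * X @ Z.T
    return np.exp(-gamma * d2)

def fredholm_kernel(X, Z, U, gamma=1.0):
    # Fredholm-type kernel built from unlabeled target samples U:
    #   K_F(x, z) = (1/|U|^2) * sum_{i,j} k(x, u_i) k(u_i, u_j) k(u_j, z)
    # Averaging the base kernel through U acts as a denoising step,
    # since isolated noisy points contribute little to the double sum.
    KxU = gaussian_kernel(X, U, gamma)   # (n, u)
    KUU = gaussian_kernel(U, U, gamma)   # (u, u)
    KUz = gaussian_kernel(U, Z, gamma)   # (u, m)
    return KxU @ KUU @ KUz / (len(U) ** 2)

def combined_kernel(X, Z, U, gammas, betas):
    # Multiple kernel learning style combination: a convex mixture of
    # Fredholm kernels with different bandwidths (betas >= 0, sum to 1).
    betas = np.asarray(betas, dtype=float)
    betas = betas / betas.sum()
    return sum(b * fredholm_kernel(X, Z, U, g) for b, g in zip(betas, gammas))
```

Because each Fredholm kernel has the form `A @ B @ A.T` with `B` positive semidefinite, the combined kernel remains a valid (symmetric, positive semidefinite) kernel, so it can be plugged into any kernel predictor such as a kernel ridge regressor or an SVM.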
Original language: English
Pages (from-to): 185-197
Number of pages: 13
Journal: Pattern Recognition
Early online date: 8 Aug 2018
Publication status: Published - Jan 2019
