Cross-Domain Metric and Multiple Kernel Learning Based on Information Theory

Wei Wang, Hao Wang, Chen Zhang, Yang Gao

Research output: Contribution to journal › Article › peer-review

Abstract

Learning an appropriate distance metric plays a substantial role in the success of many learning machines. Conventional metric learning algorithms have limited utility when the training and test samples are drawn from related but different domains (i.e., a source domain and a target domain). In this letter, we propose two novel metric learning algorithms for domain adaptation in an information-theoretic setting, allowing discriminative power and standard learning machines to be transferred across the two domains. In the first algorithm, a cross-domain Mahalanobis distance is learned by combining three goals: reducing the distribution difference between the domains, preserving the geometry of the target domain data, and aligning the geometry of the source domain data with its label information. To handle more complex domain adaptation problems, we then go beyond linear cross-domain metric learning by extending the first algorithm to a multiple kernel learning framework. A convex combination of multiple kernels and a linear transformation are adaptively learned in a single optimization, which facilitates both the exploitation of prior knowledge and the description of data characteristics. Comprehensive experiments on three real-world applications (face recognition, text classification, and object categorization) verify that the proposed methods outperform state-of-the-art metric learning and domain adaptation methods.
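To make the abstract's main ingredients concrete, the sketch below illustrates the two standard constructions it refers to: a Mahalanobis distance d_M(x, y) = sqrt((x-y)^T M (x-y)) with M = L^T L positive semidefinite, and a convex combination of base kernels k = sum_m beta_m k_m with beta on the simplex. This is a minimal illustration, not the authors' algorithm: the function names are hypothetical, and the linear-kernel MMD term is only one common surrogate for the "distribution difference" goal, whereas the letter uses an information-theoretic objective.

```python
import numpy as np

def mahalanobis_dist(x, y, L):
    """Mahalanobis distance d_M(x, y) = sqrt((x-y)^T M (x-y)),
    with M = L^T L parameterized by a linear map L so that M
    is positive semidefinite by construction."""
    diff = L @ (x - y)
    return np.sqrt(diff @ diff)

def combined_kernel(kernels, beta):
    """Convex combination k = sum_m beta_m k_m of precomputed base
    kernel matrices, with beta >= 0 and sum(beta) = 1."""
    beta = np.asarray(beta)
    assert np.all(beta >= 0) and np.isclose(beta.sum(), 1.0)
    return sum(b * K for b, K in zip(beta, kernels))

def mmd2(Xs, Xt, L):
    """Squared (linear-kernel) maximum mean discrepancy between source
    and target samples after mapping both through L -- an assumed stand-in
    for the cross-domain distribution-difference term."""
    d = (Xs @ L.T).mean(axis=0) - (Xt @ L.T).mean(axis=0)
    return d @ d

# Toy usage: random source/target data with a small mean shift.
rng = np.random.default_rng(0)
Xs = rng.normal(0.0, 1.0, (50, 5))   # source samples
Xt = rng.normal(0.5, 1.0, (40, 5))   # target samples
L = rng.normal(0.0, 0.1, (3, 5))     # learnable linear map, M = L^T L
print(mahalanobis_dist(Xs[0], Xt[0], L))
print(mmd2(Xs, Xt, L))               # term a learner would drive down
```

In such a formulation, L (and, in the kernelized variant, beta) would be the optimization variables; learning them jointly in a single problem is what lets the kernel weights adapt to the data while the metric is being learned.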
Original language: English
Pages (from-to): 820-855
Number of pages: 36
Journal: Neural Computation
Volume: 30
Issue number: 3
Early online date: 15 Feb 2018
DOIs
Publication status: Published - Mar 2018
