Cross-Domain Metric and Multiple Kernel Learning Based on Information Theory. / Wang, Wei; Wang, Hao; Zhang, Chen; Gao, Yang.
In: Neural Computation, Vol. 30, No. 3, 03.2018, p. 820-855.
Research output: Contribution to journal › Article › peer-review
TY - JOUR
T1 - Cross-Domain Metric and Multiple Kernel Learning Based on Information Theory
AU - Wang, Wei
AU - Wang, Hao
AU - Zhang, Chen
AU - Gao, Yang
PY - 2018/3
Y1 - 2018/3
N2 - Learning an appropriate distance metric plays a substantial role in the success of many learning machines. Conventional metric learning algorithms have limited utility when the training and test samples are drawn from related but different domains (i.e., source domain and target domain). In this letter, we propose two novel metric learning algorithms for domain adaptation in an information-theoretic setting, allowing for discriminating power transfer and standard learning machine propagation across two domains. In the first one, a cross-domain Mahalanobis distance is learned by combining three goals: reducing the distribution difference between different domains, preserving the geometry of target domain data, and aligning the geometry of source domain data with label information. Furthermore, we devote our efforts to solving complex domain adaptation problems and go beyond linear cross-domain metric learning by extending the first method to a multiple kernel learning framework. A convex combination of multiple kernels and a linear transformation are adaptively learned in a single optimization, which greatly benefits the exploration of prior knowledge and the description of data characteristics. Comprehensive experiments in three real-world applications (face recognition, text classification, and object categorization) verify that the proposed methods outperform state-of-the-art metric learning and domain adaptation methods.
AB - Learning an appropriate distance metric plays a substantial role in the success of many learning machines. Conventional metric learning algorithms have limited utility when the training and test samples are drawn from related but different domains (i.e., source domain and target domain). In this letter, we propose two novel metric learning algorithms for domain adaptation in an information-theoretic setting, allowing for discriminating power transfer and standard learning machine propagation across two domains. In the first one, a cross-domain Mahalanobis distance is learned by combining three goals: reducing the distribution difference between different domains, preserving the geometry of target domain data, and aligning the geometry of source domain data with label information. Furthermore, we devote our efforts to solving complex domain adaptation problems and go beyond linear cross-domain metric learning by extending the first method to a multiple kernel learning framework. A convex combination of multiple kernels and a linear transformation are adaptively learned in a single optimization, which greatly benefits the exploration of prior knowledge and the description of data characteristics. Comprehensive experiments in three real-world applications (face recognition, text classification, and object categorization) verify that the proposed methods outperform state-of-the-art metric learning and domain adaptation methods.
U2 - 10.1162/neco_a_01053
DO - 10.1162/neco_a_01053
M3 - Article
VL - 30
SP - 820
EP - 855
JO - Neural Computation
JF - Neural Computation
SN - 0899-7667
IS - 3
ER -
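
Note on the methods named in the abstract (a general sketch of the standard forms, not the paper's exact information-theoretic objective): a cross-domain Mahalanobis distance is parameterized by a learned positive semidefinite matrix M, and the multiple kernel extension learns a convex combination of base kernels. In LaTeX, with P base kernels and weight notation \beta assumed for illustration:

    % Standard Mahalanobis distance with learned PSD matrix M
    d_M(x_i, x_j) = \sqrt{(x_i - x_j)^\top M \, (x_i - x_j)}, \qquad M \succeq 0

    % Convex combination of base kernels k_1, ..., k_P (assumed notation)
    k_\beta(x_i, x_j) = \sum_{p=1}^{P} \beta_p \, k_p(x_i, x_j), \qquad \beta_p \ge 0, \quad \sum_{p=1}^{P} \beta_p = 1

The convexity constraints on \beta keep k_\beta a valid (positive semidefinite) kernel, which is what allows the kernel weights and the linear transformation to be learned jointly in a single optimization, as the abstract describes.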