Unsupervised Domain Adaptation for Distance Metric Learning
Publication Date: May 6, 2019
Event: Seventh International Conference on Learning Representations (ICLR 2019)
Reference: pp. 1-18, 2019
Authors: Kihyuk Sohn, NEC Laboratories America, Inc.; Wenling Shang, University of Amsterdam; Xiang Yu, NEC Laboratories America, Inc.; Manmohan Chandraker, NEC Laboratories America, Inc.
Abstract: Unsupervised domain adaptation is a promising avenue to enhance the performance of deep neural networks on a target domain, using labels only from a source domain. However, the two predominant methods, domain discrepancy reduction learning and semi-supervised learning, are not readily applicable when source and target domains do not share a common label space. This paper addresses the above scenario by learning a representation space that retains discriminative power on both the (labeled) source and (unlabeled) target domains while keeping representations for the two domains well-separated. Inspired by a theoretical analysis, we first reformulate the disjoint classification task, where the source and target domains correspond to non-overlapping class labels, as a verification task. To handle both within-domain and cross-domain verification, we propose a Feature Transfer Network (FTN) that separates the target feature space from the original source space while aligning it with a transformed source space. Moreover, we present a non-parametric multi-class entropy minimization loss to further boost the discriminative power of FTNs on the target domain. In experiments, we first illustrate how FTN works in a controlled setting of adapting from MNIST-M to MNIST with disjoint digit classes between the two domains, and then demonstrate the effectiveness of FTNs through state-of-the-art performance on a cross-ethnicity face recognition problem.
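The entropy-minimization idea mentioned in the abstract can be illustrated with a short sketch. The snippet below is a minimal PyTorch illustration, assuming L2-normalized embeddings and a softmax over cosine similarities to a set of non-parametric exemplar features; the function name, the temperature parameter, and the exemplar construction are illustrative assumptions, not the authors' exact formulation.

```python
import torch
import torch.nn.functional as F

def entropy_minimization_loss(target_feats, exemplar_feats, temperature=0.1):
    """Sketch of a non-parametric multi-class entropy minimization loss.

    target_feats:   (N, D) embeddings of unlabeled target-domain samples.
    exemplar_feats: (K, D) exemplar embeddings acting as non-parametric
                    class references (an assumption for this sketch).
    """
    # Work with unit-length embeddings so dot products are cosine similarities.
    target_feats = F.normalize(target_feats, dim=1)
    exemplar_feats = F.normalize(exemplar_feats, dim=1)

    # Similarities to exemplars define a softmax "class" distribution
    # for each target sample, without any parametric classifier.
    logits = target_feats @ exemplar_feats.t() / temperature
    probs = F.softmax(logits, dim=1)

    # Shannon entropy of each distribution; minimizing it encourages each
    # target sample to commit confidently to one exemplar, sharpening the
    # discriminative structure of the target feature space.
    entropy = -(probs * torch.log(probs.clamp_min(1e-8))).sum(dim=1)
    return entropy.mean()
```

In a training loop, this term would be added to the verification objective for unlabeled target batches, with the temperature controlling how sharply the similarity distribution is peaked.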
Publication Link: https://openreview.net/pdf?id=BklhAj09K7