Adversarial Alignment for Source Free Object Detection

Publication Date: February 7, 2023

Event: AAAI 2023

Reference: pp. 1-9, 2023

Authors: Qiaosong Chu, Tsinghua University; Shuyan Li, Tsinghua University; Guangyi Chen, MBZUAI; Kai Li, NEC Laboratories America, Inc.; Xiu Li, Tsinghua University

Abstract: Source-free object detection (SFOD) aims to transfer a detector pre-trained on a label-rich source domain to an unlabeled target domain without access to the source data. While most existing SFOD methods generate pseudo labels via a source-pretrained model to guide training, these pseudo labels are usually noisy due to the heavy domain discrepancy. To obtain better pseudo supervision, we divide the target domain into source-similar and source-dissimilar parts and align them in the feature space via adversarial learning. Specifically, we design a detection-variance-based criterion to divide the target domain, motivated by the finding that larger detection variance indicates higher recall and greater similarity to the source domain. We then incorporate an adversarial module into a mean-teacher framework to make the feature distributions of the two subsets indistinguishable. Extensive experiments on multiple cross-domain object detection datasets demonstrate that our proposed method consistently outperforms the compared SFOD methods. Our implementation is available at https://github.com/ChuQiaosong
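The sketch below illustrates the two ideas described in the abstract: splitting target images into source-similar and source-dissimilar subsets by a detection-variance score, and aligning the two subsets with an adversarial discriminator behind a gradient reversal layer. It is a minimal illustration under assumed names (`detection_variance`, `split_target`, `GradReverse`, `DomainDiscriminator`), not the authors' released implementation; see the GitHub repository linked above for the actual code.

```python
# Hypothetical sketch: variance-based target splitting + adversarial feature alignment.
# Names and thresholds are illustrative assumptions, not the paper's API.
import torch
import torch.nn as nn


class GradReverse(torch.autograd.Function):
    """Gradient reversal layer: identity in the forward pass, negated gradient backward."""
    @staticmethod
    def forward(ctx, x, lam):
        ctx.lam = lam
        return x.view_as(x)

    @staticmethod
    def backward(ctx, grad_output):
        return -ctx.lam * grad_output, None


class DomainDiscriminator(nn.Module):
    """Binary classifier distinguishing source-similar from source-dissimilar features."""
    def __init__(self, feat_dim: int = 256):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(feat_dim, 256),
            nn.ReLU(inplace=True),
            nn.Linear(256, 1),
        )

    def forward(self, feats: torch.Tensor, lam: float = 1.0) -> torch.Tensor:
        # Reversed gradients push the detector's features toward fooling the discriminator.
        return self.net(GradReverse.apply(feats, lam))


def detection_variance(box_scores: torch.Tensor) -> torch.Tensor:
    """Variance of per-box confidence scores for one image (shape: [num_boxes])."""
    return box_scores.var(unbiased=False)


def split_target(images, per_image_scores, threshold: float):
    """Split target images: higher detection variance -> treated as source-similar."""
    similar, dissimilar = [], []
    for img, scores in zip(images, per_image_scores):
        if detection_variance(scores) > threshold:
            similar.append(img)
        else:
            dissimilar.append(img)
    return similar, dissimilar
```

In such a setup, pooled detector features from the two subsets would be fed to the discriminator with opposite domain labels (e.g., under a binary cross-entropy loss) during mean-teacher training; the reversed gradient then encourages the detector to produce features that the discriminator cannot separate, which is the alignment effect the abstract describes.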

Publication Link: https://ojs.aaai.org/index.php/AAAI/article/view/25119