On Generalizing Beyond Domains in Cross-Domain Continual Learning

Publication Date: 6/19/2022

Event: CVPR’22

Reference: pp. 9255-9264, 2022

Authors: Christian Simon, NEC Laboratories America, Inc., The Australian National University, ACT, Australia; Masoud Faraki, NEC Laboratories America, Inc.; Yi-Hsuan Tsai, NEC Laboratories America, Inc.; Xiang Yu, NEC Laboratories America, Inc.; Samuel Schulter, NEC Laboratories America, Inc.; Yumin Suh, NEC Laboratories America, Inc.; Mehrtash Harandi, Monash University, VIC, Australia; Manmohan Chandraker, NEC Laboratories America, Inc.

Abstract: Humans have the ability to accumulate knowledge of new tasks in varying conditions, but deep neural networks often suffer from catastrophic forgetting of previously learned knowledge after learning a new task. Many recent methods focus on preventing catastrophic forgetting under the assumption that train and test data follow similar distributions. In this work, we consider a more realistic scenario of continual learning under domain shifts, where the model must generalize its inference to an unseen domain. To this end, we encourage learning semantically meaningful features by equipping the classifier with class similarity metrics as learning parameters, which are obtained through Mahalanobis similarity computations. Learning of the backbone representation along with these extra parameters is done seamlessly in an end-to-end manner. In addition, we propose an approach based on the exponential moving average of the parameters for better knowledge distillation. We demonstrate that, to a great extent, existing continual learning algorithms fail to handle the forgetting issue under multiple distributions, while our proposed approach learns new tasks under domain shift with accuracy boosts up to 10% on challenging datasets such as DomainNet and OfficeHome.
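The two mechanisms named in the abstract can be illustrated with a minimal sketch. This is not the authors' released code: the function names, the per-class centers, and the momentum value are all illustrative assumptions. It shows (1) scoring an embedding against class centers with a learnable Mahalanobis metric M = AᵀA, which is positive semi-definite by construction, and (2) an exponential moving average (EMA) of parameters of the kind that could serve as a stable teacher for knowledge distillation.

```python
import numpy as np

def mahalanobis_scores(x, centers, A):
    """Negative squared Mahalanobis distance from x to each class center.

    The metric M = A^T A is PSD by construction; in the paper's setting the
    matrix A would be learned end-to-end alongside the backbone, so the
    similarity itself is a trainable parameter of the classifier.
    """
    diffs = centers - x                  # (num_classes, dim) difference vectors
    proj = diffs @ A.T                   # project differences through A
    return -np.sum(proj ** 2, axis=1)    # higher score = more similar

def ema_update(teacher, student, momentum=0.99):
    """EMA of parameters: teacher <- m * teacher + (1 - m) * student."""
    return {k: momentum * teacher[k] + (1.0 - momentum) * student[k]
            for k in teacher}

# Toy usage: 3 classes in a 4-dimensional embedding space.
rng = np.random.default_rng(0)
dim, num_classes = 4, 3
centers = rng.normal(size=(num_classes, dim))
A = np.eye(dim)                          # identity metric = plain Euclidean
x = centers[1] + 0.01 * rng.normal(size=dim)  # embedding near class 1

scores = mahalanobis_scores(x, centers, A)
pred = int(np.argmax(scores))            # the nearby class should win

# EMA step on a one-parameter "network" for illustration.
student = {"A": A + 0.1}
teacher = {"A": A.copy()}
teacher = ema_update(teacher, student, momentum=0.9)
```

With the identity A this reduces to nearest-center classification under Euclidean distance; learning A reshapes the space so that semantically similar classes score alike, which is the property the abstract leverages for generalizing across domains.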

Publication Link: https://ieeexplore.ieee.org/document/9879030

Additional Publication Link: https://arxiv.org/pdf/2203.03970.pdf