Split to Learn: Gradient Split for Multi-Task Human Image Analysis

Publication Date: January 3, 2023

Event: WACV23

Reference: pp. 4340-4346, 2023

Authors: Weijian Deng, Australian National University, NEC Laboratories America, Inc.; Yumin Suh, NEC Laboratories America, Inc.; Xiang Yu, NEC Laboratories America, Inc.; Masoud Faraki, NEC Laboratories America, Inc.; Liang Zheng, Australian National University; Manmohan Chandraker, NEC Laboratories America, Inc., UC San Diego

Abstract: This paper presents an approach to train a unified deep network that simultaneously solves multiple human-related tasks. A multi-task framework is favorable for sharing information across tasks under restricted computational resources. However, tasks not only share information but may also compete for resources and conflict with each other, making the optimization of shared parameters difficult and leading to suboptimal performance. We propose a simple but effective training scheme called GradSplit that alleviates this issue by utilizing asymmetric inter-task relations. Specifically, at each convolution module, it splits features into T groups for T tasks and trains each group using only the gradients back-propagated from the task losses with which it does not conflict. During training, we apply GradSplit to a series of convolution modules. As a result, each module is trained to generate a set of task-specific features using the shared features from the previous module. This enables a network to use complementary information across tasks while circumventing gradient conflicts. Experimental results show that GradSplit achieves a better accuracy-efficiency trade-off than existing methods. It minimizes the accuracy drop caused by task conflicts while significantly saving compute resources, in terms of both FLOPs and memory, at inference. We further show that GradSplit achieves higher cross-dataset accuracy than single-task and other multi-task networks.
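To make the gradient-split mechanism described in the abstract concrete, the sketch below shows one plausible way to realize it in PyTorch: a convolution module whose output channels are split into T groups, where a group stays in the computation graph only for tasks that are allowed to update it and is detached otherwise. This is a minimal illustration, not the authors' implementation; the module name, the `allowed_tasks` mapping, and the layer choices are assumptions for the example.

```python
# Minimal sketch of the gradient-split idea (assumed implementation, not the
# authors' code). Each output-channel group is updated only by the losses of
# tasks listed in its allowed set; other tasks still read the group's features
# in the forward pass but send no gradient into it.
import torch
import torch.nn as nn


class GradSplitConv(nn.Module):
    def __init__(self, in_ch, out_ch, num_tasks, allowed_tasks):
        super().__init__()
        assert out_ch % num_tasks == 0, "channels must split evenly into T groups"
        self.group_ch = out_ch // num_tasks
        self.allowed_tasks = allowed_tasks  # list of sets: tasks allowed to update each group
        self.conv = nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1)
        self.bn = nn.BatchNorm2d(out_ch)
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x, task_id):
        feat = self.relu(self.bn(self.conv(x)))
        groups = torch.split(feat, self.group_ch, dim=1)
        out = [
            g if task_id in self.allowed_tasks[i] else g.detach()
            for i, g in enumerate(groups)
        ]
        # All groups remain visible to the next module (shared features), but
        # only non-conflicting groups receive gradients from this task's loss.
        return torch.cat(out, dim=1)
```

For example, with two tasks, `allowed_tasks = [{0}, {0, 1}]` would let task 0 update both channel groups while task 1 updates only the second, capturing an asymmetric inter-task relation of the kind the paper exploits.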

Publication Link: https://ieeexplore.ieee.org/document/10030388