Personalized Federated Learning via Heterogeneous Modular Networks

Publication Date: 12/3/2022

Event: IEEE ICDM 2022 – 22nd IEEE International Conference on Data Mining, Orlando, FL

Reference: pp. 1-6, 2022

Authors: Tianchun Wang, Pennsylvania State University; Wei Cheng, NEC Laboratories America, Inc.; Dongsheng Luo, Florida International University; Wenchao Yu, NEC Laboratories America, Inc.; Jingchao Ni, Amazon Web Services (AWS); Liang Tong, Stellar Cyber; Haifeng Chen, NEC Laboratories America, Inc.; Xiang Zhang, Pennsylvania State University

Abstract: Personalized Federated Learning (PFL), which collaboratively trains a federated model while accounting for local clients under privacy constraints, has attracted much attention. Despite its popularity, it has been observed that existing PFL approaches yield sub-optimal solutions when the joint distributions across local clients diverge. To address this issue, we present Federated Modular Network (FedMN), a novel PFL approach that adaptively selects sub-modules from a module pool to assemble heterogeneous neural architectures for different clients. FedMN adopts a lightweight routing hypernetwork to model the joint distribution on each client and produce a personalized selection of module blocks for each client. To reduce the communication burden of existing FL approaches, we develop an efficient way for the clients and the server to interact. We conduct extensive experiments on real-world test beds, and the results show both the effectiveness and efficiency of the proposed FedMN over the baselines.
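The abstract's core idea, a routing hypernetwork that gates a shared module pool into a client-specific architecture, can be illustrated with a minimal NumPy sketch. All names, shapes, and the sigmoid-threshold gating rule here are illustrative assumptions, not the paper's actual implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical shared module pool: each "module" is just a linear map here.
# In FedMN the pool holds trainable neural sub-modules; this is a stand-in.
DIM = 8
module_pool = [rng.standard_normal((DIM, DIM)) * 0.1 for _ in range(4)]

def routing_hypernetwork(client_embedding, W_r):
    # Stand-in for the lightweight routing hypernetwork: maps a client
    # representation to per-module selection scores in (0, 1) via a sigmoid.
    logits = W_r @ client_embedding
    return 1.0 / (1.0 + np.exp(-logits))

def assemble_and_forward(x, client_embedding, W_r, threshold=0.5):
    # Keep only modules whose routing score exceeds the threshold, giving
    # each client its own (heterogeneous) sub-network, then run a forward pass.
    scores = routing_hypernetwork(client_embedding, W_r)
    selected = [m for m, s in zip(module_pool, scores) if s > threshold]
    h = x
    for W in selected:
        h = np.tanh(W @ h)
    return h, scores

W_r = rng.standard_normal((len(module_pool), DIM))  # routing parameters (assumed)
client_emb = rng.standard_normal(DIM)               # per-client embedding (assumed)
x = rng.standard_normal(DIM)
out, scores = assemble_and_forward(x, client_emb, W_r)
```

Under this sketch, only the selected modules need to be exchanged with the server for a given client, which hints at how module-wise selection can cut communication relative to shipping a full monolithic model.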

Publication Link: