Beyond the Permutation Symmetry of Transformers: The Role of Rotation for Model Fusion

Publication Date: 7/13/2025

Event: Forty-Second International Conference on Machine Learning (ICML 2025)

Reference: pp. 1-17, 2025

Authors: Binchi Zhang, University of Virginia; Zaiyi Zheng, University of Virginia; Zhengzhang Chen, NEC Laboratories America, Inc.; Jundong Li, University of Virginia

Abstract: Symmetry in the parameter space of deep neural networks (DNNs) has proven beneficial for various deep learning applications. A well-known example is the permutation symmetry in Multi-Layer Perceptrons (MLPs), where permuting the rows of weight matrices in one layer and applying the inverse permutation to adjacent layers yields a functionally equivalent model. While permutation symmetry fully characterizes the equivalence set for MLPs, its discrete nature limits its utility for transformers. In this paper, we introduce rotation symmetry, a novel form of parameter space symmetry for transformers that generalizes permutation symmetry by rotating parameter matrices in self-attention layers. Unlike permutation symmetry, rotation symmetry operates in a continuous domain, thereby significantly expanding the equivalence set for transformers. Based on this property, we propose a theoretically optimal parameter matching algorithm as a plug-and-play module to enhance model fusion. We evaluate our approach using pre-trained transformers across diverse natural language and vision tasks. Experimental results demonstrate that our rotation symmetry-based matching algorithm substantially improves model fusion, highlighting the potential of parameter space symmetry to facilitate model fusion. Our code is available at https://github.com/zhengzaiyi/RotationSymmetry.
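
As a minimal illustration of the rotation symmetry described in the abstract (a sketch under simplifying assumptions, not the paper's implementation; the names `W_q`, `W_k`, and `attention_scores` are hypothetical), the snippet below checks numerically that right-multiplying the query and key projection matrices of a toy self-attention layer by the same orthogonal matrix leaves the attention scores unchanged, since Q R (K R)^T = Q R R^T K^T = Q K^T.

```python
import numpy as np

rng = np.random.default_rng(0)
d_model, d_head, n_tokens = 16, 8, 4

# Toy self-attention projection weights and a toy input sequence.
W_q = rng.standard_normal((d_model, d_head))
W_k = rng.standard_normal((d_model, d_head))
X = rng.standard_normal((n_tokens, d_model))

def attention_scores(X, W_q, W_k):
    """Unnormalized attention logits Q K^T / sqrt(d_head)."""
    Q, K = X @ W_q, X @ W_k
    return (Q @ K.T) / np.sqrt(d_head)

# Random orthogonal matrix R (R @ R.T = I) via QR decomposition.
R, _ = np.linalg.qr(rng.standard_normal((d_head, d_head)))

# Rotate the query/key projections: W_q -> W_q R, W_k -> W_k R.
scores_original = attention_scores(X, W_q, W_k)
scores_rotated = attention_scores(X, W_q @ R, W_k @ R)

# The rotations cancel inside the score computation, so the model is functionally equivalent.
assert np.allclose(scores_original, scores_rotated)
```

Because R can be any orthogonal matrix rather than a permutation, this equivalence set is continuous, which is the property the paper's matching algorithm exploits for model fusion.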

Publication Link: https://icml.cc/virtual/2025/poster/43634