Position Really Matters: Towards a Holistic Approach for Prompt Tuning
Publication Date: 4/30/2025
Event: 2025 Annual Conference of the Nations of the Americas Chapter of the Association for Computational Linguistics (NAACL 2025)
Reference: pp. 1-23, 2025
Authors: Xianjun Yang, University of California, Santa Barbara; Wei Cheng, NEC Laboratories America, Inc.; Xujiang Zhao, NEC Laboratories America, Inc.; Wenchao Yu, NEC Laboratories America, Inc.; Linda Ruth Petzold, University of California, Santa Barbara; Haifeng Chen, NEC Laboratories America, Inc.
Abstract: Prompt tuning is highly effective at efficiently extracting knowledge from foundation models, spanning language, vision, and vision-language models. However, it remains unclear whether concatenating fixed soft prompts at a predetermined position with the inputs of all instances, irrespective of their inherent differences, is effective. Variables such as the position, length, and representations of prompts across diverse instances and tasks can substantially influence the performance of prompt tuning. We first provide a theoretical analysis revealing that optimizing the position of the prompt so that it can be inserted within the input captures additional semantic information that traditional prefix or postfix prompt tuning fails to capture. We then present a holistic parametric prompt tuning strategy that dynamically determines these factors of prompts based on the specific task or instance. Experimental results underscore the significant performance improvements achieved by dynamic prompt tuning across a wide range of tasks, including NLP, vision recognition, and vision-language tasks. Furthermore, we establish the universal applicability of our approach in full-data, few-shot, and multitask settings.
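To make the position idea concrete, here is a minimal sketch (not the authors' implementation; the function name and toy embeddings are illustrative assumptions) of how trainable soft-prompt vectors could be inserted at a per-instance position inside an input embedding sequence, rather than only as a prefix or postfix:

```python
def insert_soft_prompt(input_embeds, prompt_embeds, position):
    """Return the sequence with prompt vectors inserted at `position`.

    input_embeds:  list of token-embedding vectors for one instance
    prompt_embeds: list of trainable soft-prompt vectors
    position:      integer in 0..len(input_embeds);
                   0 = prefix tuning, len(input_embeds) = postfix tuning,
                   anything in between splits the input around the prompt
    """
    if not 0 <= position <= len(input_embeds):
        raise ValueError("position out of range")
    return input_embeds[:position] + prompt_embeds + input_embeds[position:]

# Toy example with 1-D "embeddings" for readability.
tokens = [[1.0], [2.0], [3.0], [4.0]]
prompt = [[9.0], [9.5]]

prefix = insert_soft_prompt(tokens, prompt, 0)            # classic prefix tuning
postfix = insert_soft_prompt(tokens, prompt, len(tokens)) # postfix tuning
middle = insert_soft_prompt(tokens, prompt, 2)            # prompt inside the input
```

A dynamic scheme would choose `position` (and, per the paper's framing, prompt length and representation) per task or instance instead of fixing it globally.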
Publication Link: