AdvFusion: Adapter-based Knowledge Transfer for Code Summarization on Code Language Models
Programming languages can benefit from one another when a pre-trained model is used for software engineering tasks such as code summarization and method name prediction. While full fine-tuning of Code Language Models (Code-LMs) has been explored for multilingual knowledge transfer, research on Parameter-Efficient Fine-Tuning (PEFT) for this purpose is limited. AdapterFusion, a PEFT architecture, aims to enhance task performance by leveraging information from multiple languages, but in practice it attends primarily to the target language.
To address this, we propose AdvFusion, a novel PEFT-based approach that effectively learns from other languages before adapting to the target task. Evaluated on code summarization and method name prediction, AdvFusion outperforms AdapterFusion by up to 1.7 points and surpasses LoRA with gains of 1.99, 1.26, and 2.16 points for Ruby, JavaScript, and Go, respectively. We open-source our scripts for replication purposes.
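To make the fusion idea in the abstract concrete, below is a minimal sketch of an AdapterFusion-style mechanism: per-language adapter outputs are combined via an attention distribution computed against the layer's hidden state. The function name, matrix names, and shapes are illustrative assumptions for exposition, not the authors' implementation.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax over the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fuse_adapters(hidden, adapter_outputs, W_q, W_k, W_v):
    """Attention-based fusion of per-language adapter outputs (illustrative).

    hidden:          (d,)          hidden state of the layer (query source)
    adapter_outputs: (n_langs, d)  outputs of the frozen language adapters
    W_q, W_k, W_v:   (d, d)        learned fusion projections (assumed names)
    Returns the fused representation (d,) and the attention weights (n_langs,).
    """
    q = hidden @ W_q                              # (d,)
    k = adapter_outputs @ W_k                     # (n_langs, d)
    v = adapter_outputs @ W_v                     # (n_langs, d)
    weights = softmax(k @ q / np.sqrt(len(q)))    # (n_langs,) attention over languages
    return weights @ v, weights
```

In AdapterFusion, these weights tend to concentrate on the target-language adapter; AdvFusion's contribution, per the abstract, is to force learning from the other languages' adapters before adapting to the target task.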
Author affiliation: College of Science & Engineering, Comp' & Math' Sciences
Source: SANER 2025, Tue 4 - Fri 7 March 2025, Montréal, Québec, Canada
Version: AM (Accepted Manuscript)