
AdvFusion: Adapter-based Knowledge Transfer for Code Summarization on Code Language Models

Conference contribution
Posted on 2025-03-26, 12:30, authored by Fuxiang Chen, Iman Saberi, Amirreza Esmaeili, and Fatemeh Fard

Programming languages can benefit from one another by utilizing a pre-trained model for software engineering tasks such as code summarization and method name prediction. While full fine-tuning of Code Language Models (Code-LMs) has been explored for multilingual knowledge transfer, research on Parameter-Efficient Fine-Tuning (PEFT) for this purpose is limited. AdapterFusion, a PEFT architecture, aims to enhance task performance by leveraging information from multiple languages but primarily focuses on the target language.

To address this, we propose AdvFusion, a novel PEFT-based approach that effectively learns from other languages before adapting to the target task. Evaluated on code summarization and method name prediction, AdvFusion outperforms AdapterFusion by up to 1.7 points and surpasses LoRA with gains of 1.99, 1.26, and 2.16 for Ruby, JavaScript, and Go, respectively. We open-source our scripts for replication purposes.
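
The record itself carries no code, but the mechanism the abstract builds on is straightforward to illustrate. Below is a minimal, illustrative PyTorch sketch of an AdapterFusion-style layer: each per-language adapter proposes a candidate representation, and a learned attention over those candidates mixes them into the hidden state. All class names, dimensions, and hyperparameters here are assumptions for illustration, not the authors' released implementation; per the abstract, AdvFusion's contribution lies in how such a layer is trained, so that the non-target languages' adapters are learned from before adapting to the target task.

    import torch
    import torch.nn as nn

    class BottleneckAdapter(nn.Module):
        # Standard bottleneck adapter: down-project, nonlinearity, up-project,
        # added back to the hidden state through a residual connection.
        # (Illustrative sketch, not the authors' code.)
        def __init__(self, hidden: int, bottleneck: int = 64):
            super().__init__()
            self.down = nn.Linear(hidden, bottleneck)
            self.up = nn.Linear(bottleneck, hidden)
            self.act = nn.ReLU()

        def forward(self, h: torch.Tensor) -> torch.Tensor:
            return h + self.up(self.act(self.down(h)))

    class AdapterFusionLayer(nn.Module):
        # Attention over the outputs of several language adapters: the hidden
        # state forms the query; each adapter's output forms a key/value pair.
        def __init__(self, hidden: int, num_adapters: int):
            super().__init__()
            self.adapters = nn.ModuleList(
                BottleneckAdapter(hidden) for _ in range(num_adapters))
            self.query = nn.Linear(hidden, hidden)
            self.key = nn.Linear(hidden, hidden)
            self.value = nn.Linear(hidden, hidden)

        def forward(self, h: torch.Tensor) -> torch.Tensor:
            # h: (batch, seq, hidden); candidates: (batch, seq, n_adapters, hidden)
            cands = torch.stack([a(h) for a in self.adapters], dim=2)
            q = self.query(h).unsqueeze(2)          # (batch, seq, 1, hidden)
            k, v = self.key(cands), self.value(cands)
            scores = (q * k).sum(-1, keepdim=True) / h.size(-1) ** 0.5
            attn = torch.softmax(scores, dim=2)     # weights over the adapters
            return h + (attn * v).sum(dim=2)        # fused residual update

    # Example: fuse six per-language adapters over a 768-dim hidden state.
    fusion = AdapterFusionLayer(hidden=768, num_adapters=6)
    out = fusion(torch.randn(2, 16, 768))           # -> (2, 16, 768)

In the standard AdapterFusion recipe, the per-language adapters are trained first and frozen, and only the fusion parameters are updated on the target task; the exact training procedure AdvFusion substitutes is detailed in the paper and the released scripts.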

History

Author affiliation

College of Science & Engineering, Computing & Mathematical Sciences

Source

SANER 2025, 4-7 March 2025, Montréal, Québec, Canada

Version

  • AM (Accepted Manuscript)

Published in

IEEE conference proceedings, SANER 2025

Publisher

IEEE

Copyright date

2025

Available date

2025-03-26

Language

en

Deposited by

Dr Fuxiang Chen

Deposit date

2025-03-25
