University of Leicester

AdvFusion: Adapter-based Knowledge Transfer for Code Summarization on Code Language Models

Version 2: 2025-09-02, 15:25
Version 1: 2025-03-26, 12:30
Conference contribution
Posted on 2025-03-26, 12:30, authored by Fuxiang Chen, Iman Saberi, Amirreza Esmaeili, Fatemeh Fard
Programming languages can benefit from one another by utilizing a pre-trained model for software engineering tasks such as code summarization and method name prediction. While full fine-tuning of Code Language Models (Code-LMs) has been explored for multilingual knowledge transfer, research on Parameter Efficient Fine-Tuning (PEFT) for this purpose is limited. AdapterFusion, a PEFT architecture, aims to enhance task performance by leveraging information from multiple languages but primarily focuses on the target language.

To address this, we propose AdvFusion, a novel PEFT-based approach that effectively learns from other languages before adapting to the target task. Evaluated on code summarization and method name prediction, AdvFusion outperforms AdapterFusion by up to 1.7 points and surpasses LoRA with gains of 1.99, 1.26, and 2.16 for Ruby, JavaScript, and Go, respectively. We open-source our scripts for replication purposes.
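The abstract describes AdvFusion only at a high level. As a rough illustration of the adapter-fusion mechanism it builds on, the sketch below shows attention-based fusion over per-language bottleneck adapters in PyTorch. All class names, dimensions, and the three-language setup are illustrative assumptions, not the authors' released code.

```python
# Minimal sketch of attention-based adapter fusion over per-language
# bottleneck adapters attached to a (frozen) Code-LM layer.
# Hypothetical names and sizes; not the AdvFusion reference implementation.
import torch
import torch.nn as nn


class Adapter(nn.Module):
    """Bottleneck adapter: down-project, non-linearity, up-project, residual."""

    def __init__(self, hidden: int = 768, bottleneck: int = 64):
        super().__init__()
        self.down = nn.Linear(hidden, bottleneck)
        self.up = nn.Linear(bottleneck, hidden)
        self.act = nn.GELU()

    def forward(self, h: torch.Tensor) -> torch.Tensor:
        return h + self.up(self.act(self.down(h)))


class FusionLayer(nn.Module):
    """Attention over the outputs of several language adapters: the query
    comes from the layer's hidden state, keys/values from each adapter's
    output, so each token can weight the source languages differently."""

    def __init__(self, hidden: int):
        super().__init__()
        self.query = nn.Linear(hidden, hidden)
        self.key = nn.Linear(hidden, hidden)
        self.value = nn.Linear(hidden, hidden)

    def forward(self, h: torch.Tensor, adapter_outs: torch.Tensor) -> torch.Tensor:
        # adapter_outs: (batch, seq, n_adapters, hidden)
        q = self.query(h).unsqueeze(2)                # (B, S, 1, H)
        k = self.key(adapter_outs)                    # (B, S, N, H)
        v = self.value(adapter_outs)                  # (B, S, N, H)
        scores = (q * k).sum(-1) / k.size(-1) ** 0.5  # (B, S, N)
        weights = scores.softmax(dim=-1).unsqueeze(-1)
        return h + (weights * v).sum(dim=2)           # (B, S, H)


# Fuse three hypothetical source-language adapters (e.g. Python, Java, PHP)
# for a low-resource target such as Ruby.
hidden = 768
adapters = nn.ModuleList(Adapter(hidden) for _ in range(3))
fusion = FusionLayer(hidden)
h = torch.randn(2, 16, hidden)                        # dummy hidden states
outs = torch.stack([a(h) for a in adapters], dim=2)   # (B, S, 3, H)
print(fusion(h, outs).shape)                          # torch.Size([2, 16, 768])
```

In standard AdapterFusion the fusion weights are trained only on target-language data; the paper's stated contribution is to have the fusion component learn from the other languages' data before adapting to the target task.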

History

Related Materials

Author affiliation

College of Science & Engineering, Computing & Mathematical Sciences

Source

SANER 2025, 4-7 March 2025, Montréal, Québec, Canada

Version

  • AM (Accepted Manuscript)

Published in

Proceedings of the IEEE International Conference on Software Analysis, Evolution and Reengineering (SANER 2025)

Publisher

IEEE

Copyright date

2025

Available date

2025-03-26

Publisher DOI

Notes

DOP

Language

en

Deposited by

Dr Fuxiang Chen

Deposit date

2025-03-25
