OPT-CO: Optimizing pre-trained transformer models for efficient COVID-19 classification with stochastic configuration networks
Building upon pre-trained ViT models, many advanced methods have achieved significant success in COVID-19 classification. Many scholars pursue better performance by increasing model complexity and parameter counts. While such methods can enhance performance, they also require extensive computational resources and extended training times. Additionally, the persistent challenge of overfitting, due to the limited size of COVID-19 datasets, remains a hurdle. To address these challenges, we propose a novel method to optimize pre-trained transformer models for efficient COVID-19 classification with stochastic configuration networks (SCNs), referred to as OPT-CO. We propose two optimization methods: sequential optimization (SeOp) and parallel optimization (PaOp), which incorporate optimizers in a sequential and a parallel manner, respectively. Our method enhances model performance without necessitating a significant expansion in parameters. Additionally, we introduce OPT-CO-SCN, which avoids overfitting through the adoption of random projection for head augmentation. Experiments were carried out to evaluate the proposed model on two publicly available datasets. The evaluation results show that our method achieves superior performance, surpassing other state-of-the-art methods.
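As a rough illustration of the two schemes named above, the sketch below shows one plausible reading of SeOp and PaOp in PyTorch, where two standard optimizers share the same model parameters: SeOp lets each optimizer take its own step with gradients recomputed in between, while PaOp computes gradients once and lets every optimizer update from them. The function names, optimizer choices, and stand-in model are hypothetical; the paper's exact formulation may differ.

```python
import torch
import torch.nn as nn

def seop_step(model, loss_fn, inputs, targets, optimizers):
    """Sequential optimization (SeOp): each optimizer takes its own step,
    with the loss and gradients recomputed between steps."""
    for opt in optimizers:
        opt.zero_grad()
        loss = loss_fn(model(inputs), targets)
        loss.backward()
        opt.step()

def paop_step(model, loss_fn, inputs, targets, optimizers):
    """Parallel optimization (PaOp): gradients are computed once, and every
    optimizer applies its update rule to that same gradient."""
    for opt in optimizers:
        opt.zero_grad()
    loss = loss_fn(model(inputs), targets)
    loss.backward()
    for opt in optimizers:
        opt.step()

if __name__ == "__main__":
    # Stand-in for a pre-trained ViT backbone: a single linear classifier.
    model = nn.Sequential(nn.Flatten(), nn.Linear(16, 3))
    loss_fn = nn.CrossEntropyLoss()
    optimizers = [torch.optim.AdamW(model.parameters(), lr=1e-4),
                  torch.optim.SGD(model.parameters(), lr=1e-3, momentum=0.9)]
    x, y = torch.randn(8, 16), torch.randint(0, 3, (8,))
    seop_step(model, loss_fn, x, y, optimizers)
    paop_step(model, loss_fn, x, y, optimizers)
```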
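Similarly, "random projection for head augmentation" suggests an SCN-style head in which randomly initialized, frozen weights expand the backbone features while only a small classifier on top is trained, limiting the number of fitted parameters on small datasets. The following is a minimal sketch under that assumption; the class name, dimensions, and activation are illustrative, not taken from the paper.

```python
import torch
import torch.nn as nn

class RandomProjectionHead(nn.Module):
    """Hypothetical SCN-style head: a frozen random projection expands the
    backbone's [CLS] features; only the final linear classifier is trained."""

    def __init__(self, in_dim: int = 768, proj_dim: int = 2048, num_classes: int = 3):
        super().__init__()
        self.proj = nn.Linear(in_dim, proj_dim)
        for p in self.proj.parameters():
            p.requires_grad = False  # random weights and biases stay fixed
        self.act = nn.Tanh()
        self.classifier = nn.Linear(proj_dim, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.act(self.proj(x)))

if __name__ == "__main__":
    head = RandomProjectionHead()
    logits = head(torch.randn(4, 768))  # e.g., ViT-B/16 [CLS] features are 768-wide
    print(logits.shape)  # torch.Size([4, 3])
```

Such a head could replace the classification layer of a pre-trained ViT (for instance, assigning it to `vit.heads` on a torchvision ViT-B/16), keeping the randomly projected features fixed during fine-tuning.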
Funding
Self-learning AI-based digital twins for accelerating clinical care in respiratory emergency admissions (SLAIDER)
Engineering and Physical Sciences Research Council
Author affiliation
College of Science & Engineering, Comp' & Math' Sciences
Version
- VoR (Version of Record)