Deep Content and Contrastive Perception Learning for Automatic Fetal Nuchal Translucency Image Quality Assessment
Automatic quality assessment of fetal nuchal translucency ultrasound images can assist physicians in obtaining standard planes and improve the reproducibility of nuchal translucency screening. At present, there are no dedicated studies or methods for the quality assessment of fetal nuchal translucency ultrasound images. The main challenges of this task are low image quality, content identification of structural integrity and relative positional relationships, and the time required for data collection and fine-grained annotation. To address these challenges, we propose a framework based on the DenseNet model, which includes a preprocessing module, a content perception module, an attention learning module, and a contrastive regularization module. Experiments show that these modules effectively improve the performance of the quality assessment framework, which also outperforms fourteen other deep learning models. The framework can provide the sonographer with an interpretable model reference map. Bland–Altman analysis further verifies the consistency between the results of the automatic quality assessment framework and the manually annotated clinical dataset. Therefore, the proposed quality assessment framework for fetal nuchal translucency ultrasound images has promise and value for clinical application.
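To make the described architecture concrete, the following is a minimal sketch, not the authors' implementation: it assumes a PyTorch DenseNet-121 backbone, a simple spatial-attention head producing the interpretable reference map, a quality-score head, and an InfoNCE-style contrastive regularizer on embeddings of paired views. All module names, loss weights, and hyperparameters here are illustrative assumptions.

```python
# Illustrative sketch only: a DenseNet backbone with attention-weighted pooling
# for plane quality scoring, plus a contrastive regularization term.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision.models import densenet121


class NTQualityNet(nn.Module):
    """DenseNet features -> spatial attention -> quality score + embedding."""

    def __init__(self, embed_dim: int = 128):
        super().__init__()
        self.backbone = densenet121(weights=None).features   # (B, 1024, H/32, W/32)
        self.attention = nn.Sequential(                       # 1-channel spatial attention map
            nn.Conv2d(1024, 1, kernel_size=1),
            nn.Sigmoid(),
        )
        self.embed = nn.Linear(1024, embed_dim)               # embedding for contrastive term
        self.score = nn.Linear(1024, 1)                       # quality-score head

    def forward(self, x):
        feat = self.backbone(x)                               # dense feature maps
        attn = self.attention(feat)                           # interpretable reference map
        pooled = (feat * attn).mean(dim=(2, 3))               # attention-weighted pooling
        return (self.score(pooled).squeeze(-1),
                F.normalize(self.embed(pooled), dim=-1),
                attn)


def contrastive_regularizer(z1, z2, temperature: float = 0.1):
    """InfoNCE-style loss pulling paired views together (assumed form)."""
    logits = z1 @ z2.t() / temperature
    targets = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, targets)


if __name__ == "__main__":
    model = NTQualityNet()
    x1 = torch.randn(4, 3, 224, 224)      # two augmented views of preprocessed NT images
    x2 = torch.randn(4, 3, 224, 224)
    labels = torch.rand(4)                # annotated quality scores scaled to [0, 1]
    s1, z1, attn = model(x1)
    _, z2, _ = model(x2)
    loss = F.mse_loss(torch.sigmoid(s1), labels) + 0.1 * contrastive_regularizer(z1, z2)
    print(loss.item(), attn.shape)
```

In this sketch the attention map `attn` plays the role of the reference map shown to the sonographer, and the contrastive term regularizes the embedding space across augmented views of the same image.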
Funding
This work is supported by the National Natural Science Foundation of China under Grant 61976120, the Nantong Science and Technology Program Project MS2023060, the Nantong Institute of Technology Foundation 2023XK(B)03, and the Jiangsu Government Scholarship.
Author affiliation
College of Science & Engineering: Computer & Mathematical Sciences
Version
- AM (Accepted Manuscript)