Improved leaf area index reconstruction in heavily cloudy areas: A novel deep learning approach for SAR-Optical fusion integrating spatiotemporal features
Posted on 2025-08-11, 10:39. Authored by Mingqi Li, Pengxin Wang, Kevin Tansey, Fengwei Guo, Ji Zhou
The Leaf Area Index (LAI) is an essential parameter for assessing vegetation growth, but LAI derived from optical data can suffer from gaps caused by cloud cover. Synthetic Aperture Radar (SAR), with its all-weather observation capability, offers a way to fill these gaps. This study proposes a new deep learning approach for reconstructing time-series LAI from SAR and optical data in two steps. First, a two-dimensional Convolutional Neural Network-Transformer (2D CNN-Transformer) is applied to bridge SAR and optical data. Second, the 2D CNN-Transformer-predicted LAI and the Sentinel-2 LAI are input into the Enhanced Deep Convolutional Model for Spatiotemporal Image Fusion (EDCSTFN) to further improve accuracy. The novelty lies in a two-step framework that combines a 2D CNN-Transformer for spatiotemporal feature extraction with a deep learning fusion algorithm that refines the LAI reconstruction. Results showed that the 2D CNN-Transformer achieved higher accuracy (R² = 0.64, RMSE = 0.38 m²/m²) in establishing a relationship between SAR and optical data than 1D CNN, 2D CNN-LSTM, and 1D CNN-Transformer models. In the second step, the EDCSTFN-reconstructed LAI achieved its highest accuracy of R² = 0.81 and RMSE = 0.22 m²/m², with an average R² of 0.61 and RMSE of 0.37 m²/m² across millions of cropland and forest pixels, improving on the first-step results. The approach effectively fills gaps in spatial detail and yields a more continuous spatial distribution. It demonstrates good generalizability across millions of pixels under frequent cloud cover and complex surface conditions, and provides a new strategy for the fusion of optical and SAR data.
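To make the first step concrete, below is a minimal PyTorch sketch of the kind of 2D CNN-Transformer the abstract describes: a shared 2D CNN extracts spatial features from each date's SAR patch (assumed here to be two-band VV/VH backscatter), the per-date feature vectors form a temporal sequence for a Transformer encoder, and a regression head predicts LAI per date. All layer sizes, the patch size, the input bands, and the per-date pooling are illustrative assumptions, not the authors' exact architecture.

import torch
import torch.nn as nn

class CNNTransformerLAI(nn.Module):
    """Sketch of a 2D CNN-Transformer mapping SAR time series to LAI (assumed sizes)."""
    def __init__(self, in_channels=2, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        # Shared 2D CNN: spatial features from each date's SAR patch
        self.cnn = nn.Sequential(
            nn.Conv2d(in_channels, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv2d(32, d_model, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),  # collapse space: one feature vector per date
        )
        # Transformer encoder: temporal dependencies across acquisition dates
        enc_layer = nn.TransformerEncoderLayer(
            d_model=d_model, nhead=n_heads, batch_first=True
        )
        self.temporal = nn.TransformerEncoder(enc_layer, num_layers=n_layers)
        self.head = nn.Linear(d_model, 1)  # regress LAI at each date

    def forward(self, x):
        # x: (batch, time, channels, height, width)
        b, t, c, h, w = x.shape
        feats = self.cnn(x.reshape(b * t, c, h, w)).flatten(1)  # (b*t, d_model)
        feats = feats.reshape(b, t, -1)                          # (b, t, d_model)
        feats = self.temporal(feats)                             # temporal attention
        return self.head(feats).squeeze(-1)                      # (b, t) LAI series

# Usage: 8 dates of 2-band SAR patches (VV/VH), 9x9 pixels (hypothetical shapes)
model = CNNTransformerLAI()
lai = model(torch.randn(4, 8, 2, 9, 9))  # -> (4, 8) predicted LAI time series

In the paper's second step, such first-step LAI predictions together with Sentinel-2 LAI would then feed the EDCSTFN spatiotemporal fusion model; that model is published separately and is not sketched here.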
Funding
UK Research and Innovation (UKRI) funding from a Science and Technology Facilities Council grant administered through Rothamsted Research under Grant SM008 CAU
National Natural Science Foundation of China under Grant U23A2018
Royal Society-Newton Mobility grant (UK)
History
Author affiliation
College of Science & Engineering
Geography, Geology & Environment
Version
VoR (Version of Record)
Published in
International Journal of Applied Earth Observation and Geoinformation