Abstract: Lodging is one of the main factors affecting the yield and quality of wheat. Timely and accurate acquisition of wheat lodging information benefits both the breeding of improved varieties and the assessment of lodging losses in agricultural insurance. A multi-growth-stage wheat lodging dataset was constructed from visible-light UAV remote sensing images acquired at two growth stages of wheat: the grain-filling stage and the mature stage. By adding different attention modules to the DeepLab v3+ model and comparing them, a DeepLab v3+ wheat lodging detection model based on multi-head self-attention was proposed to accurately detect lodging areas during wheat growth. The experimental results showed that the mPA and mIoU of the proposed multi-head self-attention DeepLab v3+ model reached 93.09% and 87.54% at the grain-filling stage and 93.36% and 87.49% at the mature stage. Compared with the representative SegNet, PSPNet, and DeepLab v3+ models, respectively, its mPA improved by 25.45, 7.54, and 1.82 percentage points and its mIoU by 36.15, 11.37, and 2.49 percentage points at the grain-filling stage, while at the mature stage its mPA improved by 15.05, 6.32, and 0.74 percentage points and its mIoU by 23.36, 9.82, and 0.95 percentage points. Second, compared with the CBAM and SimAM attention modules, DeepLab v3+ with multi-head self-attention performed best at both stages: over CBAM and SimAM, respectively, its mPA was higher by 1.6 and 2.07 percentage points and its mIoU by 1.7 and 2.45 percentage points at the grain-filling stage, and its mPA by 0.27 and 0.11 percentage points and its mIoU by 0.26 and 0.15 percentage points at the mature stage. The results showed that the improved DeepLab v3+ model effectively captured the lodging features in UAV remote sensing images of wheat at the grain-filling and mature stages, precisely identified lodging areas at different growth stages, and had good applicability.
It provides a reference for grading wheat lodging disasters and breeding improved varieties using UAV remote sensing technology.
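The abstract does not give implementation details of the attention module, so the following is only a minimal, illustrative sketch of the core idea: flattening a CNN encoder feature map (such as the DeepLab v3+ backbone output) into spatial tokens and refining it with scaled dot-product multi-head self-attention. The function name, random projection weights, and feature-map size are all hypothetical, not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(feat, num_heads=4, seed=0):
    """Toy multi-head self-attention over a (H, W, C) feature map.

    Each spatial position becomes one token and attends to every other
    position, which is how self-attention can capture long-range lodging
    patterns that local convolutions miss. Projection weights are random
    here purely for illustration; a real model would learn them.
    """
    H, W, C = feat.shape
    assert C % num_heads == 0, "channels must split evenly across heads"
    d = C // num_heads
    rng = np.random.default_rng(seed)
    Wq, Wk, Wv = (rng.standard_normal((C, C)) / np.sqrt(C) for _ in range(3))

    x = feat.reshape(H * W, C)                      # flatten grid to tokens
    q = (x @ Wq).reshape(H * W, num_heads, d)
    k = (x @ Wk).reshape(H * W, num_heads, d)
    v = (x @ Wv).reshape(H * W, num_heads, d)

    out = np.empty_like(q)
    for h in range(num_heads):                      # scaled dot-product per head
        attn = softmax(q[:, h] @ k[:, h].T / np.sqrt(d))
        out[:, h] = attn @ v[:, h]
    return out.reshape(H, W, C)                     # back to a spatial map

# Hypothetical 8x8x32 encoder feature map; output keeps the same shape,
# so the block can be dropped into an encoder-decoder without reshaping.
fmap = np.random.default_rng(1).standard_normal((8, 8, 32))
refined = multi_head_self_attention(fmap)
print(refined.shape)  # (8, 8, 32)
```

Because the output shape matches the input, such a module can be inserted between the encoder and decoder of a segmentation network without changing the rest of the architecture, which is the usual way attention blocks are grafted onto DeepLab-style models.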