
Extraction of Lodging Area of Wheat Varieties by Unmanned Aerial Vehicle Remote Sensing Based on Deep Learning

Authors: SHEN Hualei, SU Xinqi, ZHAO Qiaoli, ZHOU Meng, LIU Dong, ZANG Hecang

Fund projects: Science and Technology Research Project of Henan Province (212102110253, 222102110244), National Natural Science Foundation of China (62072160), Science and Technology Innovation Leading Talent Cultivation Project of the Institute of Agricultural Economics and Information, Henan Academy of Agricultural Sciences (2022KJCX02), and Science and Technology Innovation Team Project of Henan Academy of Agricultural Sciences (2022TD14)

Abstract:

To extract wheat lodging area in a timely and accurate manner, a lodging-area segmentation model fusing multi-scale features, Attention_U2-Net, was proposed. Built on the U2-Net architecture, the model replaces the dilated convolutions with large dilation rates by a non-local attention mechanism, which enlarges the receptive field of the high-level layers and improves recognition accuracy for ground objects of different sizes; a channel attention mechanism is introduced to improve the skip-connection concatenation and further raise segmentation accuracy; and a multi-level joint weighted loss function is constructed to balance hard and easy samples and to counter the imbalance between positive and negative pixels. Using a patch-based (cropping) pipeline on a self-built dataset, Attention_U2-Net extracted wheat lodging area with a precision of 86.53%, a recall of 89.42% and an F1 score of 87.95%. Compared with FastFCN, U-Net, U2-Net, FCN, SegNet and DeepLabv3, Attention_U2-Net achieved the highest F1 score. Compared with the annotated area, the area extracted by Attention_U2-Net with the cropping pipeline was the closest, reaching a lodging-area accuracy of 97.25% with the smallest falsely detected area among all models. The results show that Attention_U2-Net is robust and accurate for wheat lodging-area extraction and can serve as a reference for UAV remote sensing of wheat-affected area and loss assessment.
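The paper's implementation is not reproduced here, but the two attention components named in the abstract can be sketched roughly as follows. This is a minimal PyTorch sketch under stated assumptions: the module names (NonLocalBlock, ChannelAttention, AttentionSkipFusion), channel sizes and exact placement inside the U2-Net encoder-decoder are illustrative guesses, not the authors' code.

```python
# Minimal PyTorch sketch of the two attention components described in the
# abstract; names, channel sizes and placement are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class NonLocalBlock(nn.Module):
    """Non-local (self-attention) block: every position attends to all
    others, giving a global receptive field without large dilation rates."""
    def __init__(self, channels, reduction=2):
        super().__init__()
        inter = max(channels // reduction, 1)
        self.theta = nn.Conv2d(channels, inter, 1)
        self.phi = nn.Conv2d(channels, inter, 1)
        self.g = nn.Conv2d(channels, inter, 1)
        self.out = nn.Conv2d(inter, channels, 1)

    def forward(self, x):
        b, c, h, w = x.shape
        q = self.theta(x).flatten(2).transpose(1, 2)   # B x HW x C'
        k = self.phi(x).flatten(2)                     # B x C' x HW
        v = self.g(x).flatten(2).transpose(1, 2)       # B x HW x C'
        attn = torch.softmax(q @ k / (k.shape[1] ** 0.5), dim=-1)
        y = (attn @ v).transpose(1, 2).reshape(b, -1, h, w)
        return x + self.out(y)                         # residual connection

class ChannelAttention(nn.Module):
    """Squeeze-and-excitation style channel attention used to re-weight
    encoder features before they are concatenated with decoder features."""
    def __init__(self, channels, reduction=16):
        super().__init__()
        hidden = max(channels // reduction, 1)
        self.fc = nn.Sequential(
            nn.Linear(channels, hidden),
            nn.ReLU(inplace=True),
            nn.Linear(hidden, channels),
            nn.Sigmoid(),
        )

    def forward(self, x):
        w = self.fc(x.mean(dim=(2, 3)))                # global average pool
        return x * w[:, :, None, None]

class AttentionSkipFusion(nn.Module):
    """Skip connection: channel attention on the encoder feature, then
    concatenation with the upsampled decoder feature."""
    def __init__(self, enc_channels):
        super().__init__()
        self.ca = ChannelAttention(enc_channels)

    def forward(self, enc_feat, dec_feat):
        dec_feat = F.interpolate(dec_feat, size=enc_feat.shape[2:],
                                 mode="bilinear", align_corners=False)
        return torch.cat([self.ca(enc_feat), dec_feat], dim=1)
```

Similarly, the multi-level joint weighted loss can be read as a weighted binary cross-entropy applied to every side output of a U2-Net-style decoder plus the fused output, then summed. The focal-style weighting, pos_weight and level_weights below are assumed ways to balance hard/easy samples and the positive/negative imbalance, not the paper's exact formulation.

```python
# Hedged sketch of a multi-level joint weighted loss; the focal form and all
# weights are assumptions, not the paper's definition.
import torch

def weighted_bce(prob, target, pos_weight=2.0, gamma=2.0, eps=1e-6):
    """pos_weight counters the positive/negative pixel imbalance;
    the (1 - p_t)**gamma factor down-weights easy samples."""
    prob = prob.clamp(eps, 1.0 - eps)
    p_t = torch.where(target > 0.5, prob, 1.0 - prob)
    alpha = torch.where(target > 0.5,
                        torch.full_like(prob, pos_weight),
                        torch.ones_like(prob))
    return (-alpha * (1.0 - p_t) ** gamma * p_t.log()).mean()

def multi_level_loss(side_logits, fused_logits, target, level_weights=None):
    """Sum the weighted loss over all decoder side outputs plus the fused map."""
    logits = list(side_logits) + [fused_logits]
    if level_weights is None:
        level_weights = [1.0] * len(logits)
    return sum(w * weighted_bce(torch.sigmoid(o), target)
               for w, o in zip(level_weights, logits))
```

Finally, the reported precision, recall and F1 follow the standard pixel-level definitions for binary lodging masks; the conversion from pixel counts to physical area and the area-accuracy formula below are illustrative assumptions (including the placeholder ground sampling distance), not values taken from the paper.

```python
# Standard pixel metrics plus one common way to derive a physical lodging
# area and an area accuracy; gsd and the accuracy formula are assumptions.
def evaluate(pred_mask, gt_mask, gsd=0.05):
    """pred_mask, gt_mask: binary arrays/tensors of equal shape (1 = lodged)."""
    tp = ((pred_mask == 1) & (gt_mask == 1)).sum().item()
    fp = ((pred_mask == 1) & (gt_mask == 0)).sum().item()
    fn = ((pred_mask == 0) & (gt_mask == 1)).sum().item()
    precision = tp / (tp + fp + 1e-9)
    recall = tp / (tp + fn + 1e-9)
    f1 = 2 * precision * recall / (precision + recall + 1e-9)
    # Pixel count times squared ground sampling distance (m per pixel) -> m^2.
    pred_area = (tp + fp) * gsd ** 2
    gt_area = (tp + fn) * gsd ** 2
    area_accuracy = 1.0 - abs(pred_area - gt_area) / (gt_area + 1e-9)
    return precision, recall, f1, area_accuracy
```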

Cite this article:

SHEN Hualei, SU Xinqi, ZHAO Qiaoli, ZHOU Meng, LIU Dong, ZANG Hecang. Extraction of Lodging Area of Wheat Varieties by Unmanned Aerial Vehicle Remote Sensing Based on Deep Learning[J]. Transactions of the Chinese Society for Agricultural Machinery, 2022, 53(9): 252-260, 341.

History
  • Received: 2022-04-16
  • Published online: 2022-09-10