
Tractor Identification and Positioning Method Based on Depth Image and Neural Network

Authors: WANG Liang, ZHAI Zhiqiang, ZHU Zhongxiang, LI Zhen, DU Yuefeng, MAO Enrong
Affiliation:

Fund Project: National Key Research and Development Program of China (2017YFD0700400-2017YFD0700403)
    Abstract:

    To address the problems of low identification accuracy of the tractor ahead, difficult relative positioning, and the resulting difficulty in ensuring the safety of autonomous operation in multi-machine coordinated navigation, a tractor identification and positioning method based on depth images and a neural network was proposed. A YOLO-ZED neural network recognition model was established to identify and extract tractor features. A ZED camera was used to collect 1100 tractor images at different angles, distances, and resolutions on cloudy and sunny days, and the LabelImg annotation tool was used to manually label the collected images, with the cab marked as the identification target. A tractor positioning model based on the depth image was then established, and the binocular positioning principle was used to calculate the spatial position coordinates of the tractor relative to the host machine. A fixed-point identification and positioning test was performed on a small-power tractor, and the identification and positioning results were measured along the tractor's longitudinal direction, its width direction, and an S-shaped curve. The test results showed that the method could quickly and accurately identify and locate the spatial position of the tractor within a depth range of 3~10 m, with an average identification and positioning speed of 19 frames/s. The maximum absolute errors of positioning the tractor in the camera depth direction and width direction were 0.720 m and 0.090 m, respectively, the maximum relative errors were 7.48% and 8.00%, and the standard deviations were both less than 0.030 m. The method meets the accuracy and speed requirements of tractor target identification for multi-machine coordinated navigation operations.
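The abstract states that LabelImg was used to mark the cab as the identification target in the 1100 collected images, but it does not give the label format. The sketch below is a minimal illustration assuming the common workflow of converting LabelImg's default Pascal VOC XML output into the normalized text format used for YOLO training; the class name "cab" and the file paths are illustrative assumptions, not details from the paper.

import xml.etree.ElementTree as ET

# Illustrative class list: the abstract names the cab as the only target.
CLASSES = ["cab"]

def voc_to_yolo(xml_path):
    """Convert one LabelImg Pascal VOC XML file to YOLO label lines.

    YOLO label format: "<class_id> <x_center> <y_center> <width> <height>",
    all normalized to [0, 1] by the image size.
    """
    root = ET.parse(xml_path).getroot()
    img_w = float(root.find("size/width").text)
    img_h = float(root.find("size/height").text)

    lines = []
    for obj in root.findall("object"):
        cls_id = CLASSES.index(obj.find("name").text)
        box = obj.find("bndbox")
        xmin = float(box.find("xmin").text)
        ymin = float(box.find("ymin").text)
        xmax = float(box.find("xmax").text)
        ymax = float(box.find("ymax").text)
        x_c = (xmin + xmax) / 2.0 / img_w        # normalized box centre x
        y_c = (ymin + ymax) / 2.0 / img_h        # normalized box centre y
        w = (xmax - xmin) / img_w                # normalized box width
        h = (ymax - ymin) / img_h                # normalized box height
        lines.append(f"{cls_id} {x_c:.6f} {y_c:.6f} {w:.6f} {h:.6f}")
    return lines

# Usage (hypothetical file names): write tractor_0001.txt next to tractor_0001.xml
# with open("tractor_0001.txt", "w") as f:
#     f.write("\n".join(voc_to_yolo("tractor_0001.xml")))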
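The abstract only names the binocular positioning principle and does not give the coordinate equations, so the following is a minimal sketch under standard stereo/pinhole assumptions: the YOLO detection supplies the pixel centre of the cab bounding box, the ZED depth map supplies the metric depth at that pixel, and the camera intrinsics (fx, fy, cx, cy) back-project the pixel into width- and height-direction offsets. All function and variable names are illustrative, not taken from the paper.

import numpy as np

def tractor_position_from_detection(bbox, depth_map, fx, fy, cx, cy):
    """Estimate the tractor's 3-D position relative to the camera.

    Minimal sketch of the positioning step described in the abstract.
    `bbox` is the cab detection (x_min, y_min, x_max, y_max) in pixels;
    `depth_map` is the ZED depth image in metres; fx, fy, cx, cy are
    assumed pinhole intrinsics.
    """
    x_min, y_min, x_max, y_max = bbox
    u = (x_min + x_max) / 2.0          # bounding-box centre column (pixels)
    v = (y_min + y_max) / 2.0          # bounding-box centre row (pixels)

    # Median depth over a small window around the centre rejects
    # invalid/NaN stereo pixels better than a single sample.
    win = depth_map[int(v) - 2:int(v) + 3, int(u) - 2:int(u) + 3]
    Z = float(np.nanmedian(win))       # depth along the optical axis, metres

    # Pinhole back-projection: pixel offset -> metric offset at depth Z.
    X = (u - cx) * Z / fx              # width-direction offset, metres
    Y = (v - cy) * Z / fy              # height offset, metres
    return X, Y, Z

# Example with made-up numbers: a 1280x720 depth frame and one cab detection.
if __name__ == "__main__":
    depth = np.full((720, 1280), 6.0)              # fake 6 m depth everywhere
    pos = tractor_position_from_detection(
        bbox=(600, 300, 700, 400), depth_map=depth,
        fx=700.0, fy=700.0, cx=640.0, cy=360.0)
    print("tractor at X=%.2f m, Y=%.2f m, Z=%.2f m" % pos)

In these terms, Z corresponds to the camera depth-direction coordinate (the 3~10 m working range reported above) and X to the width-direction coordinate, the two axes for which the 0.720 m and 0.090 m maximum absolute errors are reported.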

Cite This Article

WANG Liang, ZHAI Zhiqiang, ZHU Zhongxiang, LI Zhen, DU Yuefeng, MAO Enrong. Tractor Identification and Positioning Method Based on Depth Image and Neural Network[J]. Transactions of the Chinese Society for Agricultural Machinery, 2020, 51(s2): 554-560.

History
  • Received: 2020-08-20
  • Revised:
  • Accepted:
  • Published online: 2020-12-10
  • Publication date: 2020-12-10