Inter-rows Navigation Method of Greenhouse Robot Based on Fusion of Camera and LiDAR
Author:
Affiliation:

Author biography:

Corresponding author:

CLC number:

Fund project:

National Key Research and Development Program of China (2020AAA0108103), Independent Project of the Robotics and Intelligent Manufacturing Innovation Institute, Chinese Academy of Sciences (C2021002), and the President's Fund of Hefei Institutes of Physical Science, Chinese Academy of Sciences (YZJJZX202013)


Abstract:

To address the complex greenhouse environment, where the ground is bumpy and branches and leaves occlude the path, an inter-row navigation method for greenhouse robots based on the fusion of camera and LiDAR data was studied. First, an improved U-Net model was used to segment the road region of the image accurately and quickly. Second, the ground point cloud was pre-segmented by fusing the image segmentation result, reducing the tilt of the point cloud caused by ground bumpiness. Then, an improved KMeans algorithm was used to cluster the crop-row point cloud rapidly, and the cluster centers were taken as points of the crop-row trunk region, reducing the influence of occluding branches and leaves on the extraction of the crop-row centerline. Finally, the RANSAC algorithm was used to fit the line equations of the crop rows on both sides and to calculate the navigation line. The accuracy of the navigation line was evaluated experimentally; validation was conducted in two greenhouse scenarios at three typical operating speeds of the greenhouse robot. The results showed that the quality and run time of the image segmentation met the requirements of the subsequent point cloud pre-segmentation; experiments on point cloud frames collected in the bumpy environment showed that the ground point cloud could be effectively calibrated; compared with grid height-difference segmentation of the ground point cloud, the proposed segmentation performed better while adding very little single-frame processing time; and on the test set, more than 94% of the data frames extracted the navigation line accurately, with an average angle error no greater than 1.45°. The results meet the requirements for autonomous navigation of greenhouse robots along crop rows.
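To make the fusion step more concrete, the sketch below shows one way the road mask from the image segmentation could be combined with the LiDAR cloud: points are projected into the image with assumed intrinsics K and LiDAR-to-camera extrinsics (R, t), points that land inside the mask are kept as ground candidates, and a plane fitted to those candidates is used to level the cloud against tilt caused by the bumpy ground. This is a minimal illustration in Python; the pinhole projection, the plane-based leveling, and all names are assumptions, not the authors' implementation.

# Illustrative sketch only: fuse a road mask with a LiDAR cloud to
# pre-segment ground points and level the cloud. K, R, t and the
# plane-based leveling are assumptions, not the paper's exact method.
import numpy as np


def ground_candidates(points, road_mask, K, R, t):
    """Keep LiDAR points whose image projection lands inside the road mask.

    points: (N, 3) in the LiDAR frame; road_mask: (H, W) binary array;
    K: (3, 3) camera intrinsics; R, t: LiDAR-to-camera extrinsics.
    """
    cam = points @ R.T + t                        # LiDAR frame -> camera frame
    in_front = cam[:, 2] > 0.1                    # only points in front of the camera
    uvw = cam[in_front] @ K.T                     # pinhole projection
    uv = np.round(uvw[:, :2] / uvw[:, 2:3]).astype(int)
    h, w = road_mask.shape
    in_img = (uv[:, 0] >= 0) & (uv[:, 0] < w) & (uv[:, 1] >= 0) & (uv[:, 1] < h)
    keep = np.zeros(len(points), dtype=bool)
    idx = np.flatnonzero(in_front)[in_img]
    keep[idx] = road_mask[uv[in_img, 1], uv[in_img, 0]] > 0
    return points[keep]


def level_cloud(points, ground):
    """Fit a plane to the ground candidates and rotate the whole cloud so the
    fitted ground normal aligns with +z, correcting tilt from bumpy ground."""
    centroid = ground.mean(axis=0)
    _, _, vt = np.linalg.svd(ground - centroid)   # plane normal = last right singular vector
    n = vt[-1] if vt[-1, 2] > 0 else -vt[-1]
    z = np.array([0.0, 0.0, 1.0])
    v, c = np.cross(n, z), float(n @ z)
    if np.linalg.norm(v) < 1e-8:                  # already level
        return points - centroid
    vx = np.array([[0, -v[2], v[1]], [v[2], 0, -v[0]], [-v[1], v[0], 0]])
    rot = np.eye(3) + vx + vx @ vx / (1.0 + c)    # Rodrigues rotation aligning n to z
    return (points - centroid) @ rot.T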
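Likewise, the row-line extraction can be sketched as follows: each side's points are clustered, the cluster centers stand in for the trunk region, a line is RANSAC-fitted per row, and the navigation line is taken as the midline of the two rows. Plain scikit-learn KMeans is used here in place of the paper's improved KMeans, and the midline definition, cluster counts, and thresholds are assumptions for illustration only.

# Illustrative sketch only: trunk points via KMeans centers, per-row RANSAC
# line fit, navigation line as the midline of the two fitted rows.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.linear_model import RANSACRegressor


def trunk_points(row_xy, n_clusters=8):
    """Cluster one row's (x, y) points; the centers approximate the stems and
    suppress leaves protruding into the aisle."""
    k = min(n_clusters, len(row_xy))
    return KMeans(n_clusters=k, n_init=10, random_state=0).fit(row_xy).cluster_centers_


def fit_row_line(centers):
    """RANSAC-fit y = a*x + b through one row's trunk points."""
    ransac = RANSACRegressor(residual_threshold=0.05).fit(centers[:, [0]], centers[:, 1])
    return float(ransac.estimator_.coef_[0]), float(ransac.estimator_.intercept_)


def navigation_line(left_xy, right_xy):
    """Navigation line taken as the midline of the two fitted row lines."""
    a_l, b_l = fit_row_line(trunk_points(left_xy))
    a_r, b_r = fit_row_line(trunk_points(right_xy))
    return (a_l + a_r) / 2.0, (b_l + b_r) / 2.0


if __name__ == "__main__":
    # Synthetic two-row scene: rows about 0.6 m left and right of the robot.
    rng = np.random.default_rng(0)
    x = rng.uniform(0.5, 5.0, 300)
    left = np.c_[x, 0.6 + 0.02 * x + rng.normal(0, 0.05, 300)]
    right = np.c_[x, -0.6 + 0.02 * x + rng.normal(0, 0.05, 300)]
    a, b = navigation_line(left, right)
    print(f"navigation line: y = {a:.3f}x {b:+.3f}, heading {np.degrees(np.arctan(a)):.2f} deg")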

Cite this article

WANG Jie, CHEN Zhengwei, XU Zhaosheng, HUANG Zidong, JING Junsen, NIU Runxin. Inter-rows Navigation Method of Greenhouse Robot Based on Fusion of Camera and LiDAR[J]. Transactions of the Chinese Society for Agricultural Machinery, 2023, 54(3): 32-40.

History
  • Received: 2022-04-20
  • Revised:
  • Accepted:
  • Published online: 2023-03-10
  • Published: