
Visual Classification Decision-making Method for Agricultural Robots Based on Ontology and Cognitive Experience

Authors: XIONG Juntao, LIAO Shisheng, LIANG Junhao, WEI Tingting, CHEN Shumian, ZHENG Zhenhui

Affiliation:

Fund projects: National Natural Science Foundation of China (32071912), Guangdong Province Agricultural Science and Technology Innovation "Open Competition" Project on the Ten Main Research Directions (2022SDZG03), and Guangdong Province Special Fund for the Cultivation of College Students' Scientific and Technological Innovation (pdjh2023a0075)

Abstract:

Using human cognitive experience and objective knowledge to assist computers in decision-making under small-sample conditions is of great significance for realizing intelligent cognitive decision-making by agricultural robots and for advancing smart agriculture. Building on image-attribute learning methods such as statistical counting and support vector machines (SVM) for recognizing basic attributes such as image color and shape, a professional knowledge base for fruit recognition and classification was first constructed with tools such as Protégé, based on human cognitive experience and objective knowledge of object recognition. Then, under rules set by expert experience, the color and shape information extracted from an image was used as the input to the knowledge base, and the classification result was obtained by matching and reasoning. In the first experiment, 2091 images of grapes, bananas, and cherries were selected from the Fruit360 public data set as the test set, with 30 images of each fruit selected as the training and validation sets for attribute learning. The classification accuracy was 100% for grapes and cherries and 93.30% for bananas. In the second experiment, after only adding knowledge of yellow peach to the knowledge base, 984 yellow peach images from Fruit360 were tested and classified with 97.05% accuracy. The results show that the proposed method can effectively accomplish image classification decision-making tasks, with good process interpretability, capability sharing, and extensibility.
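The abstract describes a two-stage pipeline: low-level attribute extraction (color by statistical counting, shape by SVM), followed by matching and reasoning over an ontology knowledge base. The sketch below illustrates what the attribute-extraction stage might look like; it is a minimal illustration assuming OpenCV (version 4 or later) and scikit-learn, and the hue bins, shape labels, and function names are hypothetical stand-ins, not the paper's actual implementation.

```python
import cv2
import numpy as np
from sklearn.svm import SVC

# Assumed coarse hue bins (OpenCV hue range is 0-179); illustrative only.
COLOR_BINS = {
    "red":    [(0, 10), (170, 179)],
    "yellow": [(20, 35)],
    "green":  [(36, 85)],
    "purple": [(125, 155)],
}

def dominant_color(bgr_image):
    """Dominant color by statistical counting over the hue channel."""
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    hue = hsv[..., 0][hsv[..., 1] > 40]  # drop low-saturation (background) pixels
    counts = {label: sum(int(np.sum((hue >= lo) & (hue <= hi)))
                         for lo, hi in ranges)
              for label, ranges in COLOR_BINS.items()}
    return max(counts, key=counts.get)

def shape_features(bgr_image):
    """7-D shape descriptor: log-scaled Hu moments of the largest contour."""
    gray = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2GRAY)
    # Fruit360 images have white backgrounds, so invert the Otsu threshold.
    _, mask = cv2.threshold(gray, 0, 255,
                            cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    hu = cv2.HuMoments(cv2.moments(max(contours, key=cv2.contourArea))).flatten()
    return np.sign(hu) * np.log10(np.abs(hu) + 1e-12)

# Shape SVM trained on the 30-image-per-class training set; "round",
# "long-curved", "cluster" is an assumed attribute vocabulary.
shape_svm = SVC(kernel="rbf")
# shape_svm.fit(np.stack([shape_features(img) for img in train_images]), labels)
```

The decision stage queries the knowledge base built with Protégé. In place of OWL reasoning, the following dictionary-based stand-in shows only the decision flow and why the yellow-peach extension needs new knowledge rather than retraining; the attribute-to-fruit rules here are assumptions for illustration.

```python
# Stand-in knowledge base: (color, shape) attribute pairs mapped to fruit
# concepts. The real system encodes these facts in an OWL ontology.
FRUIT_KNOWLEDGE = {
    ("purple", "cluster"):     "grape",
    ("green",  "cluster"):     "grape",
    ("yellow", "long-curved"): "banana",
    ("red",    "round"):       "cherry",
}

def classify(color, shape):
    """Match extracted attributes against the knowledge base."""
    return FRUIT_KNOWLEDGE.get((color, shape), "unknown")

# Mirroring the yellow-peach experiment: extend the knowledge base only,
# leaving the color counter and shape SVM untouched.
FRUIT_KNOWLEDGE[("yellow", "round")] = "yellow peach"
print(classify("yellow", "round"))   # -> yellow peach
```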

Cite this article

XIONG Juntao, LIAO Shisheng, LIANG Junhao, WEI Tingting, CHEN Shumian, ZHENG Zhenhui. Visual classification decision-making method for agricultural robots based on ontology and cognitive experience[J]. Transactions of the Chinese Society for Agricultural Machinery, 2023, 54(2): 208-215.

History
  • Received: 2022-04-08
  • Published online: 2022-04-30