
Scientia Silvae Sinicae (林业科学) ›› 2017, Vol. 53 ›› Issue (11): 120-130. doi: 10.11707/j.1001-7488.20171114

• Papers and Research Reports •

Method for Extracting the Angle between Forewings in the 3D Pose of Moths Based on Machine Vision

Zhang Ruike1,2, Chen Meixiang2, Li Ming2, Yang Xinting2, Wen Junbao1

  1. Beijing Key Laboratory for Forest Pest Control, Beijing Forestry University, Beijing 100083;
  2. National Engineering Research Center for Information Technology in Agriculture, Key Laboratory for Information Technologies in Agriculture of the Ministry of Agriculture, Beijing Engineering Research Center for Agricultural Internet of Things (IoT), Beijing 100097
  • Received: 2017-01-05  Revised: 2017-06-11  Online: 2017-11-25  Published: 2017-12-13
  • Funding: Beijing Natural Science Foundation (4132027); Beijing Natural Science Foundation Youth Project (6164034); National Natural Science Foundation of China Youth Science Fund Project (31301238)

Abstract: [Objective] In the automatic identification and classification of agricultural and forestry insect pests, accurate acquisition of the 3D pose of the target moth can optimize the recognition process and improve recognition efficiency. By quantifying the complex 3D pose of moth pests and accurately obtaining 3D pose data of the insect body, the problem of information loss in 2D pose recognition can be overcome and the robustness of the algorithm improved, laying a foundation for the automatic identification of moths. [Method] Taking the cotton bollworm (Helicoverpa armigera) as an example, a machine-vision-based method for calculating the angle between the forewings of a moth was proposed to determine the 3D pose of the insect: marked feature points on the forewings were extracted by corner detection, their spatial coordinates were obtained, and the angle between the forewings was then calculated. [Result] The method obtained the angle between the forewings of H. armigera adults quickly, conveniently and accurately, with a relative error of 0.10%-3.96%; deviation analysis against laser measurement gave a root mean square error of 1.4216, and a paired t-test showed no significant difference, indicating that the proposed method is feasible. [Conclusion] Taking the cotton bollworm as an example, a machine-vision-based method for calculating the angle between the forewings from marked feature points was proposed. The average computation time was only 14.6 s, less than the 1 min required by laser measurement, so computational efficiency is also improved. The method provides an important technical means for the automatic monitoring and rapid identification of moth pests in multiple poses.
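The condensed method statement above gives only the geometric idea: marked feature points on the two forewings, their spatial coordinates, and the angle between the forewings. The authors report computing this step in MATLAB; the following is a minimal Python/NumPy sketch of the same spatial-geometry calculation, assuming three 3D marker coordinates are already available (one near the wing bases and one on each forewing). The point layout and function name are illustrative assumptions, not the authors' exact marking scheme.

```python
import numpy as np

def forewing_angle_deg(base, left_tip, right_tip):
    """Angle (in degrees) between the two forewing vectors.

    Each argument is an (x, y, z) coordinate, e.g. in millimetres after
    scaling pixel coordinates with a calibration coefficient.
    """
    v_left = np.asarray(left_tip, float) - np.asarray(base, float)
    v_right = np.asarray(right_tip, float) - np.asarray(base, float)
    cos_theta = v_left @ v_right / (np.linalg.norm(v_left) * np.linalg.norm(v_right))
    # Clip to guard against round-off pushing the cosine slightly outside [-1, 1].
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))

# Made-up coordinates (mm) for illustration: forewings spread roughly 90 degrees.
print(forewing_angle_deg((0, 0, 0), (10, 10, 2), (10, -10, 2)))
```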

Key words: machine vision, 3D pose, moth, angle between forewings, feature point extraction

Abstract: [Objective] In this study, the 3D pose of complex moth pests was quantified and accurate 3D pose data of the insect body were acquired, which makes it possible to overcome the problem of missing information in 2D image recognition and to improve the robustness of the algorithm. [Method] This study used Helicoverpa armigera (Lepidoptera: Noctuidae) as the experimental object. Firstly, images were obtained in a closed box in which three cameras were set; a triangular three-dimensional coordinate system was built from a white grid plate with 1 cm × 1 cm cells and a ring light source. Before preprocessing, the images were cropped to 935 × 568 pixels to bound the target moth. To enhance the visibility of the target area, the images were converted from the RGB to the HSV color space and grayscale images of the H, S and V components were obtained separately. Comparison of the three component images showed that the S-component image preserved the target region most completely. Because considerable noise remained after this preprocessing, a median filter was applied to remove isolated noise points while keeping the edge features of the image. Secondly, the mark points on the moth forewings were extracted with the Harris corner detector, yielding the pixel coordinates of the feature points. The grid cell of the white reference plate measured 366 pixels in the image and 1 cm in reality, so the calibration coefficient was 1/366 cm·pixel-1 (about 0.027 mm·pixel-1). Finally, the angle between the forewings of H. armigera was calculated in MATLAB according to the principles of spatial geometry. [Result] The preprocessing results showed that image segmentation based on color space conversion not only weakened the brightness of the background but also kept the target moth intact, so the angle between the forewings could be obtained accurately. The calculated results were then compared with laser measurements: the relative error ranged from 0.10% to 3.96%, the minimum root mean square error (RMSE) was 1.4216, and a paired t-test showed no significant difference between the calculated results and the manual measurements. The larger errors in some of the calculated angles were attributed to the mark points being placed manually. [Conclusion] This paper proposes a new approach for acquiring the angle between the forewings of H. armigera. The calculated results are consistent with the manual measurements and can provide 3D pose data. The algorithm takes only 14.6 s, less than the laser measuring method, so computational efficiency is also improved. The proposed approach can improve the accuracy and robustness of moth pest identification and is of practical significance for future applications.
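To make the pipeline in the [Method] paragraph concrete (HSV conversion, S-component grayscale, median filtering, Harris corner extraction, pixel-to-physical calibration), here is a hedged sketch in Python/OpenCV. The paper itself used MATLAB; the file name, Harris parameters and response threshold below are illustrative assumptions rather than the authors' settings.

```python
import cv2
import numpy as np

# 366 pixels on the white reference grid correspond to 1 cm (10 mm),
# so one pixel corresponds to roughly 0.027 mm.
MM_PER_PIXEL = 10.0 / 366.0

img = cv2.imread("moth_view.png")            # hypothetical single-camera image
if img is None:
    raise FileNotFoundError("moth_view.png")

hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)   # BGR -> HSV colour space
s_channel = hsv[:, :, 1]                     # S component keeps the moth region intact
s_filtered = cv2.medianBlur(s_channel, 5)    # median filter removes isolated noise points

# Harris corner response on the filtered S component; the threshold picks
# candidate mark points on the forewings (blockSize=2, ksize=3, k=0.04 are illustrative).
response = cv2.cornerHarris(np.float32(s_filtered), 2, 3, 0.04)
rows, cols = np.where(response > 0.01 * response.max())

# Convert pixel coordinates to physical units with the calibration coefficient.
mark_points_mm = np.column_stack([cols, rows]).astype(float) * MM_PER_PIXEL
print(mark_points_mm[:5])
```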

Key words: machine vision, 3D pose, moth, angle between forewings, corner detection

CLC number: