
Scientia Silvae Sinicae ›› 2017, Vol. 53 ›› Issue (11): 120-130. doi: 10.11707/j.1001-7488.20171114


Method of Extracting Forewings Angle of 3D Pose for the Moth Based on Machine Vision

Zhang Ruike1,2, Chen Meixiang2, Li Ming2, Yang Xinting2, Wen Junbao1   

  1. Beijing Key Laboratory for Forest Pest Control, Beijing Forestry University, Beijing 100083;
    2. National Engineering Research Center for Information Technology in Agriculture, Key Laboratory for Information Technologies in Agriculture, Ministry of Agriculture, Beijing Engineering Research Center for Agricultural IOT, Beijing 100097
  • Received: 2017-01-05; Revised: 2017-06-11; Online: 2017-11-25; Published: 2017-12-13

Abstract: [Objective] In this study, the 3D pose of moth pests with complex postures was quantified and the 3D pose information of the insects was acquired accurately, in order to overcome the loss of information inherent in 2D image recognition and to improve the robustness of the algorithm. [Method] Helicoverpa armigera (Lepidoptera: Noctuidae) was used as the experimental object. Firstly, images were acquired in a closed box equipped with three cameras; a triangular three-dimensional coordinate system was formed by a 1 cm × 1 cm white grid plate and a ring light source. Before preprocessing, the images were cropped to 935 pixels × 568 pixels to bound the target moth. To enhance the visibility of the target area, the images were converted from RGB to HSV color space and the H, S and V component grayscale images were obtained. Comparing the three component images, the S-component grayscale image clearly preserved the integrity of the target region most effectively. Because considerable noise remained after this preprocessing, a median filter was applied to remove isolated noise points while preserving the image edge features. Secondly, the marker points on the forewings of the moth were extracted with the Harris corner detector, yielding the pixel coordinates of the feature points. Then, with the white coordinate plate as the reference object, 366 pixels corresponded to an actual size of 1 cm, so the calibration coefficient was 1/366 mm·pixel⁻¹. Finally, according to the principles of space geometry, the forewing angle of H. armigera was calculated in MATLAB. [Result] The preprocessing results showed that image segmentation based on color space conversion could not only weaken the brightness of the background but also preserve the entire target moth, and on this basis the forewing angle could be obtained accurately. The calculated results were then compared with laser measurements. The experimental results showed that the relative error was between 0.10% and 3.96%, the minimum root mean square error (RMSE) was 1.421 6, and a paired t-test showed no significant difference between the calculated results and the manual measurements. In addition, the larger errors in the calculated forewing angle were attributed to the manual selection of the marker points. [Conclusion] This paper proposes a new approach to acquiring the forewing angle of H. armigera; the calculated results are consistent with manual measurements and can provide 3D pose data. Moreover, the algorithm takes only 14.6 s, less than the laser measuring method, which improves computing efficiency. The proposed approach could improve the accuracy and robustness of moth pest identification and has important significance for future practical applications.
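To make the described pipeline concrete, the following is a minimal Python/OpenCV sketch of the main steps named in the abstract: S-channel extraction in HSV space, median filtering, Harris corner detection, scale calibration, and the space-geometry angle computation. It is an illustrative reconstruction, not the authors' MATLAB implementation; the file handling, crop offsets, thresholds, window sizes, and marker coordinates are all assumptions.

# Illustrative sketch of the pipeline described in the abstract (not the authors' code).
# Assumes OpenCV (cv2) and NumPy; all parameters and inputs are placeholders.
import cv2
import numpy as np

SCALE = 1.0 / 366.0  # calibration coefficient reported in the abstract (per pixel)

def preprocess(path):
    """Crop the frame, convert to HSV, keep the S channel, and median-filter it."""
    bgr = cv2.imread(path)
    bgr = bgr[:568, :935]                       # illustrative crop to 935 x 568 pixels
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)  # RGB/BGR -> HSV color space
    s = hsv[:, :, 1]                            # S component preserves the moth region best
    return cv2.medianBlur(s, 5)                 # remove isolated noise, keep edge features

def harris_corners(gray, thresh=0.01):
    """Return pixel coordinates of Harris corner responses above a relative threshold."""
    resp = cv2.cornerHarris(np.float32(gray), blockSize=2, ksize=3, k=0.04)
    ys, xs = np.where(resp > thresh * resp.max())
    return np.stack([xs, ys], axis=1)

def forewing_angle(p_base, p_left_tip, p_right_tip):
    """Angle (degrees) between the two forewing vectors defined by 3D marker points."""
    v1 = np.asarray(p_left_tip, float) - np.asarray(p_base, float)
    v2 = np.asarray(p_right_tip, float) - np.asarray(p_base, float)
    cos_a = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0)))

# Example with made-up 3D marker coordinates (already converted with SCALE):
# print(forewing_angle([0, 0, 0], [1.2, 0.8, 0.3], [1.2, -0.8, 0.3]))

In the paper the 3D coordinates of the marker points come from the three-camera setup and the white grid reference plate; in this sketch they are simply assumed to be available as inputs to forewing_angle.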

Key words: machine vision, 3D pose, moth, forewings angle, corner detection
