Recognition and localization system of the robot for harvesting Hangzhou White Chrysanthemums

Qinghua Yang, Chun Chang, Guanjun Bao, Jun Fan, Yi Xun

Abstract


To realize the robotic harvesting of Hangzhou White Chrysanthemums, a quick recognition and 3D vision localization system for target chrysanthemums was investigated in this study. The system consisted of three main stages. Firstly, an end-effector and a simple manipulator with three degrees of freedom were designed to meet the quality requirements of harvesting Hangzhou White Chrysanthemums. Secondly, a segmentation based on HSV color space was performed, and a fast Fuzzy C-means (FCM) algorithm based on the S component was proposed to extract the target image from the irrelevant background. Thirdly, binocular stereo vision was used to acquire the spatial information of the target. According to the shape of Hangzhou White Chrysanthemums, the centroids of the stamens were selected as feature points to be matched between the left and right images. The experimental results showed that the proposed method was able to recognize Hangzhou White Chrysanthemums with an accuracy of 85%. When the distance between the target and the baseline was 150-450 mm, the errors between the calculated and measured distances were less than 14 mm, which could meet the localization accuracy requirements of the harvesting robot.
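The segmentation stage combines HSV color-space conversion with Fuzzy C-means clustering on the S (saturation) component. The following Python sketch illustrates one way such a step could be implemented; the cluster count, the FCM parameters, and the heuristic for selecting the flower cluster are assumptions for illustration, not the paper's exact algorithm.

import cv2
import numpy as np

def fcm_segment_s_channel(bgr_image, n_clusters=2, m=2.0, max_iter=50, tol=1e-4):
    # Convert to HSV and keep only the S (saturation) component, as in the abstract.
    hsv = cv2.cvtColor(bgr_image, cv2.COLOR_BGR2HSV)
    s = hsv[:, :, 1].astype(np.float64).reshape(-1, 1)

    # Random fuzzy membership matrix U (pixels x clusters); each row sums to 1.
    rng = np.random.default_rng(0)
    u = rng.random((s.shape[0], n_clusters))
    u /= u.sum(axis=1, keepdims=True)

    for _ in range(max_iter):
        um = u ** m
        centers = (um.T @ s) / um.sum(axis=0)[:, None]        # cluster centers (n_clusters, 1)
        dist = np.abs(s - centers.T) + 1e-10                   # pixel-to-center distances (n_pixels, n_clusters)
        # Standard FCM membership update: u_ij = 1 / sum_k (d_ij / d_ik)^(2/(m-1))
        u_new = 1.0 / ((dist[:, :, None] / dist[:, None, :]) ** (2.0 / (m - 1.0))).sum(axis=2)
        if np.linalg.norm(u_new - u) < tol:
            u = u_new
            break
        u = u_new

    labels = np.argmax(u, axis=1).reshape(hsv.shape[:2])
    # Assumed heuristic: the white flower region is weakly saturated, so take
    # the cluster with the lower saturation center as the target mask.
    flower_cluster = int(np.argmin(centers.ravel()))
    return (labels == flower_cluster).astype(np.uint8) * 255

# Illustrative usage with a hypothetical input image:
# mask = fcm_segment_s_channel(cv2.imread("chrysanthemum.jpg"))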
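For the localization stage, once a stamen centroid has been matched between the left and right images, depth can be recovered from the disparity of a rectified stereo pair via the standard relation Z = fB/d. The sketch below shows this triangulation step; the focal length, baseline, and centroid coordinates are placeholder values, not the calibration reported in the paper.

import numpy as np

def triangulate_centroid(left_centroid, right_centroid, focal_px, baseline_mm, principal_point):
    # Recover the 3D position (mm, camera frame) of a matched stamen centroid
    # from a rectified stereo pair using Z = f * B / d.
    (ul, vl), (ur, _) = left_centroid, right_centroid
    cx, cy = principal_point
    disparity = ul - ur                        # horizontal pixel offset between the two views
    if disparity <= 0:
        raise ValueError("Non-positive disparity: centroids are not a valid match.")
    z = focal_px * baseline_mm / disparity     # depth along the optical axis
    x = (ul - cx) * z / focal_px               # lateral offset from the optical center
    y = (vl - cy) * z / focal_px               # vertical offset from the optical center
    return np.array([x, y, z])

# Illustrative usage with assumed calibration and matched centroid coordinates:
point = triangulate_centroid((412.0, 263.5), (377.5, 263.5),
                             focal_px=800.0, baseline_mm=60.0,
                             principal_point=(320.0, 240.0))
print(point)  # estimated [X, Y, Z] in mm for this assumed geometry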
Keywords: Hangzhou White Chrysanthemums, harvesting robot, recognition, localization, Fuzzy C-means (FCM), binocular vision, stereo matching
DOI: 10.25165/j.ijabe.20181101.3683

Citation: Yang Q H, Chang C, Bao G J, Fan J, Xun Y. Recognition and localization system of the robot for harvesting Hangzhou White Chrysanthemums. Int J Agric & Biol Eng, 2018; 11(1): 88–95.
