Design and experiment of visual navigated UGV for orchard based on Hough matrix and RANSAC

Mingkuan Zhou, Junfang Xia, Fang Yang, Kan Zheng, Mengjie Hu, Dong Li, Shuai Zhang

Abstract


The objective of this study was to develop a visual navigation system capable of guiding an unmanned ground vehicle (UGV) travelling between tree rows in an outdoor orchard. Whereas most previous research has developed algorithms that deal with ground structures in the orchard, this study focused on the background of canopy plus sky in order to eliminate interference factors such as inconsistent lighting, shadows, and color similarities among features. Because the traditional Hough transform and the least squares method are difficult to apply under outdoor conditions, an algorithm combining a Hough matrix and random sample consensus (RANSAC) was proposed to extract the navigation path. In the image segmentation stage, the H-component was adopted to extract the target path of the canopy plus sky. After denoising and smoothing the image by morphological operations, line scanning was used to determine the midpoints of the target path. For navigation path extraction, feature points were extracted through the Hough matrix to eliminate redundant points, and RANSAC was used to reduce the impact of noise points caused by differing canopy shapes and to fit the navigation path. The path acquisition experiment showed that the accuracy of the Hough matrix and RANSAC method was 90.36%-96.81% and that the program ran within 0.55 s under different sunlight intensities. This method was superior to the traditional Hough transform in both real-time performance and accuracy, and had higher accuracy but slightly worse real-time performance than the least squares method. Furthermore, an OpenMV camera was used to capture ground information in the orchard. Experiments showed that the recognition rate of the OpenMV camera for identifying turning information was 100%, with a program running time of 0.17-0.19 s. Field experiments showed that the UGV could autonomously navigate between rows with a maximum lateral error of 0.118 m and turn automatically at row ends. The algorithm satisfied the practical operation requirements of autonomous vehicles in the orchard, so the UGV has the potential to guide multipurpose agricultural vehicles in outdoor orchards in the future.
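The abstract outlines a four-stage pipeline: H-component segmentation of the canopy-plus-sky background, morphological denoising, line scanning for path midpoints, and RANSAC fitting of the navigation line. Below is a minimal Python/OpenCV sketch of that pipeline for illustration only; the function name extract_navigation_path, the H-channel thresholds, and the two-point RANSAC variant are assumptions of this sketch, not values or code from the paper, and the Hough-matrix feature-point selection step is omitted because its details are not given in the abstract.

```python
# Minimal sketch (not the authors' code) of the pipeline described in the
# abstract: H-component segmentation, morphological denoising, line-scan
# midpoint extraction, and RANSAC line fitting.  All thresholds and
# iteration counts below are illustrative assumptions.
import cv2
import numpy as np

def extract_navigation_path(bgr, h_low=90, h_high=130,
                            n_iter=200, inlier_tol=5.0):
    # 1. Segment the canopy-plus-sky background on the H component of HSV
    #    space (90-130 roughly covers sky blue on OpenCV's 0-179 hue
    #    scale; the paper's actual threshold is not given here).
    hsv = cv2.cvtColor(bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv[:, :, 0], h_low, h_high)

    # 2. Morphological opening and closing to denoise and smooth the mask.
    kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, kernel)

    # 3. Line scanning: take the midpoint of the segmented region in each
    #    image row as a candidate point on the target path.
    pts = []
    for y in range(mask.shape[0]):
        xs = np.flatnonzero(mask[y])
        if xs.size:
            pts.append((0.5 * (xs[0] + xs[-1]), float(y)))
    pts = np.asarray(pts)
    if len(pts) < 2:
        raise ValueError("too few path points to fit a line")

    # 4. RANSAC: repeatedly fit the line x = a*y + b to two random points,
    #    keep the hypothesis with the most inliers, then refit on inliers.
    rng = np.random.default_rng(0)
    best = None
    for _ in range(n_iter):
        (x1, y1), (x2, y2) = pts[rng.choice(len(pts), 2, replace=False)]
        if y1 == y2:
            continue
        a = (x2 - x1) / (y2 - y1)
        b = x1 - a * y1
        inliers = np.abs(pts[:, 0] - (a * pts[:, 1] + b)) < inlier_tol
        if best is None or inliers.sum() > best.sum():
            best = inliers
    a, b = np.polyfit(pts[best, 1], pts[best, 0], 1)
    return a, b  # navigation line in image coordinates: x = a*y + b
```

In the paper, the Hough matrix is applied between steps 3 and 4 to discard redundant midpoints before fitting; this sketch feeds all midpoints directly to RANSAC instead.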
Keywords: visual navigation, unmanned ground vehicle, Hough matrix, RANSAC algorithm, orchard, H-component
DOI: 10.25165/j.ijabe.20211406.5953

Citation: Zhou M K, Xia J F, Yang F, Zheng K, Hu M J, Li D, et al. Design and experiment of visual navigated UGV for orchard based on Hough matrix and RANSAC. Int J Agric & Biol Eng, 2021; 14(6): 176–184.
