Intelligent navigation algorithm of plant phenotype detection robot based on dynamic credibility evaluation

Wei Lu, Mengjie Zeng, Huanhuan Qin

Abstract


Due to the non-standardized and complex nature of the farmland environment, the Global Navigation Satellite System (GNSS) navigation signal may be degraded by tree shade, and visual navigation is susceptible to winged insects and mud, which makes the navigation information of the plant phenotype detection robot unreliable. To solve this problem, this study proposed a multi-sensor information fusion intelligent navigation algorithm based on dynamic credibility evaluation. First, three navigation methods were studied: deeply coupled GNSS/Inertial Navigation System (INS) navigation, depth image-based visual navigation, and maize image sequence navigation. Then a credibility evaluation model based on a deep belief network was established, which used dynamically updated credibility scores to intelligently fuse the navigation results, reducing data fusion errors and enhancing navigation reliability. Finally, the algorithm was deployed on the plant phenotype detection robot for experimental testing in the field. The results show that the navigation error is within 2.7 cm and that the multi-sensor information fusion method outperforms each single navigation method. The proposed method uses the deep belief network credibility model to fuse navigation information, effectively solving the problem of reliable navigation for the plant phenotype detection robot in the complex farmland environment, and has important application prospects.
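To make the fusion step concrete, the following Python sketch combines the three navigation estimates with dynamically updated credibility weights. This is a minimal illustration under stated assumptions: the function name `fuse_navigation`, the two-component state (lateral offset, heading error), and the example credibility values are hypothetical, and the deep belief network that produces the credibility scores in the paper is not reproduced here.

```python
# Hypothetical sketch: credibility-weighted fusion of three navigation
# estimates (deeply coupled GNSS/INS, depth-image vision, maize image
# sequence). The credibility scores stand in for the output of the
# paper's deep belief network model, which is not reproduced here.
import numpy as np

def fuse_navigation(estimates, credibilities):
    """Fuse navigation estimates by normalized credibility weights.

    estimates: (n_sources, state_dim) array, one row per navigation method.
    credibilities: length-n_sources scores in [0, 1], dynamically updated
        as sensing conditions change.
    """
    w = np.asarray(credibilities, dtype=float)
    w = w / w.sum()                       # normalize weights to sum to 1
    return w @ np.asarray(estimates)      # credibility-weighted average

# Example state per method: [lateral offset (cm), heading error (deg)].
estimates = [[2.1, 0.8],   # deeply coupled GNSS/INS navigation
             [3.0, 1.1],   # depth image-based visual navigation
             [2.4, 0.9]]   # maize image sequence navigation
credibilities = [0.9, 0.3, 0.7]  # vision down-weighted, e.g. mud on lens
print(fuse_navigation(estimates, credibilities))  # fused [offset, heading]
```

Down-weighting a degraded source (here the mud-affected camera) pulls the fused estimate toward the more trustworthy GNSS/INS and image-sequence results, which is the intuition behind letting dynamically evaluated credibility drive the fusion.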
Keywords: plant phenotype detection, robot, dynamic credibility evaluation, intelligent navigation, multi-sensor information fusion
DOI: 10.25165/j.ijabe.20211406.6615

Citation: Lu W, Zeng M J, Qin H H. Intelligent navigation algorithm of plant phenotype detection robot based on dynamic credibility evaluation. Int J Agric & Biol Eng, 2021; 14(6): 195–206.

