Enhanced tree trunk detection for the autonomous field mower via LiDAR-camera fusion in complex environments

Authors

  • Jie Ji 1
  • Jianhang Yang 1
  • Mengling Wang 1
  • Bohan Zhang 1
  • Yang Liu 2

  1. College of Engineering and Technology, Southwest University, Chongqing 400715, China
  2. China Automotive Engineering Research Institute Co., Ltd, Chongqing 401122, China

Keywords:

Autonomous field mower, tree trunk detection, multi-sensor fusion, PointPillars network, YOLOv8n

Abstract

The increasingly widespread application of autonomous field mowers in agriculture has significantly heightened the demand for precise and reliable tree trunk detection technologies, particularly in complex and challenging operational environments. To overcome the inherent limitations of single-sensor systems, such as the sparse point cloud resolution in Light Detection and Ranging (LiDAR), photometric sensitivity in camera-based methods, and persistent occlusion interference, this study proposes a multi-sensor fusion framework that integrates data from multi-line LiDAR and a monocular camera for robust tree trunk detection. First, a spatio-temporal calibration framework was developed to ensure accurate alignment of multi-source data. Subsequently, the PointPillars network was utilized for efficient extraction of 3D point cloud features, while an improved You Only Look Once Version 8 Nano (YOLOv8n) model was integrated to enable precise 2D image feature extraction. Additionally, the Complete Intersection over Union (CIoU) fusion strategy was adopted to enable effective cross-modal bounding box matching. Experimental results demonstrate that the proposed fusion approach achieves average positioning errors of 0.0619 m in the horizontal direction and 0.0583 m in the vertical direction, along with a tree trunk detection accuracy of 93.68%. This method effectively resolves the false detection issues typically encountered with traditional point cloud clustering algorithms in complex environments, while also mitigating performance degradation in vision-based detection under complex texture conditions. The proposed framework presents an innovative approach to environment-aware perception for autonomous mowing operations.      
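The cross-modal matching step described in the abstract scores each pair of a projected LiDAR box and a camera box with Complete IoU (CIoU), which augments plain IoU with a normalized center-distance term and an aspect-ratio penalty (Zheng et al., 2020). The sketch below is an illustrative implementation, not the authors' code; the box coordinates and the matching loop are hypothetical.

```python
import math

def ciou(box_a, box_b):
    """Complete IoU between two axis-aligned boxes given as (x1, y1, x2, y2).

    CIoU = IoU - rho^2/c^2 - alpha*v, where rho is the distance between box
    centers, c the diagonal of the smallest enclosing box, and v penalizes
    aspect-ratio mismatch.
    """
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b

    # Intersection over union
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    iou = inter / (area_a + area_b - inter)

    # Squared center distance over squared enclosing-box diagonal
    rho2 = ((ax1 + ax2 - bx1 - bx2) / 2) ** 2 + ((ay1 + ay2 - by1 - by2) / 2) ** 2
    c2 = (max(ax2, bx2) - min(ax1, bx1)) ** 2 + (max(ay2, by2) - min(ay1, by1)) ** 2

    # Aspect-ratio consistency term
    v = (4 / math.pi ** 2) * (
        math.atan((ax2 - ax1) / (ay2 - ay1)) - math.atan((bx2 - bx1) / (by2 - by1))
    ) ** 2
    alpha = v / ((1 - iou) + v + 1e-9)

    return iou - rho2 / c2 - alpha * v

# Hypothetical matching: pick the camera detection with the highest CIoU
# against one LiDAR detection projected into the image plane.
lidar_box = (100, 120, 180, 300)
camera_boxes = [(400, 100, 450, 280), (105, 118, 176, 295)]
best = max(camera_boxes, key=lambda b: ciou(lidar_box, b))
```

Because CIoU also rewards center proximity and similar aspect ratio, it separates nearby trunk candidates more reliably than plain IoU when projected LiDAR boxes and camera boxes only partially overlap.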


DOI: 10.25165/j.ijabe.20261901.10196

Citation: Ji J, Yang J H, Wang M L, Zhang B H, Liu Y. Enhanced tree trunk detection for the autonomous field mower via LiDAR-camera fusion in complex environments. Int J Agric & Biol Eng, 2026; 19(1): 213–225.

Author Biography

Jie Ji, Associate Professor, College of Engineering and Technology, Southwest University, Chongqing 400715, China


Published

2026-03-16


Section

Information Technology, Sensors and Control Systems