
Localization Method for Agricultural Robots Based on Fusion of LiDAR and IMU


[Objective/Significance] Precise and reliable localization is an essential prerequisite for the autonomous navigation of intelligent agricultural robots, yet the commonly used Global Navigation Satellite System (GNSS) positioning method is easily degraded in agricultural environments by tree occlusion, electromagnetic interference, and other factors. Therefore, a localization method for agricultural robots based on the fusion of three-dimensional light detection and ranging (LiDAR) and inertial measurement unit (IMU) information is proposed. [Methods] First, an angle-based clustering method was applied to the LiDAR point cloud data and combined with the three-dimensional normal distribution transform (3D-NDT) localization algorithm to achieve real-time LiDAR-based localization on a prior point cloud map. Second, to overcome the limitations of single-sensor localization, the extended Kalman filter (EKF) algorithm was used to fuse the LiDAR localization information with IMU odometry, further improving the localization accuracy of the agricultural robot. Finally, experiments were conducted both in the Gazebo simulation environment of the Robot Operating System (ROS) and in real operation scenarios to verify the effectiveness of the proposed localization algorithm. [Results and Discussion] The average longitudinal and lateral localization errors of the fusion method were 1.7 and 1.8 cm in the simulation environment and 3.3 and 3.3 cm in the field experiments, respectively, both smaller than the errors of the traditional 3D-NDT localization algorithm. [Conclusions] The proposed fusion localization method meets the localization requirements for autonomous operation of agricultural robots in GNSS-degraded environments and provides a new localization approach for agricultural robots.

[Objective] High-precision localization technology is a crucial foundation for the autonomous navigation operations of intelligent agricultural robots. However, the traditional global navigation satellite system (GNSS) localization method faces numerous limitations in agricultural environments: tree shading, electromagnetic interference, and other factors challenge the accuracy and reliability of localization. To address these deficiencies and achieve precise localization of agricultural robots independent of GNSS, a localization method was proposed based on the fusion of three-dimensional light detection and ranging (LiDAR) data and inertial measurement unit (IMU) information to enhance localization accuracy and reliability.

[Methods] LiDAR was used to obtain point cloud data of the agricultural environment and realize self-localization via point cloud matching. By integrating real-time motion parameter measurements from the IMU with the LiDAR data through a fusion algorithm, a high-precision localization solution for agricultural robots was achieved. First, the LiDAR point cloud data was preprocessed and stored as a depth map. This approach reduced the dimensionality of the original LiDAR point cloud and eliminated the disorder of its arrangement, facilitating traversal and clustering through graph search. Given the presence of numerous distinct crops, such as trees, in the agricultural environment, an angle-based clustering method was adopted: specific angle-based clustering criteria grouped the point cloud data into separate clusters, so that salient crops in the environment were effectively perceived. Furthermore, to improve the accuracy and stability of positioning, an improved three-dimensional normal distribution transform (3D-NDT) localization algorithm was proposed. This algorithm matched the LiDAR-scanned point cloud data in real time against a pre-built downsampled point cloud map to achieve real-time localization. Because direct downsampling of LiDAR point clouds in the agricultural environment could discard crucial environmental data, a point cloud clustering operation was used in place of the downsampling operation, thereby improving matching accuracy and positioning precision. Second, to address the constraints and shortcomings of single-sensor localization, a multi-sensor information fusion strategy was deployed to improve localization accuracy. Specifically, the extended Kalman filter (EKF) algorithm was chosen to fuse the LiDAR point cloud localization data with the IMU odometry information. The IMU provided essential motion parameters, such as the acceleration and angular velocity of the agricultural robot, and by combining these with the LiDAR-derived localization information, the robot's pose could be estimated more accurately. This fusion approach exploited the complementary advantages of the different sensors, compensated for their individual limitations, and improved the overall localization accuracy of the agricultural robot.
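The angle-based clustering criterion described in the Methods can be sketched as follows. This is a minimal illustration on a single scan ring (the paper operates on a full depth map): for two consecutive range returns, the angle β between the line connecting them and the longer beam is large on a continuous surface and small across a depth jump. The function name and the 10° threshold are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def angle_based_clusters(ranges, angle_increment, beta_min=np.radians(10.0)):
    """Label consecutive beams of one scan ring with cluster ids.

    For neighboring returns with ranges d1 >= d2 separated by angle alpha,
    beta = atan2(d2*sin(alpha), d1 - d2*cos(alpha)) is large when the two
    returns lie on a continuous surface and small across a depth jump.
    """
    labels = np.zeros(len(ranges), dtype=int)
    cluster = 0
    for i in range(1, len(ranges)):
        d1 = max(ranges[i - 1], ranges[i])
        d2 = min(ranges[i - 1], ranges[i])
        beta = np.arctan2(d2 * np.sin(angle_increment),
                          d1 - d2 * np.cos(angle_increment))
        if beta < beta_min:  # depth discontinuity -> start a new cluster
            cluster += 1
        labels[i] = cluster
    return labels
```

On a full depth map, the same criterion is evaluated between horizontal and vertical neighbors, and connected pixels passing the test are merged via graph search.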
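The 3D-NDT matching step scores how well a LiDAR scan, after a candidate transform, agrees with the prior map by modeling the points in each map voxel as a Gaussian distribution. A simplified, illustrative scoring function (not the paper's implementation; the voxel size, covariance regularization, and minimum-points rule are assumed) might look like:

```python
import numpy as np

def ndt_score(scan, map_points, voxel=1.0):
    """Score an already-transformed scan against a voxelized Gaussian map.

    Higher score = better alignment. Each occupied map voxel stores the mean
    and inverse covariance of its points; each scan point contributes the
    Gaussian likelihood under its voxel's distribution.
    """
    # Build per-voxel Gaussian statistics from the prior map.
    keys = np.floor(map_points / voxel).astype(int)
    cells = {}
    for k, p in zip(map(tuple, keys), map_points):
        cells.setdefault(k, []).append(p)
    stats = {}
    for k, pts in cells.items():
        pts = np.array(pts)
        if len(pts) >= 5:  # need enough points for a stable covariance
            mu = pts.mean(axis=0)
            cov = np.cov(pts.T) + 1e-3 * np.eye(3)  # regularize
            stats[k] = (mu, np.linalg.inv(cov))
    # Sum the likelihoods of the scan points.
    score = 0.0
    for p in scan:
        k = tuple(np.floor(p / voxel).astype(int))
        if k in stats:
            mu, icov = stats[k]
            d = p - mu
            score += np.exp(-0.5 * d @ icov @ d)
    return score
```

A full NDT registration would wrap this score in an optimizer over the six pose parameters; the paper's variant additionally replaces map downsampling with the clustering step described above.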
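The EKF fusion step can be illustrated with a minimal planar example: IMU odometry (forward velocity and yaw rate) drives the prediction, and the LiDAR pose estimate serves as the measurement. The three-dimensional state, unicycle motion model, and noise covariances below are illustrative assumptions; the abstract does not specify the paper's filter at this level of detail.

```python
import numpy as np

class EKFFusion:
    """Minimal 2D pose EKF: IMU odometry predicts, LiDAR pose corrects."""

    def __init__(self):
        self.x = np.zeros(3)                  # state [x, y, yaw]
        self.P = np.eye(3) * 0.1              # state covariance
        self.Q = np.diag([0.02, 0.02, 0.01])  # process noise (assumed)
        self.R = np.diag([0.05, 0.05, 0.02])  # LiDAR pose noise (assumed)

    def predict(self, v, omega, dt):
        """Propagate the pose with a unicycle model driven by IMU odometry."""
        th = self.x[2]
        self.x += np.array([v * np.cos(th) * dt, v * np.sin(th) * dt, omega * dt])
        F = np.array([[1.0, 0.0, -v * np.sin(th) * dt],   # motion Jacobian
                      [0.0, 1.0,  v * np.cos(th) * dt],
                      [0.0, 0.0,  1.0]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, z):
        """Correct with a LiDAR pose measurement (direct observation, H = I)."""
        H = np.eye(3)
        y = z - self.x
        y[2] = (y[2] + np.pi) % (2 * np.pi) - np.pi  # wrap yaw residual
        S = H @ self.P @ H.T + self.R
        K = self.P @ H.T @ np.linalg.inv(S)          # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(3) - K @ H) @ self.P
        return self.x
```

The corrected estimate always lies between the IMU prediction and the LiDAR measurement, weighted by their respective covariances, which is how the fusion compensates for the weaknesses of either sensor alone.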
[Results and Discussions] A series of experiments in the Gazebo simulation environment of the robot operating system (ROS) and in real operation scenarios showed that the proposed fusion localization method has significant advantages. In the simulation environment, the average longitudinal and lateral localization errors of the proposed multi-sensor fusion method were 1.7 and 1.8 cm, respectively, while in the field experiments they were 3.3 and 3.3 cm, respectively, both significantly better than the traditional 3D-NDT localization algorithm. These findings show that the proposed method can achieve high-precision localization in complex agricultural environments and provide reliable localization support for the autonomous operation of agricultural robots.

[Conclusions] The proposed localization method based on the fusion of LiDAR data and IMU information offers a novel localization solution for the autonomous operation of agricultural robots in areas with limited GNSS reception. Through the comprehensive use of multi-sensor information and the adoption of advanced data processing and fusion algorithms, the localization accuracy of agricultural robots can be significantly improved, providing a new reference for the intelligence and automation of agricultural production.

LIU Yang; JI Jie; PAN Deng; ZHAO Lijun; LI Mingsheng

College of Engineering and Technology, Southwest University, Chongqing 400715, China; China Automotive Engineering Research Institute Co., Ltd., Chongqing 401122, China; School of Intelligent Manufacturing Engineering, Chongqing University of Arts and Sciences, Chongqing 402160, China

Computers and Automation

agricultural robots; LiDAR localization; point cloud matching; extended Kalman filter; sensor fusion

Smart Agriculture, 2024 (003)

Pages 94-106 (13 pages)

Chongqing Graduate Research and Innovation Project (CYS23207); Chongqing Science and Technology Bureau Key R&D Project in Agriculture and Rural Areas (cstc2021jscx-gksbX0003); Chongqing Municipal Education Commission Science and Technology Research Project (KJZD-M202201302); Chongqing Science and Technology Bureau Innovation and Development Joint Fund Project (CSTB2022NSCQ-LZX0024)

10.12133/j.smartag.SA202401009
