Detecting center line of chicken coop path using 3D LiDAR
Citation: HAN Yuxiao, LI Shuai, WANG Ning, AN Yajun, ZHANG Man, LI Han. Detecting center line of chicken coop path using 3D LiDAR[J]. Transactions of the Chinese Society of Agricultural Engineering, 2024, 40(9): 173-181.
Authors: HAN Yuxiao, LI Shuai, WANG Ning, AN Yajun, ZHANG Man, LI Han
Affiliations: Key Laboratory of Smart Agriculture System Integration Research, Ministry of Education, China Agricultural University, Beijing 100083, China; Key Laboratory of Agricultural Information Acquisition Technology, Ministry of Agriculture and Rural Affairs, China Agricultural University, Beijing 100083, China
Funding: China Agricultural University "Double First-Class" Special Research Construction Project; China Agricultural University 2115 Talent Development Support Program; National Natural Science Foundation of China (32171893)
Abstract: In caged chicken houses, weak illumination and narrow working aisles make it difficult for an inspection robot to detect the aisle centerline. This study therefore used 3D LiDAR to extract the centerline of the chicken coop path. First, a 3D LiDAR mounted on the robot collected information on the working aisle, and the acquired point cloud was preprocessed with pass-through filtering, ground-point filtering, voxel filtering, statistical filtering, and plane projection to obtain point cloud data on the XOY plane. The preprocessed points were then classified by a K-means clustering algorithm whose initial-point selection and clustering function were modified. Finally, an improved RANSAC algorithm processed the classified data to extract the aisle centerline. Experimental results show that the improved K-means clustering took 6.98 ms on average, 29.40 ms less than traditional K-means clustering, and its accuracy was 82.41 percentage points higher. The improved RANSAC algorithm achieved a centerline-extraction accuracy of 93.66%, an average error angle of 0.89°, and an average runtime of 3.94 ms; its average error angle was 0.14° larger than that of the LSM algorithm, while its average runtime was 6.15 ms shorter. The proposed method basically meets the requirements of real-time autonomous navigation in caged chicken house environments and provides technical support for LiDAR-based navigation of inspection robots in chicken coop working aisles.
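The preprocessing chain described above can be illustrated with a short sketch. The Python code below (NumPy and Open3D) is a minimal illustration, not the authors' implementation: the ROI bounds, voxel size, outlier-removal parameters, the assumed sensor frame (x forward, y lateral), and the simple height threshold standing in for the paper's ground-point filtering are all assumed values for demonstration.

import numpy as np
import open3d as o3d

def preprocess_scan(xyz):
    # xyz: N x 3 array from one LiDAR sweep (assumed frame: x forward, y left, z up).
    # Pass-through (direct) filter: keep a rough region of interest around the aisle.
    keep = (xyz[:, 0] > 0.0) & (xyz[:, 0] < 5.0) & (np.abs(xyz[:, 1]) < 2.0)
    xyz = xyz[keep]
    # Ground-point filter, here reduced to a simple height threshold.
    xyz = xyz[xyz[:, 2] > -0.3]

    pcd = o3d.geometry.PointCloud()
    pcd.points = o3d.utility.Vector3dVector(xyz)
    # Voxel filter: thin the cloud on a 5 cm grid.
    pcd = pcd.voxel_down_sample(voxel_size=0.05)
    # Statistical filter: drop sparse outliers far from their neighbours.
    pcd, _ = pcd.remove_statistical_outlier(nb_neighbors=20, std_ratio=2.0)

    pts = np.asarray(pcd.points).copy()
    pts[:, 2] = 0.0                # plane projection onto XOY
    return pts[:, :2]              # 2-D points for clustering and line fitting

The returned 2-D points correspond to the XOY-plane data that the paper feeds into the improved K-means clustering step.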

Keywords: navigation; robot; LiDAR; chicken coop; K-means clustering; random sample consensus (RANSAC); centerline fitting
Received: 2024-01-15
Revised: 2024-04-16

Detecting center line of chicken coop path using 3D LiDAR
HAN Yuxiao, LI Shuai, WANG Ning, AN Yajun, ZHANG Man, LI Han. Detecting center line of chicken coop path using 3D LiDAR[J]. Transactions of the Chinese Society of Agricultural Engineering, 2024, 40(9): 173-181.
Authors: HAN Yuxiao, LI Shuai, WANG Ning, AN Yajun, ZHANG Man, LI Han
Institution: Key Laboratory of Smart Agriculture System Integration Research, Ministry of Education, China Agricultural University, Beijing 100083, China; Key Laboratory of Agricultural Information Acquisition Technology, Ministry of Agriculture and Rural Affairs, China Agricultural University, Beijing 100083, China
Abstract: Inspection robots in caged chicken houses must accurately identify the centerline of the narrow working aisles under low lighting conditions. This study aimed to detect the centerline of the chicken coop path using 3D LiDAR. A 3D LiDAR mounted on the robot body collected path information within the operation channel, and several preprocessing steps were applied: direct (pass-through) filtering, ground-point filtering, voxel filtering, statistical filtering, and point cloud projection. The point cloud data within the region of interest (ROI) on the XOY plane was first roughly classified according to the magnitude of the vertical-axis coordinate. The center points of the left and right clusters were then selected from the two roughly classified point cloud sets as initial cluster centers, and the distance between a point and the transverse axis was used as the clustering function of K-means clustering to classify the left and right point clouds. Next, longitude-latitude scanning and a secondary edge-extraction method were used to extract the edges of the two clustered point clouds, and RANSAC was applied to compute the fitted line equation of each channel edge; the path centerline of the operation channel was then extracted from these two equations. An inspection robot developed for caged chicken houses served as the experimental platform, with a VLP-16 LiDAR as the perception sensor, and field verification was conducted in the D10 and D13 chicken houses of Deqingyuan Co., Ltd. (Beijing, China). The experimental results showed that the improved K-means clustering took an average of 6.98 ms, with a silhouette coefficient of 0.59, a Rand index of 1.00, a clustering success rate of 84.10%, and a clustering accuracy of 100%. Compared with traditional K-means clustering, the average time was reduced by 29.40 ms, while the silhouette coefficient, Rand index, and accuracy increased by 0.04, 0.63, and 82.41%, respectively; the success rate was slightly reduced, by 0.41%. The best performance was achieved when both the initial-point selection and the clustering function were improved, rather than either one alone. The improved RANSAC achieved a centerline-extraction accuracy of 93.66% and an average error angle of 0.89°, which was 0.14° higher than that of the LSM algorithm, while its average time of 3.94 ms was 6.15 ms shorter than that of the LSM. With the number of iterations set to 100, the improved RANSAC showed much higher accuracy than the original, and both the maximum and average absolute error angles were smaller. The improved method can therefore be expected to detect the centerline of chicken coop paths, effectively meeting the requirements of real-time autonomous navigation in caged chicken house environments. The findings provide technical support for the autonomous navigation of inspection robots in the operation channels of chicken coops.
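As a rough illustration of the later steps, the sketch below (NumPy only) seeds two cluster centers from a left/right split of the lateral coordinate, refines them with a 1-D K-means-style loop, fits a line to each wall with a basic RANSAC, and averages the two edge lines into a centerline. It is a simplified stand-in for the paper's improved K-means and improved RANSAC (it omits the longitude-latitude scanning and secondary edge extraction), and every function name, threshold, and iteration count is an assumption for demonstration only.

import numpy as np

def cluster_left_right(points_xy, iters=10):
    # Simplified take on the improved seeding: the mean lateral coordinate of the
    # points on each side of the robot gives the two initial cluster centers, then
    # a 1-D K-means refinement assigns points by distance along the lateral axis.
    y = points_xy[:, 1]
    centers = np.array([y[y > 0].mean(), y[y < 0].mean()])
    labels = np.zeros(len(y), dtype=int)
    for _ in range(iters):
        labels = np.argmin(np.abs(y[:, None] - centers[None, :]), axis=1)
        centers = np.array([y[labels == k].mean() for k in (0, 1)])
    return labels

def ransac_line(points_xy, n_iter=100, tol=0.02, seed=0):
    # Basic RANSAC fit of y = a*x + b; keeps the model with the most inliers.
    rng = np.random.default_rng(seed)
    best_ab, best_inliers = (0.0, 0.0), -1
    for _ in range(n_iter):
        i, j = rng.choice(len(points_xy), size=2, replace=False)
        (x1, y1), (x2, y2) = points_xy[i], points_xy[j]
        if abs(x2 - x1) < 1e-6:
            continue
        a = (y2 - y1) / (x2 - x1)
        b = y1 - a * x1
        residuals = np.abs(points_xy[:, 1] - (a * points_xy[:, 0] + b))
        inliers = int((residuals < tol).sum())
        if inliers > best_inliers:
            best_ab, best_inliers = (a, b), inliers
    return best_ab

def aisle_centerline(points_xy):
    # Fit one line per wall and average the coefficients; the mean line
    # approximates the aisle centerline when the two walls are nearly parallel.
    labels = cluster_left_right(points_xy)
    a1, b1 = ransac_line(points_xy[labels == 0])
    a2, b2 = ransac_line(points_xy[labels == 1])
    return (a1 + a2) / 2.0, (b1 + b2) / 2.0

Under these assumptions, the robot's heading error can be read from the arctangent of the returned slope and its lateral offset from the intercept, which is the information a navigation controller would consume.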
Keywords: navigation; robot; LiDAR; chicken coop; K-means; RANSAC; center line fitting