Paper: 2020, Vol. 38, Issue (6): 1339-1344
Cite this article:
ZHANG Zhao, ZHANG Yi, HUANG Rui. A Laser Localization Algorithm with Key-Frames and Reliable Plane Representation[J]. Journal of Northwestern Polytechnical University
ZHANG Zhao, ZHANG Yi, HUANG Rui. A New SLAM Algorithm with Key-Frame Estimation and Local Map Upgrade Scheme[J]. Journal of Northwestern Polytechnical University

A Laser Localization Algorithm with Key-Frames and Reliable Plane Representation
ZHANG Zhao, ZHANG Yi, HUANG Rui
School of Computer Science, Sichuan University, Chengdu 610000, Sichuan, China
Abstract:
Laser simultaneous localization and mapping (SLAM) is one of the key techniques for map building and real-time navigation, and an indispensable part of autonomous driving. To address the unreliable feature matching and large registration errors of current laser SLAM algorithms, this paper proposes a local map improvement and key-frame estimation scheme based on plane fitting. It consists of three parts: ① local map matching rules and a plane representation; ② a local map updating scheme; ③ a key-frame selection mechanism. The scheme addresses the lack of key-frame estimation in current laser localization methods as well as the multi-directionality of planes in the local map. Experiments show that the local map retains more diverse LiDAR frames, and that plane representation and matching become more reliable.
Key words:    SLAM    point cloud    localization algorithm    key-frame    autonomous driving
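To make the plane representation concrete, the sketch below (an illustrative assumption, not the authors' implementation; the function name fit_plane and the flatness threshold are invented here) fits a plane to a neighborhood of LiDAR points by eigen-decomposition of their covariance matrix and flags the fit as reliable only when the out-of-plane spread is small:

import numpy as np

def fit_plane(points, flatness_thresh=0.05):
    # points: (N, 3) array of LiDAR points assumed to lie near one plane.
    centroid = points.mean(axis=0)
    centered = points - centroid
    cov = centered.T @ centered / len(points)
    # Eigenvalues come back in ascending order; the eigenvector of the
    # smallest eigenvalue is the plane normal (direction of least spread).
    eigvals, eigvecs = np.linalg.eigh(cov)
    normal = eigvecs[:, 0]
    d = -normal @ centroid                     # plane: normal . x + d = 0
    # Reliable only if out-of-plane spread is small versus in-plane spread;
    # the 0.05 ratio is an illustrative choice, not the paper's value.
    is_reliable = eigvals[0] < flatness_thresh * eigvals[1]
    return normal, d, is_reliable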
A New SLAM Algorithm with Key-Frame Estimation and Local Map Upgrade Scheme
ZHANG Zhao, ZHANG Yi, HUANG Rui
School of Computer Science, Sichuan University, Chengdu 610000, China
Abstract:
Simultaneous localization and mapping (SLAM) is one of the most important tools for map building and autonomous navigation, and an indispensable part of driverless vehicles. Current laser SLAM algorithms suffer from unreliable feature matching and large registration errors. To reduce these deficiencies, we propose a key-frame estimation and local map upgrade scheme, which includes the following three parts: 1) a local map matching strategy; 2) a local map updating scheme; 3) a key-frame selection scheme. Experimental results show that our scheme improves the performance of current localization methods.
Key words:    laser SLAM    point cloud    localization algorithm    key-frame    autonomous driving
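The key-frame selection scheme is only summarized at a high level in the abstracts. A common rule of this kind, shown below as a hedged sketch rather than the paper's actual criterion (the thresholds and function name are assumptions), promotes the current scan to a key-frame once the estimated pose has translated or rotated beyond fixed thresholds relative to the last key-frame:

import numpy as np

def is_new_keyframe(pose, last_kf_pose, trans_thresh=1.0, rot_thresh_deg=10.0):
    # pose, last_kf_pose: 4x4 homogeneous transforms of the current scan
    # and of the last key-frame; the 1 m / 10 deg thresholds are illustrative.
    delta = np.linalg.inv(last_kf_pose) @ pose
    trans = np.linalg.norm(delta[:3, 3])
    # Recover the rotation angle from the trace of the relative rotation.
    cos_angle = np.clip((np.trace(delta[:3, :3]) - 1.0) / 2.0, -1.0, 1.0)
    rot_deg = np.degrees(np.arccos(cos_angle))
    return trans > trans_thresh or rot_deg > rot_thresh_deg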
Received: 2020-04-23     Revised:
DOI: 10.1051/jnwpu/20203861339
Corresponding author: ZHANG Yi (1981-), associate professor at Sichuan University; research interests: computer vision and machine intelligence. E-mail: yi.zhang@scu.edu.cn
About the first author: ZHANG Zhao (1998-), master's student at Sichuan University; research interest: computer vision.
