2019, Vol. 37, Issue (1): 152-159
Cite this article:
WANG Sen, WANG Runxiao, ZUO Xinxin, YU Weiwei. Real-Time Artifact Compensation for Depth Images of Multi-Frequency ToF[J]. Journal of Northwestern Polytechnical University, 2019, 37(1): 152-159

Real-Time Artifact Compensation for Depth Images of Multi-Frequency ToF
WANG Sen1, WANG Runxiao1, ZUO Xinxin2, YU Weiwei1
1. School of Mechanical Engineering, Northwestern Polytechnical University, Xi'an 710072, China;
2. School of Computer Science and Technology, Northwestern Polytechnical University, Xi'an 710072, China
Abstract:
In recent years, Time-of-Flight (ToF) sensors have had a significant impact on research and industrial applications because they capture depth easily. For dynamic scenes, however, the sensor's working principle produces significant artifacts, both from object motion during exposure and from multi-frequency phase fusion. An efficient method that combines optical-flow motion compensation with kernel-density-estimation-based multi-frequency phase unwrapping is therefore proposed. First, the raw multi-frequency phase images are captured and the relative optical flow between frequencies is computed, yielding a motion-compensated phase image for each frequency. Second, multiple depth hypotheses are generated from the wrapped phase images and ranked with a spatial kernel density estimate to obtain the fused phase image. Finally, an accurate depth map is generated from the fused phase. The algorithm is validated on raw multi-frequency phase images captured with a Kinect V2, with the pixel-wise stages parallelized on a GPU. The results show that the method compensates in real time for phase-fusion artifacts caused by object motion parallel to the optical axis, effectively improving the accuracy and stability of the depth images.
Key words:    ToF    multi-frequency    optical flow    kernel density estimation    Kinect V2   
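The hypothesis-generation and ranking step the abstract summarizes can be sketched as follows. This is an illustrative reconstruction, not the authors' implementation: the two modulation frequencies (16 MHz and 80 MHz, values commonly reported for Kinect V2), the kernel bandwidth `sigma`, the depth range `max_depth`, and all function names are assumptions, and for brevity the KDE here pools the hypotheses of a single pixel rather than a spatial neighborhood as in the paper.

```python
import numpy as np

C = 2.998e8  # speed of light [m/s]

def depth_hypotheses(phase, freq, max_depth=10.0):
    """All candidate depths for one wrapped phase at one modulation frequency.

    A wrapped phase phi fixes the depth only up to the ambiguity distance
    c / (2 f):  d_n = c * (phi + 2*pi*n) / (4*pi*f),  n = 0, 1, 2, ...
    """
    ambiguity = C / (2.0 * freq)
    n = np.arange(int(np.ceil(max_depth / ambiguity)) + 1)
    return C * (phase + 2.0 * np.pi * n) / (4.0 * np.pi * freq)

def unwrap_two_freq(phase_lo, f_lo, phase_hi, f_hi, sigma=0.05, max_depth=10.0):
    """Rank the pooled depth hypotheses with a Gaussian kernel density
    estimate; the true depth collects votes from both frequencies and
    therefore scores highest."""
    cand = np.concatenate([depth_hypotheses(phase_lo, f_lo, max_depth),
                           depth_hypotheses(phase_hi, f_hi, max_depth)])
    cand = cand[cand <= max_depth]
    # KDE score of every candidate against every other candidate.
    diffs = cand[:, None] - cand[None, :]
    scores = np.exp(-0.5 * (diffs / sigma) ** 2).sum(axis=1)
    return cand[np.argmax(scores)]
```

For example, a true depth of 3 m wraps at 80 MHz (ambiguity ≈ 1.87 m) but not at 16 MHz (≈ 9.37 m); the one hypothesis shared by both frequencies wins the KDE ranking, which is what makes the ranking robust when a single frequency's phase is ambiguous.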
Received: 2018-03-06     Revised:
DOI: 10.1051/jnwpu/20193710152
Foundation item: Supported by the National Natural Science Foundation of China (51475373, 61603302), the Shaanxi Science and Technology Co-ordination and Innovation Project (2016KTZDGY06-01), the Natural Science Foundation of Shaanxi Province (2016JQ6009), and the 111 Project (B13044)
About the author: WANG Sen (1988-), Ph.D. candidate at Northwestern Polytechnical University; his research focuses on robot vision.
