Paper: 2019, Vol. 37, Issue 2: 323-329
Cite this article:
ZHANG Junchang, ZHANG Deng, WAN Jinjin. A New Video Tracking Algorithm Based on Multi-Complementary Features Fusion[J]. Journal of Northwestern Polytechnical University, 2019, 37(2): 323-329

A New Video Tracking Algorithm Based on Multi-Complementary Features Fusion
ZHANG Junchang1,2, ZHANG Deng1, WAN Jinjin2
1. School of Electronics and Information, Northwestern Polytechnical University, Xi'an 710072, China;
2. Science and Technology on Electro-Optic Control Laboratory, Luoyang 471000, China
Abstract:
To make full use of the diversity of sample information during tracking and to improve the generalization ability of the tracker, an objectness model prediction based on contour features is fused into the Staple algorithm. To address the uncertainty introduced by the simple linear weighting of the different prediction responses in that algorithm, a new response-map fusion method with adaptive weight coefficients is proposed, which effectively improves the reliability of the tracking algorithm. Theoretical analysis and experimental simulation show that the proposed algorithm achieves considerably higher accuracy and robustness than the classical Staple algorithm while maintaining high real-time performance.
Key words:    correlation filters    object tracking    contour features    adaptive weights    feature fusion
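The adaptive-weight fusion of response maps described in the abstract can be sketched as follows. The paper's exact weighting rule is not given in this chunk; the sketch below assumes a common confidence measure for correlation-filter responses, the peak-to-sidelobe ratio (PSR), and weights each response map in proportion to its confidence rather than by fixed linear coefficients.

```python
import numpy as np

def psr(response, margin=5):
    """Peak-to-sidelobe ratio: a standard confidence measure for
    correlation-filter response maps (an assumption here, not
    necessarily the weighting rule used in the paper)."""
    peak = response.max()
    py, px = np.unravel_index(response.argmax(), response.shape)
    sidelobe = np.ones_like(response, dtype=bool)
    # Exclude a small window around the peak from the sidelobe region.
    sidelobe[max(0, py - margin):py + margin + 1,
             max(0, px - margin):px + margin + 1] = False
    side = response[sidelobe]
    return (peak - side.mean()) / (side.std() + 1e-8)

def fuse_responses(responses):
    """Fuse several prediction response maps (e.g. correlation-filter,
    color-histogram, and objectness responses) with adaptive weights
    proportional to each map's confidence."""
    confidences = np.array([psr(r) for r in responses])
    weights = confidences / confidences.sum()
    return sum(w * r for w, r in zip(weights, responses))
```

With this scheme, a sharp unambiguous response dominates the fused map, while a diffuse low-confidence response contributes little, instead of both entering with fixed weights as in the original Staple fusion.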
Received: 2018-04-18
DOI: 10.1051/jnwpu/20193720323
Foundation item: Supported by the Aeronautical Science Foundation of the Science and Technology on Electro-Optic Control Laboratory (2016515303)
About the author: ZHANG Junchang (1969-), professor at Northwestern Polytechnical University; his research interest is image and signal processing.
