Paper: 2015, Vol. 33, Issue 4: 639-643
Cite this article:
Shen Sheng, Yang Honghui, Wang Yun, Pan Yue, Tang Jiansheng. Joint Mutual Information Feature Selection for Underwater Acoustic Targets[J]. Journal of Northwestern Polytechnical University, 2015, 33(4): 639-643

Joint Mutual Information Feature Selection for Underwater Acoustic Targets
Shen Sheng1, Yang Honghui1,2, Wang Yun1, Pan Yue2, Tang Jiansheng2
1. College of Marine Engineering, Northwestern Polytechnical University, Xi'an 710072, China;
2. Systems Engineering Research Institute, Beijing 100036, China
Abstract:
Exhaustive feature selection can find the optimal feature subset for an underwater acoustic target, but its computational cost makes it impractical in engineering use. To balance computational cost against the search for an optimal subset, we propose a new joint mutual information feature selection (JMIFS) algorithm. Its core idea is as follows: using a sequential forward search, the algorithm first selects the feature that shares the largest mutual information with the class label, and then repeatedly adds the feature that contributes the most classification information complementary to the features already selected, thereby quickly removing noise and redundant features and improving recognition performance. Experiments on measured data from four classes of underwater acoustic targets show that, with the recognition accuracy of a support vector machine (SVM) classifier declining by only 1%, JMIFS removes about 87% of the features and reduces classification time by 58%. Compared with a feature selection method combining an SVM with a genetic algorithm, JMIFS selects fewer features, and the selected feature subsets generalize better.
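The selection rule summarized above can be sketched in a few lines. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: it assumes discretized features and uses the common joint-mutual-information score, the sum over already-selected features f_s of I((f_j, f_s); y), as the "complementary information" criterion.

```python
import numpy as np

def mutual_info(x, y):
    """I(X; Y) in nats for two discrete 1-D arrays, from empirical counts."""
    xs, xi = np.unique(x, return_inverse=True)
    ys, yi = np.unique(y, return_inverse=True)
    joint = np.zeros((len(xs), len(ys)))
    np.add.at(joint, (xi, yi), 1.0)       # joint count table
    joint /= joint.sum()                  # joint distribution P(X, Y)
    px = joint.sum(axis=1, keepdims=True) # marginal P(X)
    py = joint.sum(axis=0, keepdims=True) # marginal P(Y)
    nz = joint > 0
    return float((joint[nz] * np.log(joint[nz] / (px @ py)[nz])).sum())

def pair_codes(a, b):
    """Encode the joint variable (a, b) as a single discrete array."""
    _, codes = np.unique(np.stack([a, b], axis=1), axis=0, return_inverse=True)
    return codes

def jmifs(X, y, k):
    """Greedy sequential-forward selection with a joint-MI score.

    First pick the feature with the largest I(f; y); then repeatedly add
    the candidate f_j maximizing sum_{f_s selected} I((f_j, f_s); y),
    which rewards complementary information and penalizes redundancy.
    """
    n_feat = X.shape[1]
    relevance = [mutual_info(X[:, j], y) for j in range(n_feat)]
    selected = [int(np.argmax(relevance))]
    while len(selected) < k:
        scores = {j: sum(mutual_info(pair_codes(X[:, j], X[:, s]), y)
                         for s in selected)
                  for j in range(n_feat) if j not in selected}
        selected.append(max(scores, key=scores.get))
    return selected
```

On a toy four-class problem where one feature is a redundant copy of an already-selected feature and another carries complementary information, the joint-MI score prefers the complementary feature, which is the behavior the abstract describes.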
Key words:    feature selection; underwater target recognition; joint mutual information; conditional mutual information; support vector machines; genetic algorithms; underwater acoustics; joint mutual information feature selection (JMIFS)
Received: 2015-03-17
Funding: Open Fund of the Key Laboratory of Underwater Acoustic Countermeasure Technology
About the author: Shen Sheng (b. 1990), Ph.D. candidate at Northwestern Polytechnical University; research interests: machine learning and underwater target recognition.
