Article: 2020, Vol. 38, Issue 3: 471-477
Cite this article:
ZHANG Muhang, SHEN Xiaohong, HE Lei, WANG Haiyan. Feature Selection on Maximum Information Coefficient for Underwater Target Recognition[J]. Journal of Northwestern Polytechnical University, 2020, 38(3): 471-477

Feature Selection on Maximum Information Coefficient for Underwater Target Recognition
ZHANG Muhang1, SHEN Xiaohong1, HE Lei1, WANG Haiyan1,2
1. School of Marine Science and Technology, Northwestern Polytechnical University, Xi'an 710072, China;
2. School of Electronic Information and Artificial Intelligence, Shaanxi University of Science and Technology, Xi'an 710021, China
Abstract:
Feature selection is an essential step in recognition tasks because the irrelevant and redundant features contained in an unselected feature set degrade both recognition performance and efficiency. However, when identifying underwater targets from their radiated noise, the diversity of targets and the complexity of underwater acoustic channels introduce complex relationships, beyond linear correlation, among the extracted acoustic features. To address this problem, this paper employs the normalized maximum information coefficient (NMIC) to measure the relevance between features and categories as well as the redundancy among features, and further proposes an NMIC-based feature selection method (NMIC-FS). Then, on a real-world dataset, the average classification accuracy estimated with models such as random forest and support vector machine is used to evaluate the performance of NMIC-FS. Analysis results on the underwater target data show that the feature subset obtained by NMIC-FS achieves higher classification accuracy in a shorter classification time than the full, unselected feature set. Compared with correlation-based feature selection, the Laplacian score, and lasso methods, NMIC-FS improves classification accuracy more rapidly during the selection process and requires fewer acoustic features to reach classification accuracy comparable to that of the full feature set.
Key words:    feature selection    ship-radiated noise    maximum information coefficient
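The abstract describes NMIC-FS as scoring candidate features by their NMIC relevance to the class label while accounting for NMIC redundancy among features, and evaluating the resulting subsets by the average classification accuracy of random forest and support vector machine models. The Python sketch below illustrates that general scheme under stated assumptions: the MIC estimator from the third-party minepy package stands in for the paper's normalized MIC, the greedy relevance-minus-redundancy rule is an illustrative stand-in rather than the authors' exact criterion, and the helper names (mic, nmic_forward_selection, average_accuracy) are hypothetical.

```python
# Illustrative sketch of an NMIC-style feature-selection loop; NOT the paper's exact
# NMIC-FS algorithm. MIC values from the third-party `minepy` package stand in for the
# normalized maximum information coefficient described in the abstract.
import numpy as np
from minepy import MINE
from sklearn.ensemble import RandomForestClassifier
from sklearn.svm import SVC
from sklearn.model_selection import cross_val_score


def mic(x, y):
    """Maximum information coefficient between two 1-D arrays (values lie in [0, 1])."""
    mine = MINE(alpha=0.6, c=15)  # minepy's default estimator parameters
    mine.compute_score(x, y)
    return mine.mic()


def nmic_forward_selection(X, y, n_selected):
    """Greedy selection: at each step pick the feature with the largest
    class relevance minus mean redundancy with the features already chosen
    (an assumed criterion in the spirit of the abstract)."""
    n_features = X.shape[1]
    relevance = np.array([mic(X[:, j], y) for j in range(n_features)])
    selected, remaining = [], list(range(n_features))
    while remaining and len(selected) < n_selected:
        scores = []
        for j in remaining:
            redundancy = (np.mean([mic(X[:, j], X[:, k]) for k in selected])
                          if selected else 0.0)
            scores.append(relevance[j] - redundancy)
        best = remaining[int(np.argmax(scores))]
        selected.append(best)
        remaining.remove(best)
    return selected


def average_accuracy(X, y, feature_idx):
    """Average cross-validated accuracy of a random forest and an SVM on a feature
    subset, mirroring the evaluation protocol outlined in the abstract
    (classifier settings here are illustrative, not the paper's)."""
    Xs = X[:, feature_idx]
    accs = [cross_val_score(clf, Xs, y, cv=5, scoring="accuracy").mean()
            for clf in (RandomForestClassifier(n_estimators=100), SVC(kernel="rbf"))]
    return float(np.mean(accs))
```

In use, one would sweep the subset size and re-estimate the average accuracy at each step, which mirrors how the abstract compares NMIC-FS against correlation-based feature selection, the Laplacian score, and lasso as features are added.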
Received: 2019-07-01     Revised:
DOI: 10.1051/jnwpu/20203830471
Funding: supported by the National Key R&D Program of China (2016YFC1400204)
Corresponding author:     Email:
About the author: ZHANG Muhang (b. 1985), female, Ph.D. candidate at Northwestern Polytechnical University; her research focuses on feature selection and target recognition.

References:
[1] YANG Honghui, DAI Jian, SUN Jincai, et al. A New Adaptive Immune Feature Selection Algorithm for Underwater Acoustic Target Classification[J]. Journal of Xi'an Jiaotong University, 2011, 45(12):28-33 (in Chinese)
[2] ALELYANI S, TANG J, LIU H. Feature Selection for Clustering:a Review[M]. New York:CRC Press, 2014
[3] YU L, LIU H. Efficient Feature Selection via Analysis of Relevance and Redundancy[J]. Journal of Machine Learning Research, 2004, 5:1205-1224
[4] TANG J, ALELYANI S, LIU H. Feature Selection for Classification:a Review[M]. New York:CRC Press, 2014
[5] ABDI H, WILLIAMS L J. Principal Component Analysis[J]. WIREs Computational Statistics, 2010, 2(4):433-459
[6] GUYON I, ELISSEEFF A. An Introduction to Variable and Feature Selection[J]. Journal of Machine Learning Research, 2003, 3:1157-1182
[7] YANG Honghui, SUN Jincai, YUAN Jun. A New Method for Feature Selection for Underwater Acoustic Targets[J]. Journal of Northwestern Polytechnical University, 2005, 23(4):512-515 (in Chinese)
[8] TIBSHIRANI R. Regression Shrinkage and Selection via the Lasso:a Retrospective[J]. Journal of the Royal Statistical Society:Series B(Statistical Methodology), 2011, 73(3):273-282
[9] ZOU H. The Adaptive Lasso and Its Oracle Properties[J]. Journal of the American Statistical Association, 2006, 101(476):1418-1429
[10] ZOU H, HASTIE T. Regularization and Variable Selection via the Elastic Net[J]. Journal of the Royal Statistical Society:Series B(Statistical Methodology), 2005, 67(2):301-320
[11] CAI J, LUO J, WANG S, et al. Feature Selection in Machine Learning:a New Perspective[J]. Neurocomputing, 2018, 300:70-79
[12] GU Q, LI Z, HAN J. Generalized Fisher Score for Feature Selection[C]//Proceedings of the Twenty-Seventh Conference on Uncertainty in Artificial Intelligence, Arlington, Virginia, 2011:266-273
[13] ZHOU M. A Hybrid Feature Selection Method Based on Fisher Score and Genetic Algorithm[J]. Journal of Mathematical Sciences:Advances and Applications, 2016, 37(1):51-78
[14] HE X, CAI D, NIYOGI P. Laplacian Score for Feature Selection[C]//Proceedings of the 18th International Conference on Neural Information Processing Systems, Cambridge, MA, 2005:507-514
[15] HUANG R, JIANG W, SUN G. Manifold-Based Constraint Laplacian Score for Multi-Label Feature Selection[J]. Pattern Recognition Letters, 2018, 112:346-352
[16] HALL M A. Correlation-Based Feature Selection for Discrete and Numeric Class Machine Learning[C]//Proceedings of the Seventeenth International Conference on Machine Learning, San Francisco, CA, 2000:359-366
[17] MURSALIN M, ZHANG Y, CHEN Y, et al. Automated Epileptic Seizure Detection Using Improved Correlation-Based Feature Selection with Random Forest Classifier[J]. Neurocomputing, 2017, 241:204-214
[18] ZHAO Z, WANG L, LIU H, et al. On Similarity Preserving Feature Selection[J]. IEEE Trans on Knowledge and Data Engineering, 2013, 25(3):619-632
[19] HU L, GAO W, ZHAO K, et al. Feature Selection Considering Two Types of Feature Relevancy and Feature Interdependency[J]. Expert Systems with Applications, 2018, 93:423-434
[20] SANTOS-DOMÍNGUEZ D, TORRES-GUIJARRO S, CARDENAL-LÓPEZ A, et al. ShipsEar:an Underwater Vessel Noise Database[J]. Applied Acoustics, 2016, 113:64-69
[21] RESHEF D N, RESHEF Y A, FINUCANE H K, et al. Detecting Novel Associations in Large Data Sets[J]. Science, 2011, 334(6062):1518-1524
[22] RESHEF Y A, RESHEF D N, FINUCANE H K. Measuring Dependence Powerfully and Equitably[J]. Journal of Machine Learning Research, 2016, 17(211):1-63