Research on Communication Individual Identification Method Based on PCA-NCA and CV-SVM

In recent years, high-dimensional data have become common in science and industry, for example in computer vision, pattern recognition, bioinformatics, and aerospace. Feature dimensionality reduction and selection map data from a high-dimensional space to a low-dimensional one in order to reveal its underlying structure. In wireless communication, the start-up transient signals of wireless devices yield high-dimensional, highly redundant features; this paper converts these high-dimensional signal features into low-dimensional features better suited to classification through a feature dimensionality reduction and selection method based on PCA-NCA. In addition, the parameters of the SVM classifier are optimized, and the resulting CV-SVM classifier improves classification performance. Simulation experiments are carried out on measured start-up signals from ten identical walkie-talkies. When the SNR is greater than 0 dB, the recognition accuracy of the PCA-NCA algorithm is 10% higher than that of the PCA algorithm alone; when the SNR is greater than 10 dB.
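To make the processing chain concrete, the following is a minimal sketch of a PCA-then-NCA dimensionality reduction step feeding a cross-validated SVM, built with scikit-learn. The feature matrix, labels, reduced dimensions, and parameter grid are all illustrative placeholders, not the paper's actual data or settings.

```python
# Minimal sketch of a PCA -> NCA -> cross-validated SVM pipeline.
# X stands in for high-dimensional start-up transient features (one row
# per signal) and y for the device labels; both are synthetic here.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neighbors import NeighborhoodComponentsAnalysis
from sklearn.svm import SVC
from sklearn.pipeline import Pipeline
from sklearn.model_selection import GridSearchCV

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 256))       # placeholder high-dimensional features
y = rng.integers(0, 10, size=500)     # placeholder labels for 10 devices

pipe = Pipeline([
    ("pca", PCA(n_components=40)),    # first cut down feature redundancy
    ("nca", NeighborhoodComponentsAnalysis(n_components=10, random_state=0)),
    ("svm", SVC(kernel="rbf")),
])

# Cross-validated grid search over the SVM penalty and kernel width,
# standing in for the paper's CV-SVM parameter optimization.
param_grid = {"svm__C": [1, 10, 100], "svm__gamma": ["scale", 0.01, 0.001]}
search = GridSearchCV(pipe, param_grid, cv=5)
search.fit(X, y)
print(search.best_params_, search.best_score_)
```

The ordering matters: PCA removes gross redundancy cheaply, after which NCA learns a class-aware projection on the smaller feature set, and the grid search selects SVM hyperparameters by cross-validation on the projected features.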
