A Feature Selection Method Based on Information Theory and Genetic Algorithm

Document Type: Original Article


1 M.Sc., Department of Computer Engineering, Qom University of Technology, Qom, Iran.

2 Professor, Department of Industrial Engineering, University of Qom, Qom, Iran.

3 Ph.D. Student, Faculty of Industrial Engineering, Iran University of Science and Technology, Tehran, Iran.


Purpose: When dealing with high-dimensional datasets, dimensionality reduction is a crucial preprocessing step for achieving high accuracy, efficiency, and scalability in classification problems. This research aims to introduce a feature selection method for high-dimensional datasets that combines dimensionality reduction with a genetic algorithm.
Method: In this study, an innovative algorithm has been developed that determines the mutual information between features and the target class using a new criterion. New features are generated by combining or transforming the original features; in this way, the high-dimensional space is transformed into a new space with fewer dimensions. In addition to the new mutual-information criterion, a genetic algorithm has been employed to increase the speed of the proposed method.
Findings: The performance of the method has been evaluated on datasets of varying dimensionality, with the number of features ranging from 13 to 60. The proposed method has been compared with similar methods in terms of classification accuracy, and the results have been promising.
Conclusion: The proposed method has been evaluated against the MRMR, DISR, JMI, and NJMIM methods on various datasets, yielding average accuracies of 65.32%, 74.51%, 70.88%, and 58.2%, which indicates the efficiency of the proposed method. According to the results obtained, the proposed method outperformed DISR, JMI, NJMIM, and MRMR on average, except on the sonar dataset, where those methods yielded better results than the proposed method.
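As a rough illustration of the general scheme described above (scoring features by their mutual information with the class and searching feature subsets with a genetic algorithm), the sketch below combines a simple histogram-based MI estimate with a minimal GA over feature-subset bitmasks. All names, parameters, and the fitness function are illustrative assumptions, not the authors' actual criterion or implementation.

```python
import numpy as np

def mutual_information(x, y, bins=8):
    """Histogram estimate of I(X; Y) for one continuous feature x and a
    discrete class vector y (an illustrative stand-in, not the paper's criterion)."""
    # Discretize x into `bins` equal-width bins.
    xd = np.digitize(x, np.histogram(x, bins=bins)[1][1:-1])
    classes = {c: i for i, c in enumerate(np.unique(y))}
    joint = np.zeros((bins, len(classes)))
    for xi, yi in zip(xd, y):
        joint[xi, classes[yi]] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())

def ga_feature_select(X, y, n_select, pop_size=30, generations=40, seed=0):
    """Tiny genetic algorithm over boolean feature masks.
    Fitness = total MI of selected features, minus a subset-size penalty."""
    rng = np.random.default_rng(seed)
    n = X.shape[1]
    mi = np.array([mutual_information(X[:, j], y) for j in range(n)])

    def fitness(mask):
        # Penalize subsets of the wrong size so the GA keeps n_select features.
        return mi[mask].sum() - abs(int(mask.sum()) - n_select)

    pop = rng.random((pop_size, n)) < (n_select / n)   # random initial masks
    for _ in range(generations):
        scores = np.array([fitness(m) for m in pop])
        # Truncation selection: keep the best half verbatim (elitism).
        parents = pop[np.argsort(scores)[::-1][: pop_size // 2]]
        children = []
        for _ in range(pop_size - len(parents)):
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, n)                   # single-point crossover
            child = np.concatenate([a[:cut], b[cut:]])
            child ^= rng.random(n) < 1.0 / n           # bit-flip mutation
            children.append(child)
        pop = np.vstack([parents, np.array(children)])
    return np.flatnonzero(max(pop, key=fitness))
```

Because the best parents are carried over unchanged each generation, the search improves monotonically; the size penalty in the fitness is one simple way to keep the subset near the requested cardinality.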


References

Amiri, F. et al. (2011). Mutual information-based feature selection for intrusion detection systems. Journal of Network and Computer Applications, 34(4): 1184-1199.
Battiti, R. (1994). Using mutual information for selecting features in supervised neural net learning. IEEE Transactions on Neural Networks, 5(4): 537-550.
Blum, A.L. & Langley, P. (1997). Selection of relevant features and examples in machine learning. Artificial intelligence, 97(1): 245–271.
Boukharouba, A. & Bennia, A. (2017). Novel feature extraction technique for the recognition of handwritten digits. Applied Computing and Informatics, 13(1): 19-26.
Cellucci, C.J., Albano, A.M. & Rapp, P.E. (2005). Statistical validation of mutual information calculations: comparisons of alternative numerical algorithms. Physical Review E, 71: 1-14.
Dash, M. & Liu, H. (1997). Feature selection for classification. Intelligent Data Analysis, 1: 131-156.
Fleuret, F. (2004). Fast binary feature selection with conditional mutual information. Journal of Machine Learning Research, 5: 1531-1555.
Gao, W., Hu, L. & Zhang, P. (2018). Class-specific mutual information variation for feature selection. Pattern Recognition, 79: 328-339.
Guyon, I. & Elisseeff, A. (2003). An introduction to variable and feature selection. The Journal of Machine Learning Research, 3: 1157-1182.
Hall, M.A. (1999). Correlation-based Feature Selection for Machine Learning. Ph.D. Thesis, The University of Waikato, Hamilton, New Zealand.
Hicks, Y., Setchi, R. & Bennasar, M. (2015). Feature selection using Joint Mutual Information Maximisation. Expert Systems with Applications, 42(22): 8520-8532.
Hoque, N., Bhattacharyya, D.K. & Kalita, J.K. (2014). MIFS-ND: A mutual information-based feature selection method. Expert Systems with Applications, 41(14): 6371-6385.
Kira, K. & Rendell, L.A. (1992). The feature selection problem: Traditional methods and a new algorithm. In: Proceedings of Ninth National Conference on Artificial Intelligence: 129–134.
Kramer, O. (2013). Dimensionality Reduction with Unsupervised Nearest Neighbors. Intelligent Systems Reference Library 51: 33-52. Springer-Verlag Berlin Heidelberg.
Kwak, N. & Choi, C.-H. (2002). Input feature selection for classification problems. IEEE Transactions on Neural Networks, 13(1): 143–159.
Lewis, D.D. (1992). Feature selection and feature extraction for text categorization. In: Proceedings of speech and natural language workshop, Morgan Kaufmann: 212–217.
Peng, H., Long, F. & Ding, C. (2005). Feature selection based on mutual information criteria of max dependency, max-relevance, and min-redundancy. IEEE Transactions on Pattern Analysis and Machine Intelligence, 27(8): 1226–1238.
Shannon, C.E. (1948). A Mathematical Theory of Communication. Bell System Technical Journal, 27(3): 379-423.
Vergara, J.R. & Estevez, P.A. (2014). A review of feature selection methods based on mutual information. Neural Computing and Applications, 24: 175-186.
Yang, H.H. & Moody, J. (2000). Data visualization and feature selection: New algorithms for nongaussian data. Advances in Neural Information Processing Systems, 12: 687-693.