
Robustness and Predictive Performance of Homogeneous Ensemble Feature Selection in Text Classification

Author(s): Poornima Mehta (Jaypee Institute of Information Technology, Noida, India) and Satish Chandra (Jaypee Institute of Information Technology, Noida, India)
Copyright: 2021
Volume: 11
Issue: 1
Pages: 15
Source title: International Journal of Information Retrieval Research (IJIRR)
Editor(s)-in-Chief: Zhongyu Lu (University of Huddersfield, UK)
DOI: 10.4018/IJIRR.2021010104


Abstract

The ensemble paradigm, in which the outcomes of several classifiers are combined, is a well-established approach in classification. It has recently been extended to feature selection methods to identify the most relevant features. Until now, ensemble feature selection has mainly been applied to high-dimensional, low-sample-size datasets, such as those in bioinformatics; to the authors' knowledge, there has been no such endeavor in the text classification domain. In this work, ensemble feature selection using data perturbation is applied to text classification with the aim of enhancing both predictability and stability. The approach applies the same feature selector to different perturbed versions of the training data, obtaining a different rank for each feature from each version. Whereas previous works focus on only one metric, either stability or accuracy, this work adopts a combined framework that assesses both the predictability and the stability of the feature selection method through the feature selection ensemble. The approach is explored on univariate and multivariate feature selectors, using two rank aggregators.
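The data-perturbation scheme described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes bootstrap resampling as the perturbation, a simple class-mean-difference score as a stand-in univariate selector, and mean-rank aggregation as one possible rank aggregator.

```python
import numpy as np

def ensemble_feature_ranks(X, y, n_bags=10, seed=0):
    """Aggregate feature ranks obtained from perturbed versions of the data.

    X : (n_samples, n_features) array; y : binary labels in {0, 1}.
    Returns the mean rank per feature (lower = more relevant).
    """
    rng = np.random.default_rng(seed)
    n_samples, n_features = X.shape
    rank_matrix = np.empty((n_bags, n_features))
    for b in range(n_bags):
        # Data perturbation: draw a bootstrap sample of the training data.
        idx = rng.integers(0, n_samples, n_samples)
        Xb, yb = X[idx], y[idx]
        # Illustrative univariate score: absolute difference of class means.
        mu0 = Xb[yb == 0].mean(axis=0)
        mu1 = Xb[yb == 1].mean(axis=0)
        scores = np.abs(mu1 - mu0)
        # Convert scores to ranks (0 = best) for this perturbed version.
        rank_matrix[b] = np.argsort(np.argsort(-scores))
    # Rank aggregation across perturbations: here, the mean rank.
    return rank_matrix.mean(axis=0)
```

Any univariate or multivariate selector could replace the scoring step, and the mean could be swapped for another aggregator; stability is then measured by how similar the per-bag rankings are to one another.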
