
Improving Techniques for Naïve Bayes Text Classifiers

Author(s): Han-joon Kim (University of Seoul, Korea)
Copyright: 2009
Pages: 17
Source title: Handbook of Research on Text and Web Mining Technologies
Source Author(s)/Editor(s): Min Song (New Jersey Institute of Technology, USA) and Yi-Fang Brook Wu (New Jersey Institute of Technology, USA)
DOI: 10.4018/978-1-59904-990-8.ch007


Abstract

This chapter introduces two practical techniques for improving the Naïve Bayes text classifiers that are widely used for text classification. Naïve Bayes has proven to be a practical text classification algorithm owing to its simple classification model, reasonable accuracy, and the ease with which its model can be updated. Many researchers therefore have a strong incentive to improve Naïve Bayes by combining it with meta-learning approaches such as EM (Expectation Maximization) and Boosting. The EM approach combines Naïve Bayes with the EM algorithm, and the Boosting approach uses Naïve Bayes as the base classifier in the AdaBoost algorithm. Both approaches employ a special uncertainty measure suited to Naïve Bayes learning. Within the Naïve Bayes learning framework, these approaches are expected to be practical solutions to the shortage of training documents in text classification systems.
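The first approach the abstract describes, combining Naïve Bayes with the EM algorithm, is a well-known semi-supervised scheme: train a classifier on the labeled documents, then alternately soft-label the unlabeled documents (E-step) and retrain on the combined soft counts (M-step). The sketch below is a minimal NumPy illustration of that general idea, not the chapter's actual implementation; all function names are hypothetical, and the chapter's special uncertainty measure is not modeled here.

```python
import numpy as np

def train_nb(X, post, alpha=1.0):
    """M-step: fit multinomial Naive Bayes from soft class assignments.

    X    : (n_docs, n_words) word-count matrix
    post : (n_docs, n_classes) soft class-membership weights
    """
    priors = post.sum(axis=0) / post.sum()
    counts = post.T @ X + alpha              # Laplace-smoothed word counts per class
    cond = counts / counts.sum(axis=1, keepdims=True)
    return np.log(priors), np.log(cond)

def e_step(X, log_prior, log_cond):
    """E-step: posterior class probabilities under the current model."""
    log_joint = X @ log_cond.T + log_prior   # (n_docs, n_classes)
    log_joint -= log_joint.max(axis=1, keepdims=True)  # stabilize exp
    p = np.exp(log_joint)
    return p / p.sum(axis=1, keepdims=True)

def nb_em(X_lab, y_lab, X_unlab, n_classes, n_iter=10):
    """Semi-supervised Naive Bayes: labeled docs keep hard labels,
    unlabeled docs get re-estimated soft labels each EM iteration."""
    post_lab = np.eye(n_classes)[y_lab]
    log_prior, log_cond = train_nb(X_lab, post_lab)
    for _ in range(n_iter):
        post_unlab = e_step(X_unlab, log_prior, log_cond)
        X_all = np.vstack([X_lab, X_unlab])
        post_all = np.vstack([post_lab, post_unlab])
        log_prior, log_cond = train_nb(X_all, post_all)
    return log_prior, log_cond
```

The second approach, boosting, would instead reweight training documents across rounds and use a classifier like the one above as the weak learner inside AdaBoost; the EM loop here is only meant to convey how unlabeled documents can compensate for a shortage of labeled training data.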

Related Content

. © 2023. 34 pages.
. © 2023. 15 pages.
. © 2023. 15 pages.
. © 2023. 18 pages.
. © 2023. 24 pages.
. © 2023. 32 pages.
. © 2023. 21 pages.
Body Bottom