
Cost-Sensitive Classification Using Decision Trees, Boosting and MetaCost

Author(s): Kai Ming Ting (Monash University, Australia)
Copyright: 2002
Pages: 27
Source title: Heuristic and Optimization for Knowledge Discovery
Source Author(s)/Editor(s): Hussein A. Abbass (University of New South Wales, Australia), Charles S. Newton (University of New South Wales, Australia), and Ruhul Sarker (University of New South Wales, Australia)
DOI: 10.4018/978-1-930708-26-6.ch003


Abstract

This chapter reports results obtained from a series of studies on cost-sensitive classification using decision trees, boosting algorithms, and MetaCost, a recently proposed procedure that converts an error-based algorithm into a cost-sensitive algorithm. The studies give rise to new variants of algorithms designed for cost-sensitive classification and provide insights into the strengths and weaknesses of the algorithms. First, we describe a simple and effective heuristic for converting an error-based decision tree algorithm into a cost-sensitive one via instance weighting. The cost-sensitive version performs better than the error-based version that employs a minimum expected cost criterion during classification. Second, we report results from a study of four variants of cost-sensitive boosting algorithms. We find that boosting can be simplified for cost-sensitive classification. A new variant that excludes a factor used in ordinary boosting has the advantage of producing smaller trees, and different trees for different scenarios, while performing comparably to ordinary boosting in terms of cost. We find that the minimum expected cost criterion is the major contributor to the improvement of all cost-sensitive adaptations of ordinary boosting. Third, we reveal a limitation of MetaCost: it retains only part of the performance of the internal classifier on which it relies. This occurs whether boosting or bagging is used as the internal classifier.
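
To make two of the recurring ideas above concrete, the Python sketch below illustrates (a) instance weighting as one way to turn an error-based decision tree learner into a cost-sensitive one, and (b) the minimum expected cost criterion applied at prediction time. This is an illustrative approximation only, not the chapter's C4.5-based implementation: scikit-learn's DecisionTreeClassifier stands in for the error-based learner, and the cost matrix, weighting scheme, and synthetic dataset are assumed for demonstration.

    # Minimal sketch (not the chapter's exact implementation) of instance
    # weighting and the minimum expected cost criterion.
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import train_test_split
    from sklearn.tree import DecisionTreeClassifier

    # cost[i, j] = cost of predicting class j when the true class is i (assumed values)
    cost = np.array([[0.0, 1.0],
                     [5.0, 0.0]])   # missing class 1 is five times as costly

    X, y = make_classification(n_samples=1000, weights=[0.9, 0.1], random_state=0)
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

    # (a) Instance weighting: weight each training example by the total cost of
    # misclassifying its class, normalised so the weights sum to the sample size.
    class_cost = cost.sum(axis=1)            # total misclassification cost per true class
    w = class_cost[y_tr]
    w = w * len(y_tr) / w.sum()
    tree_weighted = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr, sample_weight=w)

    # (b) Minimum expected cost criterion: instead of predicting the most probable
    # class, predict the class that minimises expected cost under the estimated
    # class-probability distribution.
    tree_plain = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
    proba = tree_plain.predict_proba(X_te)   # shape (n_samples, n_classes)
    expected_cost = proba @ cost             # expected cost of predicting each class
    pred_min_cost = expected_cost.argmin(axis=1)

    def total_cost(y_true, y_pred):
        return cost[y_true, y_pred].sum()

    print("error-based tree, most-probable class:",
          total_cost(y_te, tree_plain.predict(X_te)))
    print("error-based tree, minimum expected cost:",
          total_cost(y_te, pred_min_cost))
    print("instance-weighted tree:",
          total_cost(y_te, tree_weighted.predict(X_te)))

In this toy setup the two mechanisms can be compared on total misclassification cost; the chapter's studies compare analogous adaptations within decision trees, boosting, and MetaCost.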
