
Cost Sensitive Boosting

A novel framework is proposed for the design of cost-sensitive boosting algorithms. The framework is based on the identification of two necessary conditions for optimal cost-sensitive learning: 1) expected losses must be minimized by optimal cost-sensitive decision rules, and 2) empirical loss minimization must emphasize the neighborhood of the target cost-sensitive boundary. It is shown that these conditions enable the derivation of cost-sensitive losses that can be minimized by gradient descent, in the functional space of convex combinations of weak learners, to produce novel boosting algorithms. The proposed framework is applied to the derivation of cost-sensitive extensions of AdaBoost, RealBoost, and LogitBoost. Experimental evidence, with a synthetic problem, standard data sets, and the computer vision problems of face and car detection, is presented in support of the cost-sensitive optimality of the new algorithms. Their performance is also compared to that of various previous cost-sensitive boosting proposals, as well as the popular combination of large-margin classifiers and probability calibration. Cost-sensitive boosting is shown to consistently outperform all other methods.
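To illustrate how class-dependent costs can enter a boosting loop, the following is a minimal Python sketch of one simple cost-sensitive variant of AdaBoost with decision stumps. It is not the paper's exact CS-AdaBoost derivation: here the costs `c_pos` and `c_neg` (hypothetical parameter names) simply bias the initial example weights, so that empirical loss minimization emphasizes errors on the costlier class.

```python
import numpy as np

def stump_predict(X, feat, thresh, sign):
    # decision stump: +sign if feature <= threshold, -sign otherwise
    return sign * np.where(X[:, feat] <= thresh, 1.0, -1.0)

def fit_stump(X, y, w):
    # exhaustive search for the stump with minimum weighted error
    best_err, best_params = np.inf, None
    for f in range(X.shape[1]):
        for t in np.unique(X[:, f]):
            for s in (1.0, -1.0):
                pred = stump_predict(X, f, t, s)
                err = np.sum(w[pred != y])
                if err < best_err:
                    best_err, best_params = err, (f, t, s)
    return best_err, best_params

def cs_adaboost(X, y, c_pos=2.0, c_neg=1.0, rounds=20):
    # class-dependent costs enter through the initial weights:
    # positives (y=+1) start with weight c_pos, negatives with c_neg
    w = np.where(y > 0, c_pos, c_neg).astype(float)
    w /= w.sum()
    ensemble = []
    for _ in range(rounds):
        err, (f, t, s) = fit_stump(X, y, w)
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)   # standard AdaBoost step size
        pred = stump_predict(X, f, t, s)
        w *= np.exp(-alpha * y * pred)          # exponential-loss weight update
        w /= w.sum()
        ensemble.append((alpha, f, t, s))
    return ensemble

def predict(ensemble, X):
    # sign of the convex combination of weak learners
    score = sum(a * stump_predict(X, f, t, s) for a, f, t, s in ensemble)
    return np.sign(score)
```

Increasing `c_pos` relative to `c_neg` shifts the learned boundary so that false negatives become rarer at the expense of more false positives; the algorithms derived in the paper achieve this more principledly, by building the costs directly into the loss being minimized.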

Experimental Results:

Fraud detection
Medical diagnosis
Business decisions

Many classification problems, such as fraud detection, business decision making, and medical diagnosis, are naturally cost-sensitive. These problems require cost-sensitive extensions of state-of-the-art learning techniques.


Object Detection:

Face Detection
Car Detection
Pedestrian Detection

Publications:

Cost-Sensitive Boosting
Hamed Masnadi-Shirazi and Nuno Vasconcelos
IEEE Trans. on Pattern Analysis and Machine Intelligence,
vol. 32(2), 294, March 2010.

Risk Minimization, Probability Elicitation, and Cost-Sensitive SVMs
Hamed Masnadi-Shirazi and Nuno Vasconcelos
International Conference on Machine Learning (ICML), 2010.
(acceptance rate 20%)

Asymmetric Boosting
Hamed Masnadi-Shirazi and Nuno Vasconcelos
Proceedings of International Conference on Machine Learning (ICML),
Corvallis, OR, May 2007.

Contact: Nuno Vasconcelos, Hamed Masnadi-Shirazi


Copyright © 2007 www.svcl.ucsd.edu