Design for Fast Adaboost with Feature Selection

Abstract:

Because training the original AdaBoost algorithm is very time-consuming, we have designed an improved AdaBoost that adds a feature-selection step to the original algorithm. After each round of training, we retain the features whose classification error rates on the samples are relatively low and remove the features with high error rates. The candidate feature pool thus shrinks round by round, which reduces the time needed for each subsequent round and accelerates the whole algorithm.
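As a rough illustration of the idea (this is a sketch, not the authors' code), the Python fragment below uses one-feature decision stumps as weak learners and, after each boosting round, keeps only the surviving features with the lowest weighted error. The specific pruning rule (retain the best `keep_frac` fraction each round) and all names such as `fast_adaboost` and `best_stump` are assumptions standing in for the paper's unspecified selection criterion.

```python
# Minimal sketch of AdaBoost with per-round feature pruning.
# Assumes binary labels y in {-1, +1}; keep_frac is an assumed pruning rule.
import numpy as np

def best_stump(X, y, w, feat):
    """Find the weighted-error-minimizing threshold stump on one feature."""
    best = (1.0, 0.0, 1)                      # (error, threshold, polarity)
    for thr in np.unique(X[:, feat]):
        for pol in (1, -1):
            pred = np.where(pol * (X[:, feat] - thr) >= 0, 1, -1)
            err = np.sum(w[pred != y])        # weighted misclassification
            if err < best[0]:
                best = (err, thr, pol)
    return best

def fast_adaboost(X, y, rounds=10, keep_frac=0.7):
    n, d = X.shape
    w = np.full(n, 1.0 / n)                   # uniform initial sample weights
    feats = list(range(d))                    # surviving feature pool
    ensemble = []
    for _ in range(rounds):
        # Evaluate only the surviving features; the shrinking pool is
        # where the speed-up over plain AdaBoost comes from.
        scores = {f: best_stump(X, y, w, f) for f in feats}
        f = min(scores, key=lambda k: scores[k][0])
        err, thr, pol = scores[f]
        err = np.clip(err, 1e-10, 1 - 1e-10)
        alpha = 0.5 * np.log((1 - err) / err)
        pred = np.where(pol * (X[:, f] - thr) >= 0, 1, -1)
        w *= np.exp(-alpha * y * pred)        # standard AdaBoost reweighting
        w /= w.sum()
        ensemble.append((alpha, f, thr, pol))
        # Feature-selection step: drop the highest-error features.
        ranked = sorted(feats, key=lambda k: scores[k][0])
        feats = ranked[:max(1, int(len(ranked) * keep_frac))]
    return ensemble

def predict(ensemble, X):
    votes = sum(a * np.where(p * (X[:, f] - t) >= 0, 1, -1)
                for a, f, t, p in ensemble)
    return np.sign(votes)
```

With labels in {-1, +1}, `predict(fast_adaboost(X, y), X)` returns the ensemble's signed vote. Because the candidate pool shrinks geometrically, later rounds evaluate far fewer stumps than plain AdaBoost, matching the acceleration the abstract describes.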

Info:

Periodical: Advanced Materials Research (Volumes 816-817)
Pages: 566-569

Online since: September 2013

Copyright: © 2013 Trans Tech Publications Ltd. All Rights Reserved
