Shape Classification Using Multiple Classifiers with Different Feature Sets

Abstract:

In this paper, a new shape classification method based on multiple classifiers with different feature sets is proposed. The feature sets are derived from the shapes using two extraction methods: Fourier descriptors and Zernike moments. The multiple classifiers comprise a normal-densities-based linear classifier, a k-nearest-neighbor classifier, a feed-forward neural network, and a radial basis function (RBF) neural network classifier. Each classifier is trained on the two feature sets separately, yielding two classification results per classifier. The final classification result is obtained by combining the responses of the individual classifiers using six different classifier combination rules, and the results are compared with those of multiple classifiers based on the same feature set and with those of the individual classifiers. In this study, we examined different classification tasks on the Kimia dataset. For these tasks, the best combination strategy was the product rule, which gave an average recognition rate of 95.83%.
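
The two building blocks named in the abstract, boundary-based Fourier descriptors as one of the feature sets and the product rule for fusing classifier outputs, can be illustrated with a minimal sketch. The code below is only an illustration under common conventions (coefficient magnitudes normalised by the first harmonic, posteriors multiplied element-wise); the function names and parameters are assumptions for illustration, not the authors' implementation.

```python
import numpy as np

def fourier_descriptors(contour_xy, n_coeffs=16):
    """Fourier-descriptor features for a closed shape boundary.

    contour_xy: (N, 2) array of boundary points.
    Returns the magnitudes of the first n_coeffs low-frequency coefficients,
    normalised by the first harmonic. Magnitudes are invariant to rotation
    and starting point; dropping the DC term removes translation, and the
    normalisation removes scale.
    """
    z = contour_xy[:, 0] + 1j * contour_xy[:, 1]   # boundary as a complex signal
    coeffs = np.fft.fft(z)
    mags = np.abs(coeffs[1:n_coeffs + 1])          # skip the DC (translation) term
    return mags / (mags[0] + 1e-12)                # scale normalisation

def product_rule(posteriors):
    """Combine classifier outputs with the product rule.

    posteriors: list of (n_classes,) probability vectors, one per classifier.
    Returns the index of the class whose product of posteriors is largest.
    """
    combined = np.prod(np.vstack(posteriors), axis=0)
    return int(np.argmax(combined))

# Hypothetical outputs of two of the classifiers for a 3-class problem:
p_knn = np.array([0.7, 0.2, 0.1])   # e.g. from the k-NN classifier
p_rbf = np.array([0.6, 0.3, 0.1])   # e.g. from the RBF network
print(product_rule([p_knn, p_rbf]))  # -> 0
```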

Info:

Periodical:

Advanced Materials Research (Volumes 368-373)

Pages:

1583-1587

Online since:

October 2011

Copyright:

© 2012 Trans Tech Publications Ltd. All Rights Reserved
