An Improved Directed Acyclic Graph Support Vector Machines Based on Inter-Cluster Separability Measure in Feature Space

Abstract:

To address the shortcomings of existing inter-cluster separability measures, a novel separability measure is proposed and applied to the directed acyclic graph support vector machine (DAG-SVM). The algorithm takes into account both the distance between cluster centers and the distribution of samples in feature space. First, a hyper-sphere support vector machine is used to obtain the minimal bounding hyper-sphere of each cluster. From the radii and centers of these hyper-spheres, an inter-cluster separability measure in feature space is defined, and the corresponding separability matrix is computed; the directed acyclic graph is then constructed from this matrix. Experimental results show that the algorithm achieves higher classification precision than the conventional directed acyclic graph support vector machine.
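The pipeline in the abstract can be sketched in a few lines. This is a minimal illustration, not the paper's method: the centroid/max-distance bounding sphere stands in for the hyper-sphere SVM, and the ratio of center distance to summed radii is an assumed form of the separability measure (the abstract does not give the exact formula).

```python
import numpy as np

def bounding_hypersphere(X):
    """Approximate the minimal bounding hyper-sphere of a cluster.

    Simplification: centroid + farthest-point radius. The paper uses a
    hyper-sphere SVM, which additionally tolerates outliers via slack
    variables and works in a kernel-induced feature space.
    """
    center = X.mean(axis=0)
    radius = np.linalg.norm(X - center, axis=1).max()
    return center, radius

def separability_matrix(clusters):
    """Pairwise inter-cluster separability (assumed form).

    S[i, j] = distance between sphere centers divided by the sum of the
    two radii; larger values mean the pair is easier to separate, so a
    DAG-SVM would place such pairwise classifiers near the root.
    """
    spheres = [bounding_hypersphere(X) for X in clusters]
    k = len(clusters)
    S = np.zeros((k, k))
    for i in range(k):
        for j in range(k):
            if i != j:
                (ci, ri), (cj, rj) = spheres[i], spheres[j]
                S[i, j] = np.linalg.norm(ci - cj) / (ri + rj)
    return S

# Toy usage: three well-separated 2-D Gaussian clusters.
rng = np.random.default_rng(0)
clusters = [rng.normal(loc=m, scale=0.3, size=(50, 2))
            for m in ([0, 0], [5, 0], [0, 5])]
S = separability_matrix(clusters)
```

The resulting matrix is symmetric with a zero diagonal; ordering the pairwise classifiers in the DAG by descending `S[i, j]` is the construction step the abstract refers to.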

Pages: 2002-2006

Online since: November 2012

Copyright: © 2012 Trans Tech Publications Ltd. All Rights Reserved
