Parameter Optimization of SVM Based on Maximum Variance – Entropy Criterion

Abstract:

Selecting kernel parameters for a support vector machine (SVM) is difficult in practical applications. By analyzing the working principle of the SVM classifier, a parameter selection algorithm for SVM based on a maximum variance-entropy criterion of the data is proposed. The algorithm uses the maximum variance-entropy criterion to measure the linear separability of the dataset in the feature space and combines it with particle swarm optimization (PSO) to search for the optimal parameters. Experimental results on UCI datasets show that the algorithm achieves high classification accuracy and improves the training performance of the SVM. To further verify its effectiveness, the method is applied to fault diagnosis of a biquadratic filter circuit, where it improves the diagnostic accuracy.
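
The abstract outlines, rather than specifies, how the criterion and PSO interact. The sketch below illustrates one plausible reading in Python: the separability of the data in the RBF feature space is scored by a variance/entropy proxy computed from the centered kernel matrix, and a bare-bones PSO searches the kernel parameter gamma to maximize that score. The scoring formula, the PSO constants, and the search range are illustrative assumptions, not the paper's actual definitions.

# Hedged sketch of the criterion-guided parameter search described in the
# abstract. The exact "maximum variance - entropy" formula from the paper is
# not reproduced here; variance is approximated by the trace of the centered
# RBF kernel matrix and entropy by the Shannon entropy of its normalized
# eigenvalues. The PSO loop and all constants are illustrative assumptions.
import numpy as np

def rbf_kernel(X, gamma):
    # Pairwise squared Euclidean distances -> RBF (Gaussian) kernel matrix.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def variance_entropy_score(X, gamma, eps=1e-12):
    K = rbf_kernel(X, gamma)
    n = K.shape[0]
    H = np.eye(n) - np.ones((n, n)) / n           # centering matrix
    Kc = H @ K @ H                                # centered kernel (feature-space covariance proxy)
    eig = np.clip(np.linalg.eigvalsh(Kc), 0.0, None)
    total_var = eig.sum()                         # total variance in feature space
    p = eig / (total_var + eps)
    entropy = -np.sum(p * np.log(p + eps))        # spread of variance across directions
    return total_var - entropy                    # larger = assumed better linear separability

def pso_search(X, bounds=(1e-3, 10.0), n_particles=10, n_iter=30, seed=0):
    # Minimal particle swarm over log10(gamma) with standard inertia/cognitive/social weights.
    rng = np.random.default_rng(seed)
    lo, hi = np.log10(bounds[0]), np.log10(bounds[1])
    pos = rng.uniform(lo, hi, n_particles)
    vel = np.zeros(n_particles)
    pbest = pos.copy()
    pbest_val = np.array([variance_entropy_score(X, 10**p) for p in pos])
    gbest = pbest[np.argmax(pbest_val)]
    for _ in range(n_iter):
        r1, r2 = rng.random(n_particles), rng.random(n_particles)
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([variance_entropy_score(X, 10**p) for p in pos])
        improved = vals > pbest_val
        pbest[improved], pbest_val[improved] = pos[improved], vals[improved]
        gbest = pbest[np.argmax(pbest_val)]
    return 10**gbest                              # candidate RBF kernel parameter

if __name__ == "__main__":
    # Toy two-cluster dataset; in the paper's setting this would be a UCI dataset.
    X = np.vstack([np.random.randn(30, 2), np.random.randn(30, 2) + 3.0])
    print("candidate gamma:", pso_search(X))

In practice the criterion only scores the kernel parameter; the penalty parameter C would still need to be chosen, e.g. by extending the particle position to two dimensions, which the sketch omits for brevity.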

Info:

Pages: 1053-1059

Online since: August 2013

Copyright: © 2013 Trans Tech Publications Ltd. All Rights Reserved
