Classification Algorithm for Naïve Bayes Based on Validity and Correlation

Abstract:

The Naïve Bayes classification algorithm based on validity (NBCABV) optimizes the training data by using validity to eliminate noise samples, which improves classification, but it ignores the associations among attributes. Taking these associations into account, an improved method, the classification algorithm for Naïve Bayes based on validity and correlation (CANBBVC), is proposed: it uses both validity and correlation to delete more noise samples, which yields better classification performance. Experimental results show that this model achieves higher classification accuracy than the one based on validity alone.
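The validity-based noise filtering that NBCABV (and CANBBVC) build on can be sketched as follows. This is an illustrative reconstruction under an assumed definition, not the authors' exact procedure: here a sample's validity is taken to be the fraction of its k nearest neighbours that share its class label, and samples whose validity falls below a threshold are dropped before training the Naïve Bayes model. The correlation step that distinguishes CANBBVC is not reproduced.

```python
import math

def validity(samples, labels, k=3):
    """For each sample, the fraction of its k nearest neighbours
    (Euclidean distance) that share its class label.

    Assumed definition of 'validity' for illustration only.
    """
    scores = []
    for i, x in enumerate(samples):
        # Distances to every other sample, paired with that sample's label.
        dists = sorted(
            (math.dist(x, y), labels[j])
            for j, y in enumerate(samples) if j != i
        )
        neighbours = [lab for _, lab in dists[:k]]
        scores.append(neighbours.count(labels[i]) / k)
    return scores

def filter_noise(samples, labels, threshold=0.5, k=3):
    """Drop samples whose validity is below the threshold."""
    keep = [i for i, v in enumerate(validity(samples, labels, k))
            if v >= threshold]
    return [samples[i] for i in keep], [labels[i] for i in keep]

# Two well-separated clusters plus one mislabelled point at (0.5, 0.5):
# all of its nearest neighbours belong to class "a", so its validity is 0
# and it is removed as noise before Naive Bayes training.
samples = [(0.0, 0.0), (0.0, 1.0), (1.0, 0.0), (1.0, 1.0),
           (5.0, 5.0), (5.0, 6.0), (6.0, 5.0), (0.5, 0.5)]
labels = ["a", "a", "a", "a", "b", "b", "b", "b"]
clean_x, clean_y = filter_noise(samples, labels, threshold=0.5, k=3)
```

After filtering, a standard Naïve Bayes classifier would be trained on `clean_x` / `clean_y`; the point of the method is that the cleaned training set no longer contains the mislabelled sample.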

Info:

Pages: 1609-1612

Online since: February 2013

Copyright:

© 2013 Trans Tech Publications Ltd. All Rights Reserved

Citation:

[1] Friedman N, Geiger D, Goldszmidt M. Bayesian network classifiers. Machine Learning, 1997, 29(2-3): 131-163.

[2] Langley P, Iba W, Thompson K. An analysis of Bayesian classifiers. In: Rosenbloom P, Szolovits P, eds. Proc. of the 10th National Conf. on Artificial Intelligence. Menlo Park: AAAI Press, 1992. 223-228.

[3] Qin Feng, Ren Shi-liu, Cheng Ze-Kai, et al. Computer Engineering and Application, 2008, 44(6): 107-109, in Chinese.

[4] Kononenko I. Semi-naive Bayesian classifier. In: Kodratoff Y, ed. Proc. of the 6th European Working Session on Learning. New York: Springer-Verlag, 1991. 206-219.

[5] Pazzani MJ. Searching for dependencies in Bayesian classifiers. In: Fisher D, Lenz HJ, eds. Learning from Data: Artificial Intelligence and Statistics V. New York: Springer-Verlag, 1996. 239-248.

[6] Langley P, Sage S. Induction of selective Bayesian classifiers. In: Mantaras RL, Poole DL, eds. Proc. of the 10th Conf. on Uncertainty in Artificial Intelligence. San Francisco: Morgan Kaufmann Publishers, 1994. 399-406.

DOI: 10.1016/b978-1-55860-332-5.50055-9

[7] Webb GI, Pazzani MJ. Adjusted probability naive Bayesian induction. In: Antoniou G, Slaney JK, eds. Proc. of the 11th Australian Joint Conf. on Artificial Intelligence. Berlin: Springer-Verlag, 1998. 285-295.

DOI: 10.1007/bfb0095060

[8] Kohavi R. Scaling up the accuracy of Naive-Bayes classifiers: A decision-tree hybrid. In: Simoudis E, Han J, Fayyad UM, eds. Proc. of the 2nd Int'l Conf. on Knowledge Discovery and Data Mining. Menlo Park: AAAI Press, 1996. 202-207.

[9] Shi Hong-Bo, Wang Zhi-Hai, Huang Hou-Kuan, Li Xiao-Jian. Journal of Software, 2004: 193-199, in Chinese.

[10] Zhu Xiaodan, Su Jinsong, Wu Qingfeng, Dong Huailin. Naive Bayes classification algorithm based on optimized training data. Advanced Materials Research, 2012, 490-495: 460-464.

DOI: 10.4028/www.scientific.net/amr.490-495.460

[11] Rish I. An empirical study of the Naive Bayes classifier. In: Proc. of the IJCAI-01 Workshop on Empirical Methods in AI, Sicily, Italy, 2001. 41-46.

[12] Zhang H, Sheng S. Learning weighted naive Bayes with accurate ranking. In: Proc. of ICDM 2004.

[13] Cheng Ke-fei, Zhang Cong. Computer Simulation. 2006, 23(10):92-94, in Chinese.

[14] Blake CL, Merz CJ. UCI repository of machine learning databases [R/OL]. http://www.ics.uci.edu/~mlearn/MLRepository.html, 1998.