Research on Multi-Dimensional Bayesian Network Classifiers Based on ICA Dimension Reduction

Abstract:

Multi-dimensional Bayesian network classifiers (MBCs) are probabilistic graphical models for classification problems with multiple class variables. In data analysis and preprocessing, however, one is often confronted with the problem of selecting features from very high-dimensional data. To address this problem, covariance analysis and the FastICA algorithm are applied to reduce the dimensionality and remove redundant information. Because the resulting feature variables satisfy the independence assumption, only the class subgraph and the bridge subgraph of the MBC model need to be constructed, using a structure-learning algorithm and mutual information computed from the processed data. The method was evaluated on three benchmark data sets. Both the theoretical analysis and the experimental results show that our method outperforms other state-of-the-art multi-dimensional classification algorithms in accuracy.
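The preprocessing pipeline the abstract describes can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes scikit-learn's FastICA for the dimension-reduction step and uses mutual information between each extracted component and a class variable as a stand-in for the scores used when building the MBC bridge subgraph. The data here is synthetic.

```python
# Hedged sketch of ICA-based dimension reduction followed by a
# mutual-information score per component (synthetic data, scikit-learn).
import numpy as np
from sklearn.decomposition import FastICA
from sklearn.feature_selection import mutual_info_classif

rng = np.random.default_rng(0)
X = rng.standard_normal((300, 12))          # 300 samples, 12 raw features
y = (X[:, 0] + X[:, 1] > 0).astype(int)     # one toy class variable

# Step 1: FastICA maps the raw features to a few independent components,
# removing redundancy so the MBC feature subgraph can be omitted.
ica = FastICA(n_components=5, random_state=0, max_iter=1000)
S = ica.fit_transform(X)                    # shape (300, 5)

# Step 2: mutual information between each component and the class variable,
# the kind of score used to decide which bridge-subgraph arcs to add.
mi = mutual_info_classif(S, y, random_state=0)
print(S.shape, mi.shape)
```

In a full MBC learner, a step like this would be repeated for every class variable, and arcs would be added from components with high mutual information to the corresponding class nodes.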

Info:

Pages: 2593-2596

Online since: August 2013

Copyright: © 2013 Trans Tech Publications Ltd. All Rights Reserved
