An Improved Feature Selection Algorithm Based on Parzen Window and Conditional Mutual Information

Abstract:

This paper proposes an improved feature selection algorithm based on conditional mutual information and the Parzen window. The algorithm adopts conditional mutual information as the evaluation criterion for feature selection, which mitigates the problem of redundant features, and uses Parzen windows to estimate the probability density functions needed to compute the conditional mutual information of continuous variables, thereby enabling feature selection on continuous data.
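The procedure the abstract describes can be read as a greedy forward search in which each candidate feature is scored by its conditional mutual information with the class, and the required densities over continuous features are estimated with Parzen windows. The Python sketch below illustrates one plausible realization under stated assumptions: a Gaussian kernel with a fixed bandwidth h, a Parzen estimate of the class posterior in the style of Kwak and Choi (reference [9]), and the chain-rule decomposition I(C; f | S) = I(C; S + {f}) - I(C; S). The function names, the bandwidth value, and the stopping rule are illustrative and are not taken from the paper.

import numpy as np

def parzen_class_posterior(X, y, h):
    """Estimate p(c | x_i) for every sample with a Gaussian Parzen window.
    X: (N, d) continuous features; y: (N,) integer labels; h: bandwidth
    (assumed fixed here; the paper's actual bandwidth rule is not shown)."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    K = np.exp(-d2 / (2.0 * h ** 2))            # Gaussian kernel matrix
    classes = np.unique(y)
    post = np.zeros((X.shape[0], classes.size))
    for k, c in enumerate(classes):
        post[:, k] = K[:, y == c].sum(axis=1)   # kernel mass contributed by class c
    return post / post.sum(axis=1, keepdims=True)

def mutual_info_parzen(X, y, h=0.5):
    """I(C; X) = H(C) - H(C|X), with H(C|X) taken from the Parzen posterior."""
    _, counts = np.unique(y, return_counts=True)
    p_c = counts / counts.sum()
    H_c = -(p_c * np.log(p_c)).sum()
    post = parzen_class_posterior(X, y, h)
    H_c_given_x = -(post * np.log(post + 1e-12)).sum(axis=1).mean()
    return H_c - H_c_given_x

def select_features(X, y, n_select, h=0.5):
    """Greedy forward selection maximizing the estimated conditional
    mutual information I(C; f | S) = I(C; S + {f}) - I(C; S)."""
    selected, remaining = [], list(range(X.shape[1]))
    mi_S = 0.0
    for _ in range(n_select):
        gains = {f: mutual_info_parzen(X[:, selected + [f]], y, h) - mi_S
                 for f in remaining}
        best = max(gains, key=gains.get)
        selected.append(best)
        remaining.remove(best)
        mi_S += gains[best]
    return selected

Because each gain is conditioned on the already selected subset S, a feature that is redundant with S receives a near-zero score even when its marginal mutual information with the class is large, which is the redundancy problem the evaluation criterion is meant to address.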

Info:

Pages: 2614-2619

Online since: August 2013

Copyright: © 2013 Trans Tech Publications Ltd. All Rights Reserved

References:

[1] Guyon I, Elisseeff A. An introduction to variable and feature selection[J]. Journal of Machine Learning Research, 2003, 3: 1157-1182.

[2] Qu G, Hariri S, Yousif M. A new dependency and correlation analysis for features[J]. IEEE Transactions on Knowledge and Data Engineering, 2005, 17(9): 1199-1207.

DOI: 10.1109/tkde.2005.136

[3] Battiti R. Using mutual information for selecting features in supervised neural net learning[J]. IEEE Transactions on Neural Networks, 1994, 5(4): 537-550.

DOI: 10.1109/72.298224

[4] Liu H, Sun J, Liu L, et al. Feature selection with dynamic mutual information[J]. Pattern Recognition, 2009, 42(7): 1330-1339.

DOI: 10.1016/j.patcog.2008.10.028

[5] Kwak N, Choi C H. Input feature selection for classification problems[J]. IEEE Transactions on Neural Networks, 2002, 13(1): 143-159.

DOI: 10.1109/72.977291

[6] Sotoca J, Pla F. Supervised feature selection by clustering using conditional mutual information based distances[J]. Pattern Recognition, 2010, 43(6): 2068-2081.

DOI: 10.1016/j.patcog.2009.12.013

[7] Cover T M, Thomas J A. Elements of Information Theory[M]. New York: John Wiley & Sons, 1991.

[8] Cover T, Thomas J. Elements of Information Theory[M]. New York: Wiley, 1991.

[9] Kwak N, Choi C H. Input feature selection by mutual information based on Parzen window[J]. IEEE Transactions on Pattern Analysis and Machine Intelligence, 2002, 24(12): 1667-1671.

DOI: 10.1109/tpami.2002.1114861

[10] Quinlan J R. C4.5: Programs for Machine Learning[M]. San Francisco: Morgan Kaufmann, 1993.

[11] BIAN Zhao-qi, ZHANG Xue-gong. Pattern Recognition, Second Edition[M]. Beijing: Tsinghua University Press, 2001.

[12] Duda R O, Hart P E, Stork D G. Pattern Classification, Second Edition[M]. New York: John Wiley & Sons, 2001.

[13] Han J, Kamber M. Data Mining: Concepts and Techniques, Second Edition[M]. Elsevier, 2006.

[14] SONG Guo-Jie, TANG Shi-Wei, YANG Dong-Qing, WANG Teng-Jiao. A spatial feature selection method based on maximum entropy theory[J]. Journal of Software, 2003, 14(9): 1544-1550.

[15] HUANG Jin-Jie, LV Ning, LI Shuang-Quan, CAI Yun-Ze. Feature selection for classification analysis based on information-theoretic criteria[J]. Acta Automatica Sinica, 2008, 34(3).

DOI: 10.3724/sp.j.1004.2008.00383
