Non-Negative Based Locally Sparse Representation for Classification

Abstract:

In this paper, a new method is proposed that can be considered a combination of sparse representation based classification (SRC) and the KNN classifier. In detail, under the assumption that a locally linear embedding exists, the proposed method achieves classification via non-negative locally sparse representation, combining the reconstruction property and sparsity of SRC with the discriminative power of KNN. Compared to SRC, the proposed method is clearly discriminative and is better suited to real image data, since it does not depend on preconditions that are difficult to satisfy. Moreover, it is more suitable for classifying low-dimensional data produced by dimensionality reduction methods, especially those that obtain low-dimensional, neighborhood-preserving embeddings of high-dimensional data. Experiments on MNIST are also presented, which support the above arguments.
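The pipeline the abstract describes — restrict the dictionary to a KNN neighborhood of the test sample, code the sample non-negatively over those neighbors, then decide by the SRC-style class-wise reconstruction residual — can be sketched as follows. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: the function name, the Euclidean neighborhood, and the use of SciPy's non-negative least-squares solver are all choices made here for the sketch.

```python
# Illustrative sketch of non-negative locally sparse representation
# classification: KNN neighborhood + non-negative least squares coding
# + SRC-style class-wise residual decision. Not the authors' code.
import numpy as np
from scipy.optimize import nnls

def nn_local_src_predict(X_train, y_train, x, k=10):
    """Classify sample x by non-negative coding over its k nearest neighbors."""
    # 1. Local neighborhood: the k nearest training samples (Euclidean distance).
    dists = np.linalg.norm(X_train - x, axis=1)
    idx = np.argsort(dists)[:k]
    A = X_train[idx].T            # local dictionary, one atom per column (d x k)
    labels = y_train[idx]
    # 2. Non-negative least-squares coding of x over the neighborhood atoms.
    coef, _ = nnls(A, x)
    # 3. SRC-style rule: assign the class whose atoms reconstruct x best.
    residuals = {}
    for c in np.unique(labels):
        mask = labels == c
        residuals[c] = np.linalg.norm(x - A[:, mask] @ coef[mask])
    return min(residuals, key=residuals.get)
```

Restricting the dictionary to the neighborhood makes the representation sparse by construction (at most k atoms) and injects KNN's locality, while the class-wise residual decision carries over SRC's reconstruction-based discrimination.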

Info:

Pages:

502-507

Online since:

March 2013

Copyright:

© 2013 Trans Tech Publications Ltd. All Rights Reserved

Citation:

[1] M. Aharon, M. Elad, and A. Bruckstein. K-SVD: An algorithm for designing overcomplete dictionaries for sparse representation. IEEE Trans. SP, 54: 4311-21, (2006).

DOI: 10.1109/tsp.2006.881199

[2] B. Olshausen, P. Sallee, and M. Lewicki. Learning sparse multiscale image representations. In NIPS, 15: 1327-1334, (2003).

[3] Y.C. Pati, R. Rezaiifar, and P.S. Krishnaprasad. Orthogonal matching pursuit: recursive function approximation with applications to wavelet decomposition. In the twenty seventh Asilomar Conference on Signals, Systems and Computers, 1: 40-44, (1993).

DOI: 10.1109/acssc.1993.342465

[4] S. Chen, D. L. Donoho, and M. A. Saunders. Atomic decomposition by basis pursuit. SIAM Journal on Scientific Computing, 20(1): 33-61, (1998).

DOI: 10.1137/s1064827596304010

[5] M. Elad, M. Aharon. Image denoising via sparse and redundant representation over learned dictionaries. IEEE Trans. IP, 15(12): 3736-44, (2006).

DOI: 10.1109/tip.2006.881969

[6] K. Huang, S. Aviyente. Sparse representation for signal classification. In NIPS, 19: 609-616, (2007).

[7] J. Wright, A. Yang, S. Sastry, and Y. Ma. Robust face recognition via sparse representation, IEEE Trans. PAMI. 31(2): 210–227, (2009).

DOI: 10.1109/tpami.2008.79

[8] D. D. Lee, H. S. Seung. Learning the parts of objects by non-negative matrix factorization. Nature, 401(6755): 788–791, (1999).

DOI: 10.1038/44565

[9] S. T. Roweis, L. K. Saul. Nonlinear dimensionality reduction by locally linear embedding. Science, 290(5500): 2323-2326, (2000).

[10] J. Wright, Y. Ma, J. Mairal, et al. Sparse Representation for Computer Vision and Pattern Recognition. In CVPR, (2009).

[11] http://yann.lecun.com/exdb/mnist.

[12] I. T. Jolliffe. Principal Component Analysis, Springer, NY, (1989).

[13] R. A. Fisher. The Use of Multiple Measurements in Taxonomic Problems. Ann. Eugenics, 7: 179-188, (1936).

[14] X. He, P. Niyogi, Locality preserving projections, In NIPS, (2003).
