Parameters Selection of LLE Algorithm for Classification Tasks

Article Preview

Abstract:

The crux of the locally linear embedding (LLE) algorithm is the selection of the embedding dimensionality and the neighborhood size. A parameter selection method based on the normalized cut (Ncut) criterion is proposed for classification tasks. Unlike current techniques based on the neighborhood topology preservation criterion, the proposed method exploits the class separability of the embedding result. By taking class separability into account, the intrinsic capability of LLE is reflected more faithfully, and hence more suitable features for classification in real-life applications are obtained. The theoretical argument is supported by experimental results on synthetic and real data sets.

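To make the selection procedure concrete, below is a minimal sketch of an Ncut-driven grid search over the two LLE parameters, assuming scikit-learn's LocallyLinearEmbedding and an RBF similarity graph on the embedded points; the helpers ncut_score and select_lle_parameters and the gamma value are illustrative choices, not the exact implementation from the paper.

import numpy as np
from sklearn.manifold import LocallyLinearEmbedding
from sklearn.metrics.pairwise import rbf_kernel

def ncut_score(Y, labels, gamma=1.0):
    # Normalized-cut value of the class partition on an RBF similarity
    # graph built over the embedded points Y; lower means the classes
    # are better separated. gamma=1.0 is an illustrative choice.
    labels = np.asarray(labels)
    W = rbf_kernel(Y, gamma=gamma)     # pairwise similarities
    np.fill_diagonal(W, 0.0)           # no self-loops
    score = 0.0
    for c in np.unique(labels):
        in_c = labels == c
        cut = W[in_c][:, ~in_c].sum()  # edge weight leaving class c
        assoc = W[in_c].sum()          # edge weight touching class c
        if assoc > 0:
            score += cut / assoc
    return score

def select_lle_parameters(X, labels, k_range, d_range):
    # Grid search over neighborhood size k and embedding dimension d,
    # keeping the pair whose LLE embedding minimizes the Ncut score.
    best = None
    for k in k_range:
        for d in d_range:
            if k <= d:                 # LLE needs more neighbors than dimensions
                continue
            Y = LocallyLinearEmbedding(n_neighbors=k, n_components=d).fit_transform(X)
            s = ncut_score(Y, labels)
            if best is None or s < best[0]:
                best = (s, k, d)
    return best                        # (score, k, d)

For example, select_lle_parameters(X, y, range(5, 31), range(1, 6)) would return the (score, k, d) triple with the smallest Ncut value on the labelled training data; the search ranges are problem-dependent and shown here only for illustration.
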
Info:

Pages:

422-427

Online since:

October 2014

Copyright:

© 2014 Trans Tech Publications Ltd. All Rights Reserved
