An Incremental Locally Linear Embedding Algorithm with Non-Negative Constraints of the Weights


Abstract:

Locally Linear Embedding (LLE) is a batch method: when a new sample is added, the whole algorithm must be rerun and all previous computational results are discarded. This paper analyzes how the LLE algorithm processes new sample points. To address the insufficient precision of traditional incremental LLE, an incremental LLE algorithm with non-negative weight constraints is proposed: non-negative constraints are imposed on the linear reconstruction weights of new sample points during projection. The proposed algorithm avoids the simple fitting that the original algorithm relies on in engineering applications, and it solves the problem of continually updating the whole manifold as new samples are added. Comparisons with the traditional incremental LLE method on S-curve simulation data and an engineering example demonstrate the feasibility and effectiveness of the proposed algorithm.
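The general idea behind this family of incremental LLE methods can be sketched as follows: a new point is mapped into the existing embedding by reconstructing it from its nearest training neighbours with non-negative, sum-to-one weights, then applying the same weights to the neighbours' embedded coordinates. The sketch below is an illustration of that general scheme under stated assumptions (k-nearest-neighbour search and `scipy.optimize.nnls` for the non-negative least-squares step), not the authors' exact formulation; `project_new_point` and its parameters are hypothetical names.

```python
# Sketch of out-of-sample projection with non-negative reconstruction
# weights, in the spirit of incremental LLE. Illustrative only: the
# paper's exact constrained formulation may differ.
import numpy as np
from scipy.optimize import nnls

def project_new_point(x_new, X_train, Y_train, k=5):
    """Embed x_new given high-dimensional training data X_train and
    their existing low-dimensional embedding Y_train, without rerunning
    the batch LLE algorithm.

    1. Find the k nearest training neighbours of x_new.
    2. Solve for non-negative weights w >= 0 that best reconstruct
       x_new from those neighbours, then normalise so sum(w) == 1.
    3. Map x_new to the same weighted combination of the neighbours'
       embedded coordinates.
    """
    # Step 1: k nearest neighbours in the input space
    dists = np.linalg.norm(X_train - x_new, axis=1)
    idx = np.argsort(dists)[:k]
    neighbours = X_train[idx]            # shape (k, D)

    # Step 2: non-negative least squares  min ||neighbours.T @ w - x_new||
    w, _ = nnls(neighbours.T, x_new)
    if w.sum() > 0:
        w = w / w.sum()                  # enforce sum-to-one convex weights
    else:
        w = np.full(k, 1.0 / k)          # degenerate case: uniform weights

    # Step 3: apply the same weights in the embedded space
    return w @ Y_train[idx]
```

Because the weights are constrained to be non-negative and sum to one, the new point is embedded inside the convex hull of its neighbours' low-dimensional coordinates, which is what keeps the projection stable as samples are added.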


Info:

Pages: 478-484

Online since: September 2013

Copyright: © 2013 Trans Tech Publications Ltd. All Rights Reserved
