Cheeger Cut Model for the Balanced Data Classification Problem

Abstract:

In this paper we propose a numerical method based on a splitting strategy to solve the Cheeger cut model. To improve the classification results, we introduce a new self-tuning strategy for choosing a robust scaling parameter. Numerical examples are presented to illustrate the efficiency of the proposed method.
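
The preview gives no further details of the model, so the sketch below is only illustrative of the two ingredients named in the abstract: a self-tuning Gaussian affinity graph (assuming the common heuristic of scaling each point by the distance to its k-th nearest neighbour, in the spirit of Zelnik-Manor and Perona's self-tuning spectral clustering) and the Cheeger cut value of a binary partition of that graph. The function names, the choice k = 7, and the NumPy implementation are assumptions for illustration, not the authors' method.

```python
import numpy as np

def self_tuning_affinity(X, k=7):
    """Gaussian affinity W_ij = exp(-||x_i - x_j||^2 / (sigma_i * sigma_j)),
    with sigma_i the distance from x_i to its k-th nearest neighbour
    (an assumed self-tuning rule; the paper's exact choice may differ)."""
    D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)  # pairwise distances
    sigma = np.sort(D, axis=1)[:, k]          # column k skips the zero self-distance
    W = np.exp(-D ** 2 / (np.outer(sigma, sigma) + 1e-12))
    np.fill_diagonal(W, 0.0)                  # no self-loops
    return W

def cheeger_cut_value(W, labels):
    """Ratio Cheeger cut of a binary partition: cut(S, S^c) / min(|S|, |S^c|)."""
    S = np.asarray(labels, dtype=bool)
    cut = W[S][:, ~S].sum()                   # total edge weight crossing the partition
    balance = min(S.sum(), (~S).sum())
    return cut / balance if balance > 0 else np.inf

# Toy usage: two well-separated blobs should yield a small Cheeger cut value.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 0.3, (30, 2)), rng.normal(3.0, 0.3, (30, 2))])
labels = np.array([0] * 30 + [1] * 30)
print(cheeger_cut_value(self_tuning_affinity(X), labels))
```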

Info:

Periodical: Advanced Materials Research (Volumes 765-767)

Pages: 730-734

Online since: September 2013

Copyright: © 2013 Trans Tech Publications Ltd. All Rights Reserved
