Apple Grading Using Principal Component Analysis and Kernel Fisher Discriminant Analysis Combined with NIR Spectroscopy

Abstract:

Principal component analysis (PCA) and kernel Fisher discriminant analysis (KFDA) were combined with near-infrared reflectance (NIR) spectroscopy to grade Fuji apples. First, PCA was used to reduce the dimensionality of the NIR spectra acquired from the apples with an Antaris II FT-NIR spectrophotometer. Second, nonlinear discriminant information was extracted by KFDA. Finally, a k-nearest-neighbors classifier with a leave-one-out strategy was used to assign the apple samples to two grades. Linear discriminant analysis (LDA) can only solve linearly separable problems and is not suited to nonlinear ones; KFDA, by contrast, handles nonlinearly separable problems by projecting the data into a high-dimensional feature space through a nonlinear mapping. Experimental results showed that KFDA achieved a higher classification rate than LDA.
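
The abstract does not include the authors' code, but the pipeline it describes (PCA compression, a two-class KFDA projection, and k-NN with leave-one-out validation) can be sketched as follows. This is only an illustrative reconstruction under stated assumptions: the `spectra` array is synthetic, the RBF kernel and all parameter values (8 principal components, gamma = 0.1, k = 3, the regularization term) are hypothetical choices, and scikit-learn/NumPy stand in for whatever tools the authors actually used.

```python
# Minimal sketch (not the authors' implementation) of a PCA -> KFDA -> k-NN
# grading pipeline, assuming a hypothetical array `spectra` of shape
# (n_samples, n_wavelengths) and binary grade labels `grades` in {0, 1}.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.model_selection import LeaveOneOut
from sklearn.neighbors import KNeighborsClassifier


def rbf_kernel(A, B, gamma=1.0):
    """Gaussian (RBF) kernel matrix between the rows of A and B."""
    sq = np.sum(A**2, axis=1)[:, None] + np.sum(B**2, axis=1)[None, :] - 2 * A @ B.T
    return np.exp(-gamma * sq)


def kfda_two_class(X, y, gamma=1.0, reg=1e-3):
    """Two-class kernel Fisher discriminant in the style of Mika et al. (1999).

    Returns dual coefficients alpha; a sample x is projected as
    sum_j alpha_j * k(x_j, x).
    """
    K = rbf_kernel(X, X, gamma)                    # n x n kernel matrix
    n = X.shape[0]
    m, N = [], np.zeros((n, n))
    for c in (0, 1):
        Kc = K[:, y == c]                          # kernel columns of class c
        nc = Kc.shape[1]
        m.append(Kc.mean(axis=1))                  # class mean in feature space
        N += Kc @ (np.eye(nc) - np.full((nc, nc), 1.0 / nc)) @ Kc.T
    # Regularized two-class solution of the generalized eigenproblem
    alpha = np.linalg.solve(N + reg * np.eye(n), m[0] - m[1])
    return alpha


# --- hypothetical data; replace with real NIR spectra and grade labels ---
rng = np.random.default_rng(0)
spectra = rng.normal(size=(60, 1557))              # stand-in FT-NIR absorbance values
grades = rng.integers(0, 2, size=60)               # two apple grades

# Step 1: PCA compresses the high-dimensional spectra to a few scores.
scores = PCA(n_components=8).fit_transform(spectra)

# Steps 2-3: KFDA projection + 1-D k-NN, evaluated with leave-one-out.
correct = 0
for train_idx, test_idx in LeaveOneOut().split(scores):
    X_tr, y_tr = scores[train_idx], grades[train_idx]
    alpha = kfda_two_class(X_tr, y_tr, gamma=0.1)
    z_tr = rbf_kernel(X_tr, X_tr, 0.1).T @ alpha   # projections of training samples
    z_te = rbf_kernel(X_tr, scores[test_idx], 0.1).T @ alpha
    knn = KNeighborsClassifier(n_neighbors=3).fit(z_tr.reshape(-1, 1), y_tr)
    correct += int(knn.predict(z_te.reshape(-1, 1))[0] == grades[test_idx][0])

print("leave-one-out classification rate:", correct / len(grades))
```

The kernel trick keeps the discriminant in dual form, so only the n x n kernel matrix is ever built; with an RBF kernel the projection is nonlinear in the PCA scores, which is the property the abstract contrasts against plain LDA.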

Info:

Pages: 529-533

Online since: June 2013

Copyright: © 2013 Trans Tech Publications Ltd. All Rights Reserved
