Application of Machine Vision Technology in the Detection of Industrial Products


Abstract:

This paper introduces several applications of machine vision technology in the detection of industrial products. With an area-array CCD as the image sensor, two-dimensional images of the sample under study are input to a computer via an image grabbing card. After preprocessing of the original image, segmentation and feature extraction are performed on the target image, and the parameter values are computed. The method is suitable for detecting parameters such as size, surface defects and color information, and has broad application prospects.
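The pipeline the abstract describes (grab a frame, preprocess, segment, extract features, compute a parameter value) can be sketched in a few lines. This is a minimal illustration, not the paper's implementation: the synthetic 8x8 frame, the fixed threshold of 128, and the bounding-box size feature are all assumptions chosen for the example.

```python
# Hypothetical sketch of the detection pipeline from the abstract.
# Assumes a grayscale frame already grabbed from the CCD; here the
# frame, threshold and feature choice are illustrative assumptions.

def segment(image, threshold=128):
    """Segmentation: pixels brighter than the threshold belong to the target."""
    return [[1 if px > threshold else 0 for px in row] for row in image]

def bounding_box(mask):
    """Feature extraction: bounding box (r0, c0, r1, c1) of the target."""
    coords = [(r, c) for r, row in enumerate(mask)
              for c, v in enumerate(row) if v]
    rows = [r for r, _ in coords]
    cols = [c for _, c in coords]
    return min(rows), min(cols), max(rows), max(cols)

# Synthetic frame: a bright 3x4 rectangle (the "product") on a dark background.
frame = [[200 if 2 <= r <= 4 and 1 <= c <= 4 else 30 for c in range(8)]
         for r in range(8)]

mask = segment(frame)
r0, c0, r1, c1 = bounding_box(mask)
height_px, width_px = r1 - r0 + 1, c1 - c0 + 1
print(height_px, width_px)  # pixel size; a camera calibration maps this to mm
```

In a real system the thresholding step would typically be replaced by an adaptive method, and the pixel dimensions converted to physical units through a calibration against a reference target.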


Info:

Pages: 366-369

Online since: April 2012

Copyright: © 2012 Trans Tech Publications Ltd. All Rights Reserved
