Shape Measurement Based on 3D Optical Scanner: Real Case Study in the Aeronautics Industry

Article Preview

Abstract:

Three-dimensional (3-D) geometric shape measurement has found wide application in industrial manufacturing, rapid reverse engineering, quality control, the biomedical sciences, etc. In the present paper we focus our attention on reverse engineering, which starts from an existing product and creates a CAD (Computer Aided Design) model for modification or reproduction of its design. This kind of process is usually undertaken in order to redesign a system for better maintainability, or to obtain a copy of a system without access to the design from which it was originally produced. There are many different methods for acquiring shape data. Tactile methods represent a popular approach to shape capture; the most commonly known forms are Coordinate Measuring Machines (CMMs) and mechanical or robotic arms with a touch-probe sensing device. Non-contact methods use light, sound or magnetic fields to acquire shape from objects. In both the contact and non-contact cases, an appropriate analysis must be performed to determine the positions of points on the object's surface. The aim of this paper is to present a new three-dimensional measurement and reconstruction method for object acquisition: reverse engineering based on a structured light system. This technique consists in projecting a known pattern of pixels (structured light) onto an object; the way in which this pattern is deformed by the object's surfaces allows the vision system (a pair of monochromatic digital cameras) to calculate the depth information needed to digitize the surfaces. This methodology was subsequently applied to re-engineer an aeronautical component that had changed over time and diverged from the original design. Finally, proposals and possible solutions are studied that ensure a higher quality of the manufactured products and substantial savings in the cost of the product's production system.
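As an illustration of the principle described in the abstract, the following minimal Python sketch shows how a structured-light stereo pair can recover depth. It is not the authors' implementation: the Gray-code stripe pattern, the function names and the use of NumPy are assumptions made for this example. Thresholded Gray-code images give every camera pixel a stripe index, pixels carrying the same index in the two cameras are taken as correspondences, and linear (DLT) triangulation with the calibrated camera projection matrices returns 3-D surface points.

# Minimal sketch (hypothetical, not the paper's code) of structured-light
# stereo depth recovery: decode the projected Gray-code pattern, then
# triangulate matched pixels from two calibrated cameras.
import numpy as np

def decode_gray_code(bit_images):
    """Convert a stack of thresholded Gray-code images (H x W x N booleans,
    most significant bit first) into an integer stripe index per pixel."""
    bits = bit_images.astype(np.uint32)
    # Gray -> binary: each binary bit is the cumulative XOR of the Gray bits,
    # which equals the cumulative sum modulo 2.
    binary = np.cumsum(bits, axis=2) % 2
    weights = 2 ** np.arange(bits.shape[2] - 1, -1, -1, dtype=np.uint32)
    return (binary * weights).sum(axis=2)

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation of one correspondence.
    P1, P2: 3x4 projection matrices of the two calibrated cameras;
    x1, x2: matching (u, v) pixel coordinates found via the stripe index."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # homogeneous -> Euclidean 3-D point

In a pipeline of this kind, the decoded index maps of the two cameras would typically be intersected row by row, each matching pixel pair passed to triangulate(), and the resulting point cloud meshed and compared against the nominal CAD model.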

Info:

Pages: 378-387

Online since: October 2014

Copyright: © 2014 Trans Tech Publications Ltd. All Rights Reserved

