Utilization of Open-Source Photogrammetry Software with Structure from Motion Algorithm for Three-Dimensional Modeling

Abstract:

Reconstructing a three-dimensional model of a real-world object requires a quality point cloud to produce an accurate result. Point clouds can be produced by image-based and distance-based methods. The image-based method uses the light reflected by the object, which is then processed according to the principles of photogrammetry. This research tests the image-based method, with the help of the Structure from Motion (SfM) approach, by assessing the quality of point clouds generated by the open-source packages VisualSFM, MicMac, and Meshroom and by the commercial software Agisoft Metashape Pro. The images were taken with a Xiaomi Yi action camera with a 2.68 mm focal length. The performance comparison considers volume density, surface variation, and point discrepancy. A BLK360 terrestrial laser scanner was used to collect the reference point cloud. The results show that all of the software packages provide convincing results for each parameter compared.
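The comparison metrics named in the abstract can be illustrated with a small sketch. The snippet below is not the authors' implementation (tools such as CloudCompare compute these far more efficiently with octree-accelerated searches); the function names and the k-nearest-neighbour size are illustrative assumptions. Point discrepancy is approximated as the nearest-neighbour distance from each test point to the reference cloud, and surface variation as the smallest-eigenvalue ratio of the local covariance matrix.

```python
import numpy as np

def cloud_to_cloud_distances(src, ref):
    """Nearest-neighbour (cloud-to-cloud) distance from each point in
    `src` to the reference cloud `ref`. Brute force: fine for a demo,
    too slow for real scans."""
    # pairwise distance matrix, shape (len(src), len(ref))
    d = np.linalg.norm(src[:, None, :] - ref[None, :, :], axis=2)
    return d.min(axis=1)

def surface_variation(points, k=8):
    """Per-point surface variation: smallest eigenvalue of the local
    covariance over the k nearest neighbours, divided by the eigenvalue
    sum. Near 0 on planar patches, larger on edges and noisy regions."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    out = np.empty(len(points))
    for i in range(len(points)):
        nbrs = points[np.argsort(d[i])[:k]]       # k nearest neighbours
        eig = np.sort(np.linalg.eigvalsh(np.cov(nbrs.T)))
        s = eig.sum()
        out[i] = eig[0] / s if s > 0 else 0.0
    return out
```

For example, a cloud compared against itself yields zero discrepancy everywhere, and a perfectly planar patch yields surface variation of essentially zero.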

Info:

Periodical:

Engineering Headway (Volume 27)

Pages:

698-707

Online since:

October 2025

Copyright:

© 2025 Trans Tech Publications Ltd. All Rights Reserved
