The Effects of UAV-Captured Image Degradation Issues on the Quality of 3D Reconstruction
Publication: Construction Research Congress 2022
ABSTRACT
Image-based 3D reconstruction is a prerequisite technology for establishing as-built spatial data of the built environment for many purposes, including progress monitoring of construction sites, quality control, energy performance assessment, and structural integrity evaluation. Unmanned aerial vehicles (UAVs) are an emerging technology that enables effective remote image acquisition through integrated digital cameras, and they have been widely applied to data surveying in various fields due to their lower cost, ease of use, and flexible access to hard-to-reach locations. While UAVs offer great potential for efficiently capturing high-quality images for 3D reconstruction, the spatial data collection process is prone to image degradation. External environmental factors such as wind, which can destabilize the UAV platform, and inappropriate camera settings can degrade the quality of captured images through motion blur, overexposure, and underexposure. In this paper, the influence of image degradation on image-based 3D reconstruction is explored. Four commonly occurring degradation sources are studied: motion blur, out-of-focus blur, underexposure, and overexposure. The degradation effects were artificially generated on images of a real scenario, and the quality of the resulting 3D reconstructions was evaluated. In the experiments, the intensity of degradation and the percentage of degraded images were varied for each degradation source to understand their effects on reconstruction quality. The results indicated that motion-blurred images led to a significant loss of semantic information and posed the greatest impact on 3D reconstruction quality.
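The paper does not publish its degradation-generation code, but the four degradation sources named in the abstract are commonly simulated with simple image operations: a line-shaped averaging kernel for motion blur, a Gaussian kernel for out-of-focus blur, and a gamma adjustment for exposure shifts. The sketch below illustrates one such approach on grayscale images with values in [0, 1]; the function names, kernel sizes, and gamma values are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def motion_blur(img, length=9, horizontal=True):
    """Approximate linear motion blur by averaging `length` pixels along one axis.
    (Illustrative stand-in for platform-instability blur; not the paper's code.)"""
    k = np.ones(length) / length
    axis = 1 if horizontal else 0
    return np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), axis, img)

def defocus_blur(img, sigma=2.0, radius=6):
    """Approximate out-of-focus blur with a separable Gaussian kernel."""
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    k /= k.sum()
    rows = np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 1, img)
    return np.apply_along_axis(lambda v: np.convolve(v, k, mode="same"), 0, rows)

def exposure_shift(img, gamma):
    """Gamma < 1 brightens toward overexposure; gamma > 1 darkens toward
    underexposure, for images scaled to [0, 1]."""
    return np.clip(img, 0.0, 1.0) ** gamma
```

Varying the kernel length/sigma and the gamma value controls the intensity of degradation, while applying the functions to only a subset of the image set controls the percentage of degraded images, matching the two experimental variables described above.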
Published online: Mar 7, 2022