Special Collection Announcements
Sep 17, 2021

Unmanned Aerial Systems (UASs) in the AECO Industry

Publication: Journal of Computing in Civil Engineering
Volume 36, Issue 1
The special collection on Unmanned Aerial Systems (UASs) in the AECO Industry is available in the ASCE Library (https://ascelibrary.org/jccee5/unmanned_aerial_system).
Unmanned aerial systems (UASs) have rapidly permeated architecture, engineering, construction, and operations (AECO) practices over the past several years, and projections indicate continued growth of their use within these fields for the foreseeable future. New generations of UASs require minimal human involvement, reducing jobsite risks. Moreover, advances in sensors, batteries, and autonomous features have made UASs far more reliable platforms for AECO applications. Today's commercial UASs can be equipped with various sensors, navigate and collect data autonomously, and transfer the data to a control station in real time. UASs can also be combined with many other technologies to enhance AECO practices. These opportunities hold tremendous potential for the engineering and construction community. To accommodate the changes UASs bring to the industry, further studies should investigate the methodologies, best practices, advantages, and limitations of this disruptive technology. This special collection aims to advance knowledge of UAS applications in civil engineering by presenting studies on the current integration of UASs in engineering applications and across various stages of the construction life cycle.
We received a high number of manuscripts from researchers across the world; a summary of the accepted articles is provided here. Several studies used deep learning methods for enhanced surveying- and inspection-related applications. Jiang et al. (2020) presented a convolutional neural network (CNN) method to identify vegetation objects on a construction site using drone-based orthoimagery and determine true ground surface elevations from the raw surveying results. Beyond distinguishing vegetation from nonvegetation, the developed model can automatically identify and locate multiple categories of static objects in the raw surveying results. The study concluded that image-based construction surveying and ground elevation measurement can be facilitated and conducted more accurately using UASs. Pi et al. (2021) presented a fully annotated dataset of disaster damage features and a set of CNN models to detect and segment such features in aerial footage of disaster sites. Two CNN architectures for image segmentation, Mask-RCNN and PSPNet, were adopted, trained, validated, and tested on annotated videos to detect individual and bulk objects. The developed models showed a high degree of robustness in extracting pixel-level segmentations from aerial imagery for damage assessment and quantification in the aftermath of natural disasters. Two studies specifically focused on using UAS-based systems to detect and localize surface distresses on bridges and roads. Nasiruddin Khilji et al. (2021) presented a framework combining deep neural networks and UASs to detect major distresses on unpaved road surfaces; the results showed promising performance in segmenting road surface pixels and classifying pothole and washboarding defects. Lin et al. (2021) presented a UAS-based bridge inspection system that creates automatic data collection missions, assures the visual quality of those missions, reconstructs three-dimensional (3D) models of elevated structures, detects and localizes surface distresses in 3D, and generates reports complying with highway agencies' requirements. They deployed their system in 30 bridge inspection projects and concluded that it satisfies data collection requirements and provides up to 85.3% average precision over five defect types.
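Detection quality in inspection studies of this kind is typically reported as average precision (AP) per defect type, averaged over the classes. The sketch below is a generic, hedged illustration of that metric (the all-point interpolation of the precision-recall curve), not the evaluation code from the cited study; all names and numbers are illustrative.

```python
# Hypothetical sketch: mean average precision (mAP) over defect classes,
# using the all-point interpolation of the precision-recall curve.
# Illustrative only; not the cited study's evaluation code.

def average_precision(recalls, precisions):
    """AP as the area under an interpolated precision-recall curve.

    `recalls` must be sorted ascending; precision is interpolated to be
    monotonically non-increasing from right to left (the common convention).
    """
    # Pad the curve so it spans recall 0..1.
    r = [0.0] + list(recalls) + [1.0]
    p = [0.0] + list(precisions) + [0.0]
    # Right-to-left running maximum makes precision non-increasing.
    for i in range(len(p) - 2, -1, -1):
        p[i] = max(p[i], p[i + 1])
    # Sum rectangle areas where recall increases.
    return sum((r[i + 1] - r[i]) * p[i + 1] for i in range(len(r) - 1))

def mean_average_precision(per_class_curves):
    """Average the per-class APs, e.g. over five defect types."""
    aps = [average_precision(rec, prec) for rec, prec in per_class_curves]
    return sum(aps) / len(aps)
```

A figure such as "85.3% average precision over five defect types" would correspond to `mean_average_precision` over five such per-class curves.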
Kim et al. (2021) proposed a framework for inspecting airfield runway design code compliance that relies on mosaic imagery. Scale-invariant feature transform and best-bin-first algorithms were integrated to generate accurate UAS-based mosaic imagery for airports. Validation showed that the framework measured pixel-based distances for runway design code items with accuracy comparable to manual airport inspections. The study contributes to a better understanding of UAS-based airport inspection applications and strengthens and broadens the utility of UASs in visual inspections of civil infrastructure systems. In another study, Martinez et al. (2021) investigated the effect of single- and dual-frequency global navigation satellite system (GNSS) receivers, together with the postprocessed kinematic (PPK) technique, on the accuracy of UAS-generated point cloud data (PCD) in a building surveying application. The outcomes provide a better understanding of how various UAS technical configurations and flight parameters affect the accuracy of UAS-generated PCD for building surveying applications.
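Pixel-based distance checks like those in the runway-compliance framework ultimately rest on the ground sampling distance (GSD) of the imagery. The following is a rough sketch of the standard photogrammetric relation for a nadir-pointing camera, not code from the cited study; the camera parameters are illustrative assumptions.

```python
# Hypothetical sketch: converting a pixel distance in nadir UAS imagery to a
# ground distance via ground sampling distance (GSD). Standard photogrammetry
# relation; the camera parameters below are illustrative, not from the study.

def ground_sampling_distance(sensor_width_mm, focal_length_mm,
                             altitude_m, image_width_px):
    """Ground footprint of one pixel, in metres, for a nadir-pointing camera."""
    ground_width_m = sensor_width_mm * altitude_m / focal_length_mm
    return ground_width_m / image_width_px

def pixel_to_ground_distance(pixel_distance_px, gsd_m):
    """Scale a measured pixel distance to metres on the ground."""
    return pixel_distance_px * gsd_m

# Example: 13.2 mm sensor, 8.8 mm lens, flown at 100 m, 5472-px-wide image.
gsd = ground_sampling_distance(13.2, 8.8, 100.0, 5472)   # ~0.027 m per pixel
marking_length_m = pixel_to_ground_distance(1000, gsd)   # ~27.4 m on the ground
```

Lower flight altitude or a longer focal length shrinks the GSD, which is what lets such frameworks resolve design-code dimensions to within manual-inspection tolerances.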
Two studies used augmented and virtual reality technologies within their UAS-based systems. Liu et al. (2021) developed and tested an augmented reality (AR)-enhanced inspection workflow using videos from a UAV. This integration allows seamless retrieval of building information modeling (BIM) data to augment the aerial video captured by a UAV inspector. A fast coordinate transformation algorithm (F-Trans) converts WGS-84 coordinates collected by the UAV into BIM project coordinates. The AR solution was experimentally validated, and the results demonstrate that F-Trans achieves submeter precision in matching the aerial video to the BIM. The developed AR inspection solution can facilitate efficient, comprehensive, and unbiased UAV-based building inspections. In another study, Sakib et al. (2021) used immersive virtual reality (VR) and physiological data models to evaluate the effectiveness of drone operation training. Specifically, they examined the reliability of using a drone operator's physiological indexes and self-assessments to predict performance, mental workload (MWL), and stress in immersive VR training and outdoor deployment. The study contributes to the body of knowledge by providing a scalable approach to objectively quantifying performance, MWL, and stress that can inform the design of adaptive training systems for drone operators.
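The core geometric step in such AR workflows, mapping WGS-84 positions from the UAV into a local project frame, can be illustrated with a generic local tangent-plane approximation. This sketch is NOT the F-Trans algorithm from Liu et al. (2021); it is the standard equirectangular small-area approximation, and the origin and points are illustrative.

```python
import math

# Hypothetical sketch: projecting WGS-84 latitude/longitude onto a local
# east-north plane anchored at a project origin. Generic small-area
# approximation, not the F-Trans algorithm from the cited study.

EARTH_RADIUS_M = 6_371_000.0  # mean Earth radius

def wgs84_to_local(lat_deg, lon_deg, origin_lat_deg, origin_lon_deg):
    """Return (east_m, north_m) of a point relative to the project origin.

    Accurate to well under a metre over building-scale extents, which is
    on the order of the submeter matching precision AR overlays require.
    """
    lat0 = math.radians(origin_lat_deg)
    d_lat = math.radians(lat_deg - origin_lat_deg)
    d_lon = math.radians(lon_deg - origin_lon_deg)
    east = EARTH_RADIUS_M * d_lon * math.cos(lat0)  # longitude circles shrink with latitude
    north = EARTH_RADIUS_M * d_lat
    return east, north
```

A production pipeline would additionally rotate and translate these east-north offsets into the BIM project's own axes and datum, which is where the engineering effort of an algorithm like F-Trans lies.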
Further studies should be conducted on UAS integration in pre-, during-, and postconstruction processes (e.g., project evaluation, site planning, site mapping/surveying, earthmoving, building inspection, aerial construction, material handling, site communication, safety management, security surveillance, progress monitoring, building maintenance, and postdisaster reconnaissance), identifying their advantages and implementation challenges and comparing them with conventional non-UAS processes. Furthermore, successful UAS integration with other technologies and technological concepts (e.g., building information modeling, photo/videogrammetry, virtual/augmented/mixed reality, wearable technologies, big data, 5G technology, smart connected solutions, artificial intelligence, and machine learning predictive analytics) across the construction life cycle should also be explored. UASs should be considered carriers of different types of sensors [e.g., visual, thermal, radio-frequency identification (RFID), light detection and ranging (LIDAR), motion, air quality, and audio sensors] in the construction industry, and further studies should explore the integration of such data-capturing sensors with UAS-mediated construction procedures. With the rise of other types of robots (e.g., rovers, crawlers, and autonomous vehicles) in construction and their collaborative work with UASs and humans on jobsites, further studies are required to explore the safe and successful integration of ubiquitous, collaborative UASs working alongside humans and other robots on the construction jobsite. Further integration of UASs with the internet of things is also expected, and additional research is needed to explore the implementation of the internet of drones and drone swarms in the construction industry.

References

Jiang, Y., Y. Bai, and S. Han. 2020. “Determining ground elevations covered by vegetation on construction sites using drone-based orthoimage and convolutional neural network.” J. Comput. Civ. Eng. 34 (6): 04020049. https://doi.org/10.1061/(ASCE)CP.1943-5487.0000930.
Kim, S., Y. Gan, and J. Irizarry. 2021. “Framework for UAS-integrated airport runway design code compliance using incremental mosaic imagery.” J. Comput. Civ. Eng. 35 (2): 04020070. https://doi.org/10.1061/(ASCE)CP.1943-5487.0000960.
Lin, J. J., A. Ibrahim, S. Sarwade, and M. Golparvar-Fard. 2021. “Bridge inspection with aerial robots: Automating the entire pipeline of visual data capture, 3D mapping, defect detection, analysis, and reporting.” J. Comput. Civ. Eng. 35 (2): 04020064. https://doi.org/10.1061/(ASCE)CP.1943-5487.0000954.
Liu, D., X. Xia, J. Chen, and S. Li. 2021. “Integrating building information model and augmented reality for drone-based building inspection.” J. Comput. Civ. Eng. 35 (2): 04020073. https://doi.org/10.1061/(ASCE)CP.1943-5487.0000958.
Martinez, J. G., G. Albeaino, M. Gheisari, W. Volkmann, and L. F. Alarcón. 2021. “UAS point cloud accuracy assessment using structure from motion-based photogrammetry and PPK georeferencing technique for building surveying applications.” J. Comput. Civ. Eng. 35 (1): 05020004. https://doi.org/10.1061/(ASCE)CP.1943-5487.0000936.
Nasiruddin Khilji, T., L. Lopes Amaral Loures, and E. Rezazadeh Azar. 2021. “Distress recognition in unpaved roads using unmanned aerial systems and deep learning segmentation.” J. Comput. Civ. Eng. 35 (2): 04020061. https://doi.org/10.1061/(ASCE)CP.1943-5487.0000952.
Pi, Y., N. D. Nath, and A. H. Behzadan. 2021. “Detection and semantic segmentation of disaster damage in UAV footage.” J. Comput. Civ. Eng. 35 (2): 04020063. https://doi.org/10.1061/(ASCE)CP.1943-5487.0000947.
Sakib, M. N., T. Chaspari, and A. H. Behzadan. 2021. “Physiological data models to understand the effectiveness of drone operation training in immersive virtual reality.” J. Comput. Civ. Eng. 35 (1): 04020053. https://doi.org/10.1061/(ASCE)CP.1943-5487.0000941.


History

Received: Jul 14, 2021
Accepted: Aug 11, 2021
Published online: Sep 17, 2021
Published in print: Jan 1, 2022
Discussion open until: Feb 17, 2022


Authors

Affiliations

Assistant Professor, Rinker School of Construction Management, Univ. of Florida, 322 Rinker Hall, Gainesville, FL 32611-5703 (corresponding author). ORCID: https://orcid.org/0000-0001-5568-9923.
Dayana Bastos Costa
Associate Professor, Dept. of Structural and Construction Engineering, School of Engineering, Federal Univ. of Bahia, Aristides Novis, 02, Federação, Salvador, BA 40210-630, Brazil.
Javier Irizarry, M.ASCE
Associate Professor, School of Building Construction, Georgia Institute of Technology, 280 Ferst Dr., Atlanta, GA 30332-0680.
