Technical Papers
Jan 28, 2022

Efficient Approach for Autonomous Facility Inspection Using UAV Images

Publication: Journal of Infrastructure Systems
Volume 28, Issue 2

Abstract

Civil infrastructure systems, such as harbors and airports, support transportation between international communities. Inspecting facilities distributed over a large area is labor-intensive and time-consuming, and the results can be inaccurate due to human limitations and inexperience. In this study, an autonomous aerial platform with a multipoint patrol module is developed to acquire images of inspection targets. It integrates a high-definition digital surface model (DSM) with a route-searching algorithm to optimize flight-route planning. The collected unmanned aerial vehicle (UAV) images are then processed with a matching technique that automatically detects exposure positions and corrects image distortions. Finally, target facilities are extracted from multitemporal images using object detection techniques so that the status of the inspected targets can be tracked and evaluated. Case studies in real-world settings were conducted. Because the proposed approach improves the efficiency and reliability of the facility inspection task, significant gains are expected in wide-area monitoring and flexible management.
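The route-planning step described above pairs a DSM with a route-searching algorithm. As a minimal sketch (not the paper's implementation), the following runs Dijkstra's shortest-path search over a toy DSM sampled as a grid, using a hypothetical cost model that penalizes elevation change between adjacent cells; the function name `plan_route`, the `climb_penalty` parameter, and the toy elevation values are all assumptions for illustration.

```python
import heapq

def plan_route(dsm, start, goal, climb_penalty=2.0):
    """Dijkstra shortest-path search over a grid-sampled DSM.

    dsm: 2D list of elevations (stand-in for a high-definition DSM).
    Edge cost = 1 unit of horizontal travel plus a penalty
    proportional to the elevation change (hypothetical cost model).
    Returns (path, total_cost), where path is a list of (row, col).
    """
    rows, cols = len(dsm), len(dsm[0])
    dist = {start: 0.0}   # best known cost to each cell
    prev = {}             # back-pointers for path reconstruction
    pq = [(0.0, start)]   # min-heap ordered by cost
    while pq:
        d, node = heapq.heappop(pq)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale heap entry
        r, c = node
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                step = 1.0 + climb_penalty * abs(dsm[nr][nc] - dsm[r][c])
                nd = d + step
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = node
                    heapq.heappush(pq, (nd, (nr, nc)))
    # Walk the back-pointers from goal to start, then reverse.
    path, node = [], goal
    while node != start:
        path.append(node)
        node = prev[node]
    path.append(start)
    return path[::-1], dist[goal]

# The planned route detours around high-elevation cells.
dsm = [[0, 0, 5],
       [0, 9, 5],
       [0, 0, 0]]
path, cost = plan_route(dsm, (0, 0), (2, 2))
```

Replacing the heap priority with cost-so-far plus an admissible distance estimate would turn this into A* search, a common refinement for UAV path planning.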

Data Availability Statement

All data generated or used during the study are available from the corresponding author upon reasonable request.

Acknowledgments

The authors would like to thank the anonymous reviewers for their constructive comments, which helped to improve the quality of the original manuscript. The authors would also like to express their gratitude to the Harbor and Marine Technology Center in Taiwan for funding this study under Grant MOTC-IOT-110-H2CB001j.



History

Received: Sep 14, 2021
Accepted: Dec 13, 2021
Published online: Jan 28, 2022
Published in print: Jun 1, 2022
Discussion open until: Jun 28, 2022


Authors

Affiliations

Project Assistant Researcher, National Center for Research on Earthquake Engineering, National Applied Research Laboratories, No. 200, Sec. 3, Xinhai Rd., Taipei 106219, Taiwan. ORCID: https://orcid.org/0000-0002-9893-5095
Yi-Hsuan Kan
Research Associate, Dept. of Civil Engineering, National Taiwan Univ., No. 1, Sec. 4, Roosevelt Rd., Taipei 10617, Taiwan.
Professor, Dept. of Civil Engineering, National Taiwan Univ., No. 1, Sec. 4, Roosevelt Rd., Taipei 10617, Taiwan (corresponding author). ORCID: https://orcid.org/0000-0001-9555-4214. Email: [email protected]


