Technical Papers
Sep 11, 2023

Automatic Detection and Classification of Underground Objects in Ground Penetrating Radar Images Using Machine Learning

Publication: Journal of Pipeline Systems Engineering and Practice
Volume 14, Issue 4

Abstract

Ground penetrating radar (GPR) is a nondestructive tool widely used in subsurface utility mapping, and it has gained popularity in supporting underground drilling projects such as horizontal directional drilling (HDD). Despite its benefits, including equipment portability, low cost, and high versatility in locating underground objects, GPR has the drawback of the time and expertise required for data interpretation. Recent research has shown success in applying machine learning (ML) algorithms to GPR images for the automatic detection of underground objects. However, because labeled GPR datasets are scarce, most of these algorithms have relied on synthetic data. This study presents the application of the state-of-the-art You Only Look Once (YOLO) v5 algorithm to detect underground objects in GPR images. A GPR dataset was prepared by collecting images in a laboratory setup using a commercially available 2-GHz high-frequency GPR antenna; the dataset comprises images of metal and PVC pipes, air and water voids, and boulders. The YOLOv5 algorithm, trained on this dataset, successfully detected and classified underground objects into their respective classes.
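A dataset like the one described above would typically be annotated with a tool such as LabelImg (cited below) and exported in the YOLO label format, where each bounding box is stored as a class index followed by center coordinates and dimensions normalized to the image size. A minimal sketch of that conversion follows; the class names and image dimensions here are illustrative assumptions, not taken from the paper:

```python
# Convert a pixel-space bounding box (xmin, ymin, xmax, ymax) into the
# YOLO label format: "class_id x_center y_center width height", with all
# geometry normalized to [0, 1] by the image dimensions.

# Hypothetical class list mirroring the object types in the study.
CLASSES = ["metal_pipe", "pvc_pipe", "air_void", "water_void", "boulder"]

def to_yolo_label(class_name, xmin, ymin, xmax, ymax, img_w, img_h):
    cls = CLASSES.index(class_name)
    x_c = (xmin + xmax) / 2.0 / img_w   # normalized box center, x
    y_c = (ymin + ymax) / 2.0 / img_h   # normalized box center, y
    w = (xmax - xmin) / img_w           # normalized box width
    h = (ymax - ymin) / img_h           # normalized box height
    return f"{cls} {x_c:.6f} {y_c:.6f} {w:.6f} {h:.6f}"

# Example: a 100x50-pixel hyperbola signature centered in a 640x480 B-scan
print(to_yolo_label("metal_pipe", 270, 215, 370, 265, 640, 480))
# → 0 0.500000 0.500000 0.156250 0.104167
```

One such text line is written per annotated object, in a `.txt` file sharing the image's filename, which is the input format the YOLOv5 training pipeline expects.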

Data Availability Statement

Some or all data, models, or code that support the findings of this study are available from the corresponding author upon reasonable request.

Acknowledgments

The authors would like to thank Lana Gutwin of the Canadian Underground Infrastructure Innovation Centre (CUIIC) at the University of Alberta for her assistance with editing this manuscript. Funding from the Natural Sciences and Engineering Research Council of Canada (NSERC) is gratefully acknowledged. This research article is derived from a master’s thesis published by the University of Alberta Libraries (Amaral 2021).

References

Al-Nuaimy, W., Y. Huang, A. Eriksen, and V. T. Nguyen. 2001. “Automatic detection of hyperbolic signatures in ground-penetrating radar data.” In Subsurface and surface sensing technologies and applications III, 327–335. Bellingham, WA: International Society for Optics and Photonics.
Alzubi, J., A. Nayyar, and A. Kumar. 2018. “Machine learning from theory to algorithms: An overview.” J. Phys. Conf. Ser. 1142 (Mar): 012012. https://doi.org/10.1088/1742-6596/1142/1/012012.
Amaral, L. C. M. 2021. Automatic detection of underground objects in ground penetrating radar images using machine learning. Edmonton, AB, Canada: Univ. of Alberta Libraries.
Amaral, L. C. M., A. Roshan, and A. Bayat. 2021. “Review of machine learning algorithms for automatic detection of underground objects in GPR images.” J. Pipeline Syst. Eng. Pract. 13 (2): 04021082. https://doi.org/10.1061/(ASCE)PS.1949-1204.0000632.
Arcand, L., and H. Osman. 2006. “Utilization of subsurface utility engineering to improve the effectiveness of utility relocation and coordination efforts on highway projects in Ontario.” In Proc., Annual Conf. of the Transportation Association of Canada, Charlottetown, 17–20. Ottawa: Transportation Association of Canada.
Ardekani, S. 2006. “Automatic and fast detection of buried utilities positions and estimation of soil permittivity using GPR.” In Proc., 11th Int. Conf. on Ground Penetrating Radar. Columbus, OH: Ohio State Univ.
Benedetto, A., F. Tosti, L. B. Ciampoli, and F. D’Amico. 2016. “An overview of ground-penetrating radar signal processing techniques for road inspections.” Signal Process. 132 (Mar): 201–209. https://doi.org/10.1016/j.sigpro.2016.05.016.
Bettin, G., G. Bromhal, M. Brudzinski, A. Cohen, G. Guthrie, P. Johnson, L. Matthews, S. Mishra, and D. Vikara. 2019. “Real-time decision-making for the subsurface report.” National Energy Technology Laboratory. Accessed July 7, 2023. https://www.cmu.edu/energy/education-outreach/policymaker-outreach/documents/real-time-decision-making-for-the-subsurface-report.pdf.
Bianchini Ciampoli, L., F. Tosti, N. Economou, and F. Benedetto. 2019. “Signal processing of GPR data for road surveys.” Geosciences 9 (2): 96. https://doi.org/10.3390/geosciences9020096.
Bochkovskiy, A., C. Y. Wang, and H. Y. M. Liao. 2020. “Yolov4: Optimal speed and accuracy of object detection.” Preprint, submitted April 23, 2020. http://arxiv.org/abs/2004.10934.
Capineri, L., P. Grande, and J. A. G. Temple. 1998. “Advanced image-processing technique for real-time interpretation of ground-penetrating radar images.” Int. J. Imaging Syst. Technol. 9 (1): 51–59. https://doi.org/10.1002/(SICI)1098-1098(1998)9:1%3C51::AID-IMA7%3E3.0.CO;2-Q.
Cazzaniga, N. E., D. Carrion, F. Migliaccio, and R. Barzaghi. 2013. “A shared database of underground utility lines for 3D mapping and GIS applications.” Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. XL-4/W1 (May): 105–108.
Chu, Y., Y. Kaiheng, M. Cheng, H. Qin, Hongliang Yiming, and Y. Lin. 2022. “YOLOv6: A fast and accurate target detection framework is open source.” Accessed July 7, 2023. https://tech.meituan.com/2022/06/23/yolov6-a-fast-and-accurate-target-detection-framework-is-opening-source.html.
COCO (Common Objects in Context). n.d. “Cocodataset.org.” Accessed February 21, 2015. https://cocodataset.org/.
Deng, J., W. Dong, R. Socher, L. Li, K. Li, and L. Fei-fei. 2009. “Imagenet: A large-scale hierarchical image database.” In Proc., 2009 IEEE Conf. on Computer Vision and Pattern Recognition, 248–255. New York: IEEE.
Dérobert, X., and L. Pajewski. 2018. “TU1208 open database of radargrams: The dataset of the IFSTTAR Geophysical Test Site.” Remote Sens. 10 (4): 530. https://doi.org/10.3390/rs10040530.
Dhillon, A., and G. K. Verma. 2020. “Convolutional neural network: A review of models, methodologies and applications to object detection.” Prog. Artif. Intell. 9 (2): 85–112. https://doi.org/10.1007/s13748-019-00203-0.
Du, J. 2018. “Understanding of object detection based on CNN family and YOLO.” J. Phys. Conf. Ser. 1004 (Mar): 012029. https://doi.org/10.1088/1742-6596/1004/1/012029.
Everingham, M., L. Van Gool, C. K. I. Williams, J. Winn, and A. Zisserman. 2010. “The Pascal visual object classes (VOC) challenge.” Int. J. Comput. Vis. 88 (2): 303–338. https://doi.org/10.1007/s11263-009-0275-4.
Girshick, R. 2015. “Fast R-CNN.” In Proc., IEEE Int. Conf. on Computer Vision, 1440–1448. New York: IEEE.
Gong, Z., and H. Zhang. 2020. “Research on GPR image recognition based on deep learning.” In Vol. 309 of Proc., MATEC Web of Conf. Les Ulis, France: EDP Sciences.
GSSI (Geophysical Survey Systems). 2016. Utility locating handbook, 37. Nashua, NH: GSSI.
Huang, Z., F. Li, X. Luan, and Z. Cai. 2020. “A weakly supervised method for mud detection in ores based on deep active learning.” In Mathematical problems in engineering, e3510313. London: Hindawi.
Jocher, G. 2020. “Ultralytics YOLOv5.” Accessed July 7, 2023. https://github.com/ultralytics/yolov5.
Kim, N., K. Kim, Y.-K. An, H.-J. Lee, and J. J. Lee. 2018. “Deep learning-based underground object detection for urban road pavement.” Int. J. Pavement Eng. 21 (13): 1638–1650. https://doi.org/10.1080/10298436.2018.1559317.
Lin, T.-Y., M. Maire, S. Belongie, J. Hays, P. Perona, D. Ramanan, P. Dollár, and C. L. Zitnick. 2014. “Microsoft COCO: Common objects in context.” In Computer vision–ECCV 2014, edited by D. Fleet, T. Pajdla, B. Schiele, and T. Tuytelaars, 740–755. Cham, Switzerland: Springer.
Manacorda, G., et al. 2014. “A bore-head GPR for horizontal directional drilling (HDD) equipment.” In Proc., 15th Int. Conf. on Ground Penetrating Radar, 745–750. New York: IEEE. https://doi.org/10.1109/icgpr.2014.6970526.
Meituan. 2022. “YOLOv6: A single-stage object detection framework dedicated to industrial applications, Git code.” Accessed July 7, 2023. https://github.com/meituan/YOLOv6.
Molyneaux, T. C. K., S. G. Millard, J. H. Bungey, and J. Q. Zhou. 1995. “Radar assessment of structural concrete using neural networks.” NDT & E Int. 28 (5): 281–288. https://doi.org/10.1016/0963-8695(95)00027-U.
Redmon, J., S. Divvala, R. Girshick, and A. Farhadi. 2016. “You only look once: Unified, real-time object detection.” In Proc., IEEE Conf. on Computer Vision and Pattern Recognition, 779–788. New York: IEEE.
Redmon, J., and A. Farhadi. 2017. “YOLO9000: Better, faster, stronger.” In Proc., IEEE Conf. on Computer Vision and Pattern Recognition, 6517–6525. New York: IEEE.
Redmon, J., and A. Farhadi. 2018. “Yolov3: An incremental improvement.” Preprint, submitted May 8, 2018. http://arxiv.org/abs/1804.02767.
Ren, S., K. He, R. Girshick, and J. Sun. 2016. “Faster R-CNN: Towards real-time object detection with region proposal networks.” Preprint, submitted June 4, 2015. http://arxiv.org/abs/1506.01497.
Travassos, L. X., and M. F. Pantoja. 2019. “Ground penetrating radar.” In Handbook of advanced nondestructive evaluation, edited by N. Ida, and N. Meyendorf, 987–1023. Cham, Switzerland: Springer.
Tzutalin. 2015. “LabelImg, Git code.” Accessed July 7, 2023. https://github.com/tzutalin/labelImg.
Vejdannik, M., A. Sadr, V. H. C. de Albuquerque, and J. M. R. Tavares. 2019. “Signal processing for NDE.” In Handbook of advanced non-destructive evaluation, 1525–1543. Berlin: Springer. https://doi.org/10.1007/978-3-319-26553-7_53.
Wang, C. Y., A. Bochkovskiy, and H. Y. M. Liao. 2022. “YOLOv7: Trainable bag-of-freebies sets new state-of-the-art for real-time object detectors.” Preprint, submitted July 11, 2022. http://arxiv.org/abs/2207.02696.
Wang, H., G. Yang, E. Li, Y. Tian, M. Zhao, and Z. Liang. 2019. “High-voltage power transmission tower detection based on faster R-CNN and YOLO-V3.” In Proc., 2019 Chinese Control Conf. (CCC), 8750–8755. New York: IEEE. https://doi.org/10.23919/chicc.2019.8866322.
Zong, Z., C. Chen, X. Mi, W. Sun, Y. Song, J. Li, Z. Dong, R. Huang, and B. Yang. 2019. “A deep learning approach for urban underground objects detection from vehicle-borne ground penetrating radar data in real-time.” Int. Arch. Photogramm. Remote Sens. Spatial Inf. Sci. XLII-2/W16 (Sep): 293–299. https://doi.org/10.5194/isprs-archives-xlii-2-w16-293-2019.

Information & Authors


History

Received: Oct 18, 2022
Accepted: Jul 14, 2023
Published online: Sep 11, 2023
Published in print: Nov 1, 2023
Discussion open until: Feb 11, 2024

Authors

Affiliations

Leila Carolina Martoni Amaral
Dept. of Civil and Environmental Engineering, Univ. of Alberta, 9211 116 St. NW, Edmonton, AB, Canada T6G 1H9. Email: [email protected]
Aditya Roshan, Ph.D.
Dept. of Civil and Environmental Engineering, Univ. of Alberta, 9211 116th St., Edmonton, AB, Canada T6G 1H9. Email: [email protected]
Alireza Bayat, Ph.D., P.Eng., M.ASCE
Professor, Dept. of Civil and Environmental Engineering, Univ. of Alberta, 9211 116th St., Edmonton, AB, Canada T6G 1H9 (corresponding author). Email: [email protected]
