Technical Papers
Sep 15, 2021

Fast Blur Detection Algorithm for UAV Crack Image Sets

Publication: Journal of Computing in Civil Engineering
Volume 35, Issue 6

Abstract

Unmanned aerial vehicles (UAVs) have been widely used in the visual inspection of structural cracks. However, blurred images are inevitably produced during UAV image collection, caused by UAV motion and other factors. This blur affects the retrieval of crack properties from images and degrades the accuracy and reliability of crack damage assessment. At present, blur detection and blurred-image removal are mainly performed manually, which is inefficient and error-prone, especially for large image sets. To address this problem, a novel automatic blur detection method for UAV crack image data sets is proposed. The algorithm defines a blur detection metric named the edge average width difference (EAWD), which builds on the principle that a more blurred image exhibits smaller differences between neighboring pixels and also incorporates the characteristics of crack images themselves. Each crack image is judged to be blurred or sharp by calculating this metric and comparing it with the EAWD values of other images from the same data set. Furthermore, a support vector machine classifier is applied to these metrics, serving as the image blur quality evaluator. For training and assessment of the proposed approach, a data set of 1,200 crack images, including some thin-crack images, is created. Experimental results demonstrate that the proposed method is fast, accurate, and reliable.
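The abstract does not reproduce the exact EAWD formula, so the sketch below is only an illustration of the described pipeline under stated assumptions: a Marziliano-style horizontal edge-width estimate stands in for the per-image edge average width, the comparison against the rest of the data set is approximated by subtracting the set median, and a scikit-learn support vector machine serves as the blur classifier. The names image_paths and blur_labels are hypothetical placeholders, not artifacts of the paper.

```python
# Hedged sketch of the pipeline described in the abstract: a per-image
# edge-average-width feature, a set-level comparison, and an SVM classifier.
# The EAWD metric itself is defined in the full paper; this is not it.
import numpy as np
import cv2                    # pip install opencv-python
from sklearn.svm import SVC   # pip install scikit-learn


def _ramp_end(row, x, direction, step):
    """Walk from edge column x in the given step direction while the intensity
    profile keeps changing in the sense of the edge ramp; return the stop column."""
    pos = x
    while 0 <= pos + step < len(row):
        if np.sign(row[pos + step] - row[pos]) != step * direction:
            break
        pos += step
    return pos


def average_edge_width(gray, max_edges=2000):
    """Mean horizontal width (in pixels) of intensity ramps at Canny edge pixels.
    Expects an 8-bit grayscale image."""
    edges = cv2.Canny(gray, 50, 150)
    ys, xs = np.nonzero(edges)
    if len(xs) == 0:
        return 0.0
    # Subsample edge pixels so the sketch stays fast on large UAV frames.
    keep = np.random.default_rng(0).choice(len(xs), min(max_edges, len(xs)), replace=False)
    widths = []
    for y, x in zip(ys[keep], xs[keep]):
        row = gray[y].astype(int)
        if x == 0 or x == len(row) - 1:
            continue
        direction = np.sign(row[x + 1] - row[x - 1])  # rising or falling edge
        if direction == 0:
            continue
        widths.append(_ramp_end(row, x, direction, +1) - _ramp_end(row, x, direction, -1))
    return float(np.mean(widths)) if widths else 0.0


# Hypothetical usage on a crack image set (image_paths and blur_labels are placeholders):
# widths = [average_edge_width(cv2.imread(p, cv2.IMREAD_GRAYSCALE)) for p in image_paths]
# feature = np.asarray(widths).reshape(-1, 1) - np.median(widths)  # assumed set-level comparison
# clf = SVC(kernel="rbf").fit(feature, blur_labels)                # 0 = sharp, 1 = blurred
# is_blurred = clf.predict(feature)
```

Blurred edges spread over more pixels, so larger average edge widths (relative to the rest of the set) indicate blur; the SVM then turns that relative feature into a blurred/sharp decision.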

Data Availability Statement

Some or all data, models, or code that support the findings of this study are available from the corresponding author upon reasonable request, including the crack image data set and the code for the implemented models.

Acknowledgments

This research was supported by the National Key R&D Program of China (2016YFC0401600 and 2017YFC0404900) and the National Natural Science Foundation of China (51979027, 51769033, and 51779035).

Information & Authors

Information

Published In

Journal of Computing in Civil Engineering
Volume 35, Issue 6, November 2021

History

Received: Apr 5, 2021
Accepted: Jul 28, 2021
Published online: Sep 15, 2021
Published in print: Nov 1, 2021
Discussion open until: Feb 15, 2022

Authors

Affiliations

Ph.D. Candidate, Faculty of Infrastructure Engineering, Dalian Univ. of Technology, Dalian 116024, China (corresponding author). ORCID: https://orcid.org/0000-0001-6328-7529. Email: [email protected]
Professor, Faculty of Infrastructure Engineering, Dalian Univ. of Technology, Dalian 116024, China; Professor, College of Water Conservancy and Hydropower Engineering, Hohai Univ., Nanjing 210098, China. Email: [email protected]

Cited by

  • Embedded PZT aggregates for monitoring crack growth and predicting surface crack in reinforced concrete beam, Construction and Building Materials, 10.1016/j.conbuildmat.2022.129979, 364, (129979), (2023).
