
Fully Autonomous Fire Safety Equipment Inspection Missions on a Legged Robot

Publication: Computing in Civil Engineering 2023

ABSTRACT

As building automation becomes more prevalent, smart buildings will be integrated into smart cities. Artificial intelligence (AI) is expected to increase efficiency and the level of automation throughout the whole building information modeling (BIM) life cycle. Especially in the management phase of a building, facility managers are recognizing the value of machine learning and robotics for automating maintenance tasks. In the fire safety management field, documentation of fire safety equipment (FSE) is required due to recurring maintenance work, system changes, and relocations. This study concentrates on the automatic detection of inspection tags on FSE using a YOLO network and its deployment on a mobile robot. The goal is a fully autonomous inspection mission for a legged robot that executes predefined routes, identifies FSE, and extracts necessary information, such as the next maintenance date printed on inspection tags.



Published In

Computing in Civil Engineering 2023
Pages: 804–812

History

Published online: Jan 25, 2024

Authors

Angelina Aziz
Computing in Engineering, Dept. of Civil and Environmental Engineering, Ruhr Univ. Bochum. Email: [email protected]
Patrick Herbers
Computing in Engineering, Dept. of Civil and Environmental Engineering, Ruhr Univ. Bochum. Email: [email protected]
Hakan Bayer
Computing in Engineering, Dept. of Civil and Environmental Engineering, Ruhr Univ. Bochum. Email: [email protected]
Markus König
Computing in Engineering, Dept. of Civil and Environmental Engineering, Ruhr Univ. Bochum. Email: [email protected]
