Technical Papers
Apr 16, 2024

Robust Alignment of UGV Perspectives with BIM for Inspection in Indoor Environments

Publication: Journal of Computing in Civil Engineering
Volume 38, Issue 4

Abstract

Ensuring the alignment of perspectives between unmanned ground vehicles (UGVs) and Building Information Modeling (BIM) is crucial for the precise retrieval and analysis of BIM-stored information during inspection tasks. However, accumulated localization errors often cause deviations between the viewpoints of UGV cameras and their corresponding representations in BIM at specific waypoints. This study therefore introduces a sequential rectification method that corrects the UGV's pose within the BIM environment to ensure seamless alignment of perspectives. By applying visual features and geometric strategies in sequence, the method correlates the UGV-captured point cloud data with the BIM, thereby deriving an accurate and robust pose rectification. Experimental validation in a featureless indoor environment demonstrated that this method reduced the angle and distance errors of reference lines in two-dimensional (2D) views to approximately 2° and 7 pixels, respectively, and the root mean square error (RMSE) of three-dimensional (3D) lines to approximately 17 cm. The validation also demonstrated that the proposed method was particularly robust in correcting the pose and improving the alignment of perspectives between BIM and the UGV, even in cases of significant misalignment. Hence, this study improves the reliability of decisions made by UGVs for indoor inspection when cross-referencing BIM data.
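The paper's sequential rectification pipeline is not detailed on this page, but its core geometric step, rigidly aligning UGV-captured points with their BIM counterparts to recover a pose correction, can be sketched with the standard SVD-based least-squares fit of two 3D point sets. This is a minimal illustration under the assumption of known point correspondences, not the authors' implementation; all function and variable names are hypothetical.

```python
import numpy as np

def rigid_align(source, target):
    """Estimate rotation R and translation t minimizing ||R @ p_i + t - q_i||^2
    over corresponding point pairs (p_i, q_i), via the SVD method.
    source, target: (N, 3) arrays of corresponding 3D points."""
    src_c = source.mean(axis=0)                 # centroids
    tgt_c = target.mean(axis=0)
    H = (source - src_c).T @ (target - tgt_c)   # 3x3 cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = tgt_c - R @ src_c
    return R, t

def alignment_rmse(source, target, R, t):
    """RMSE of residuals after applying the estimated pose correction."""
    residuals = (source @ R.T + t) - target
    return float(np.sqrt((residuals ** 2).sum(axis=1).mean()))
```

In practice the correspondences themselves must first be established (e.g., via visual features), which is where methods differ; the closed-form fit above is only the final pose-recovery step.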

Practical Applications

UGVs have been leveraged to automate inspection tasks, thereby reducing human participation. Generally, these UGVs are programmed to perform analyses by comparing the as-built condition with the as-designed BIM. For these tasks to accurately reference the data stored in BIM, however, a seamless alignment between the UGV's perspectives and the as-designed BIM is crucial. Unfortunately, due to the accumulation of localization errors, discrepancies often arise at waypoints between the perspectives in BIM and those of the UGV cameras. Such misalignments can compromise the reliability of decisions made for tasks that cross-reference BIM, such as verifying measurements of elements or confirming equipment installation during the project handover phase. The method proposed in this study, which demonstrated superior performance and robustness, offers an effective solution to this alignment challenge. By ensuring seamlessly aligned perspectives, it improves the reliability of UGV inspection assessments when accessing the corresponding data stored in BIM.
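Once perspectives are aligned, an as-built vs. as-designed comparison of the kind described above can be as simple as flagging captured points that lie too far from the design model. The sketch below is a hedged illustration of that idea using a brute-force nearest-neighbor check against a sampled as-designed point set; the function names and the 5 cm tolerance are hypothetical, and a real system would use an accelerated spatial index and the BIM geometry directly.

```python
import numpy as np

def deviation_flags(as_built, as_designed, tol=0.05):
    """Flag as-built points (meters) whose nearest as-designed point is
    farther than `tol`. Brute-force O(N*M) nearest-neighbor distances;
    as_built: (N, 3), as_designed: (M, 3). Returns a boolean (N,) array."""
    diffs = as_built[:, None, :] - as_designed[None, :, :]   # (N, M, 3)
    dists = np.linalg.norm(diffs, axis=2)                    # (N, M)
    return dists.min(axis=1) > tol
```

Flagged points would then be candidates for follow-up, such as reporting a deviating element or re-checking an installation against its BIM record.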

Data Availability Statement

Some or all data, models, or code that support the findings of this study, including images, BIM, point cloud models, and the code used in the validation study, are available from the corresponding author upon reasonable request.

Acknowledgments

This research was supported by the BCA and the National Robotics Programme under its Built Environment Robotics R&D Programme (Grant Award Reference No. W2122d0154). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not reflect the views of the BCA and the National Robotics Programme.



History

Received: Oct 3, 2023
Accepted: Jan 8, 2024
Published online: Apr 16, 2024
Published in print: Jul 1, 2024
Discussion open until: Sep 16, 2024

Authors

Affiliations

Ph.D. Candidate, Dept. of Civil and Environmental Engineering, College of Design and Engineering, National Univ. of Singapore, Singapore 117576 (corresponding author). ORCID: https://orcid.org/0000-0003-3491-3281. Email: [email protected]
Senior Lecturer, Dept. of Civil and Environmental Engineering, College of Design and Engineering, National Univ. of Singapore, Singapore 117576. ORCID: https://orcid.org/0000-0003-2783-303X. Email: [email protected]
David K. H. Chua, M.ASCE
Professor, Dept. of Civil and Environmental Engineering, College of Design and Engineering, National Univ. of Singapore, Singapore 117576. Email: [email protected]
