Identifying Modular Construction Worker Tasks Using Computer Vision
Publication: Computing in Civil Engineering 2021
ABSTRACT
Modular construction is increasingly seen as an attractive method for delivering building projects due to its advantages in safety, quality, and lead time. Despite these benefits, the method still relies heavily on human labor, which introduces variability in factory assembly-line performance that can erode the benefits of modular construction. Continuous improvement methods can alleviate some of these issues, but they require continuous monitoring of human workers’ performance. Because of the limitations of manual time studies and automated sensor-based monitoring, computer vision-based methods have recently gained momentum for identifying the activities of construction workers from videos of onsite construction. This paper therefore explores computer vision-based human activity recognition techniques to identify and classify worker activities in modular construction videos. A computer vision-based tracking method is used to track workers in each frame, and a ResNet-50 network classifies the activity of each tracked worker. In testing, the framework achieved accuracy and recall above 90%.
History
Published online: May 24, 2022
Cited by
- Abdolmajid Erfani, Qingbin Cui, Application of LinkedIn Data and Image Processing to Analyze Construction Career Paths: Does Race Matter?, Computing in Civil Engineering 2023, 10.1061/9780784485248.106, (882-889), (2024).
- Zherui Shao, Yang Miang Goh, Jing Tian, Yu Guang Lim, Vincent Jie Long Gan, Computer Vision-Based Monitoring of Construction Site Housekeeping: An Evaluation of CNN and Transformer-Based Models, Computing in Civil Engineering 2023, 10.1061/9780784485248.061, (508-515), (2024).
- Roshan Panahi, John-Paul Kivlin, Joseph Louis, Request for Information (RFI) Recommender System for Pre-Construction Design Review Application Using Natural Language Processing, Chat-GPT, and Computer Vision, Computing in Civil Engineering 2023, 10.1061/9780784485224.020, (159-166), (2024).