Technical Papers
Oct 9, 2024

LSTM-CNN Architecture for Construction Activity Recognition Using Optimal Positioning of Wearables

Publication: Journal of Construction Engineering and Management
Volume 150, Issue 12

Abstract

Enhancing construction worker performance, safety, and project management through automated activity classification is a promising endeavor. By extracting activity-level information, this technology provides valuable insights for informed decision-making, facilitating project schedule adjustments, efficient resource management, and improved construction site control. Previous studies in this domain focused on basic activities, neglecting both optimal sensor placement and worker comfort. This paper extends beyond existing research, encompassing a broader range of complex construction activities and outperforming current methods. Utilizing unobtrusive wearables such as a smartwatch and a smartphone, the study determines optimal sensor positions (dominant/nondominant wrist, dominant/nondominant leg pocket). Notably, it introduces a novel deep neural network structure, merging long short-term memory (LSTM) and convolutional layers, offering an innovative solution for automated activity classification tasks in the construction industry. This model extracts activity features automatically, reducing the need for manual feature engineering, and performs classification with few model parameters. Its modest computational and memory requirements make it well suited to real-time applications and deployment on resource-constrained devices. By leveraging the strengths of both convolutional layers and LSTM, this approach offers a powerful and efficient solution for activity classification tasks. An experimental study was carried out to recognize four different activities: manual excavation, rebar stirrups, cement plastering, and bar binding. These were performed by four subjects (three males and one female) for 30 s each, with the smartwatch and smartphone in different positions, producing 24,080 data points.
Results indicate that the optimal positioning of the wearables is the smartwatch on the dominant wrist and the smartphone in the opposite leg pocket, as this combination provides balanced and effective coverage of the relevant movements and contextual information, yielding 98.18% accuracy, 98.20% precision, 98.17% recall, and an F1 score of 98.17%.
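A typical preprocessing step for this kind of wearable-sensor pipeline is to segment the continuous accelerometer/gyroscope stream into fixed-length, overlapping windows before feeding them to the LSTM-CNN classifier. The sketch below illustrates that step; the window length (128 samples), 50% overlap, and 50 Hz sampling rate are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

def sliding_windows(stream: np.ndarray, window: int = 128, overlap: float = 0.5) -> np.ndarray:
    """Segment a (samples, channels) stream into (n_windows, window, channels)."""
    step = int(window * (1.0 - overlap))
    starts = range(0, stream.shape[0] - window + 1, step)
    return np.stack([stream[s:s + window] for s in starts])

# Example: 30 s of 3-axis accelerometer data at an assumed 50 Hz -> 1,500 samples.
stream = np.random.randn(1500, 3)
windows = sliding_windows(stream)
print(windows.shape)  # (n_windows, 128, 3)
```

Each resulting window would then be labeled with the activity performed during it and passed to the network, which learns features from the raw windows rather than from hand-engineered statistics.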
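The reported evaluation metrics can be computed from per-window predictions over the four activity classes. The sketch below shows one common convention (macro-averaged precision and recall, with F1 taken as their harmonic mean); the labels and predictions are toy values, not the paper's data.

```python
# Toy illustration of accuracy, macro precision/recall, and F1 for a
# four-class activity recognition task. All data below are invented.

def classification_metrics(y_true, y_pred, classes):
    acc = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    precisions, recalls = [], []
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if p == c and t == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if p == c and t != c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if p != c and t == c)
        precisions.append(tp / (tp + fp) if tp + fp else 0.0)
        recalls.append(tp / (tp + fn) if tp + fn else 0.0)
    precision = sum(precisions) / len(classes)   # macro average
    recall = sum(recalls) / len(classes)         # macro average
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return acc, precision, recall, f1

activities = ["excavation", "rebar_stirrups", "plastering", "bar_binding"]
y_true = ["excavation", "excavation", "plastering", "bar_binding", "rebar_stirrups", "plastering"]
y_pred = ["excavation", "plastering", "plastering", "bar_binding", "rebar_stirrups", "plastering"]
print(classification_metrics(y_true, y_pred, activities))
```

Note that averaging per-class F1 scores directly is another common convention and gives slightly different numbers; the paper does not specify which was used.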

Practical Applications

The integration of the LSTM and convolutional neural network (CNN) architecture with strategic sensor placement on wearables holds promise for revolutionizing construction project management. By optimizing sensor placement, we can improve the accuracy of activity recognition, allowing for better tracking of worker productivity and process optimization, ultimately leading to increased efficiency and productivity on construction sites. Additionally, strategically placing sensors on wearables enables real-time monitoring of worker movements, facilitating the early identification of safety hazards and ergonomic issues, thereby enhancing worker safety and reducing workplace injuries. Furthermore, integrating wearable technology with smart construction technologies enables real-time data exchange and analysis, promoting collaboration among stakeholders and enhancing progress monitoring and control. Leveraging data from strategically positioned wearables allows for comprehensive visualization of construction processes, aiding in the detection of project bottlenecks and streamlining project management strategies. Lastly, analyzing work allocation data from optimally positioned wearables improves project bidding accuracy, minimizing cost estimation errors and increasing profit margins. These findings highlight the transformative potential of the LSTM-CNN architecture and optimal wearable placement in construction project management.

Data Availability Statement

Data generated or analyzed during the study are available from the corresponding author by request.

Information & Authors

Information

Published In

Journal of Construction Engineering and Management
Volume 150, Issue 12, December 2024

History

Received: Oct 19, 2023
Accepted: Jun 6, 2024
Published online: Oct 9, 2024
Published in print: Dec 1, 2024
Discussion open until: Mar 9, 2025

Authors

Affiliations

Ph.D. Scholar, Dept. of Mechanical Engineering, National Institute of Technology Patna, Bihar 800005, India; Assistant Chief Engineer, Projects & Development India Limited, Noida 201301, India (corresponding author). ORCID: https://orcid.org/0000-0002-9291-6635. Email: [email protected]
Assistant Professor, Dept. of Mechanical Engineering, National Institute of Technology Patna, Bihar 800005, India. Email: [email protected]
Vimal K. E. K. [email protected]
Assistant Professor, Dept. of Production Engineering, National Institute of Technology Tiruchirappalli, Tamil Nadu 620015, India. Email: [email protected]
