
Intention Estimation in Physical Human-Robot Interaction in Construction: Empowering Robots to Gauge Workers’ Posture

Publication: Construction Research Congress 2022

ABSTRACT

The adoption of robots in the construction industry is poised to grow. Human-robot collaboration (HRC) leverages dexterous workers and tireless robots to execute various complicated construction operations. However, the physical aspects of HRC can create timing and collision hazards at construction sites. A key issue is that a construction robot cannot understand a worker's movement intention. In this regard, the worker's posture is an essential cue for avoiding collisions. Therefore, to allow workers and robots to collaborate safely, robots need to be empowered to assess workers' postures and movement intentions. To address this need, this study leverages computer vision techniques to enable collaborative robots to estimate worker positions and poses. The proposed approach employs a multi-stage convolutional neural network to first detect workers' joints; the network then assembles the detected joints into full-body postures using the Part Affinity Fields technique, allowing the robot to interpret worker poses. To examine the feasibility of this approach, an experiment was designed in which four subjects performed bricklaying tasks in collaboration with a masonry robot. The results show that the proposed approach enables robots to estimate subjects' postures with 63.3% precision, measured as the percentage of correct keypoints. The findings pave the way for collaborative robots to understand workers' movement intentions, supporting safe HRC at construction sites.
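The approach summarized above follows the general OpenPose-style recipe: a multi-stage CNN emits a confidence heatmap per joint (plus Part Affinity Fields for associating joints into limbs), 2D keypoints are read off as heatmap peaks, and accuracy is reported as the percentage of correct keypoints (PCK). The sketch below illustrates only two generic, library-free steps of such a pipeline, peak extraction and PCK scoring, and is not the authors' implementation. The joint count (18, COCO-style), the 0.5 distance threshold, and normalization by an assumed torso length are illustrative placeholders, not settings reported in the paper.

# Minimal, self-contained sketch in Python/NumPy (not the authors' code):
# 1) read 2D joint locations off per-joint confidence heatmaps, as produced by a
#    multi-stage pose-estimation CNN, and
# 2) score them with a percentage-of-correct-keypoints (PCK) metric.
import numpy as np

def keypoints_from_heatmaps(heatmaps, input_size):
    """heatmaps: (J, h, w) per-joint confidence maps.
    Returns (J, 2) predicted (x, y) in input-image pixels and per-joint confidences."""
    num_joints, h, w = heatmaps.shape
    flat = heatmaps.reshape(num_joints, -1)
    peak_idx = flat.argmax(axis=1)                     # peak location per joint map
    ys, xs = np.unravel_index(peak_idx, (h, w))
    confidences = flat[np.arange(num_joints), peak_idx]
    # Scale heatmap coordinates back to the network's input resolution.
    scale_x, scale_y = input_size[0] / w, input_size[1] / h
    points = np.stack([xs * scale_x, ys * scale_y], axis=1).astype(float)
    return points, confidences

def pck(pred, gt, norm_len, alpha=0.5, visible=None):
    """Percentage of Correct Keypoints: a joint counts as correct when its
    prediction lies within alpha * norm_len of the ground truth
    (norm_len is assumed here to be the subject's torso length in pixels)."""
    pred, gt = np.asarray(pred, float), np.asarray(gt, float)
    dist = np.linalg.norm(pred - gt, axis=1)
    correct = dist <= alpha * norm_len
    if visible is not None:                            # ignore occluded/unlabeled joints
        correct = correct[np.asarray(visible, bool)]
    return float(correct.mean())

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Fake network output: 18 COCO-style joint heatmaps at 1/8 of a 368x368 input.
    heatmaps = rng.random((18, 46, 46))
    pred, conf = keypoints_from_heatmaps(heatmaps, input_size=(368, 368))
    gt = pred + rng.normal(0.0, 15.0, size=pred.shape)  # synthetic ground truth
    torso_len = 120.0                                    # assumed neck-to-hip distance, pixels
    print(f"PCK@0.5: {pck(pred, gt, torso_len, alpha=0.5):.1%}")

In the full Part Affinity Fields pipeline, an additional association step groups the detected joints into per-person skeletons using the predicted limb fields; that step, and the network itself, are omitted from this sketch.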

Information & Authors

Published In

Construction Research Congress 2022, pp. 621–630

History

Published online: Mar 7, 2022

Authors

Affiliations

1. Ph.D. Student, Dept. of Architectural Engineering, Pennsylvania State Univ., University Park, PA. Email: [email protected]
2. Houtan Jebelli, Assistant Professor, Dept. of Architectural Engineering, Pennsylvania State Univ., University Park, PA. Email: [email protected]
