Chapter
Mar 18, 2024

Analyzing Human Visual Attention in Human-Robot Collaborative Construction Tasks

Publication: Construction Research Congress 2024

ABSTRACT

Human-Robot Collaboration (HRC) is a promising approach to relieving workers of repetitive and physically demanding tasks and improving safety and productivity in construction. For smooth HRC, it is critical that robots understand worker intention and adapt their motion accordingly. Evidence has shown that visual attention reveals human intention. However, it remains unclear how visual attention is distributed in human-robot collaborative construction tasks. In this study, a pilot experiment was conducted to examine human visual attention in a wood assembly task performed with the assistance of a collaborative robot. A mobile eye tracker was used to collect participants’ gaze movements, and the data were validated and processed into various metrics to analyze visual attention patterns. We found that construction workers’ visual attention tracks the detailed process of the task: around 30% of gaze points fall within the connector areas and the design drawing area, the regions most relevant to the task. Furthermore, workers’ attention can be affected by the movement of the robot, with their gaze following the path of the robot arm and gripper during the collaboration. These findings can stimulate further research into attention-aware HRC for intelligent construction.
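The abstract describes distributing gaze samples over areas of interest (AOIs) such as the connector areas and the design drawing area. As a minimal sketch of how such a metric might be computed, the snippet below counts the fraction of gaze points falling inside rectangular AOIs; the AOI names, coordinates, and normalized coordinate convention are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch: share of gaze samples falling inside predefined AOIs,
# one simple way to quantify how visual attention is distributed.
# AOI names and bounds are illustrative, not from the study itself.
from dataclasses import dataclass


@dataclass(frozen=True)
class AOI:
    """Axis-aligned rectangular area of interest in normalized scene coordinates."""
    name: str
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


def gaze_proportions(samples, aois):
    """Return the fraction of (x, y) gaze samples landing in each AOI.

    Samples outside every AOI are pooled under "other". Overlapping AOIs
    are resolved by list order (first match wins).
    """
    counts = {aoi.name: 0 for aoi in aois}
    counts["other"] = 0
    for x, y in samples:
        hit = next((a.name for a in aois if a.contains(x, y)), "other")
        counts[hit] += 1
    total = len(samples)
    return {name: n / total for name, n in counts.items()}


if __name__ == "__main__":
    aois = [
        AOI("connector", 0.0, 0.0, 0.3, 0.3),   # hypothetical connector region
        AOI("drawing", 0.7, 0.7, 1.0, 1.0),     # hypothetical drawing region
    ]
    samples = [(0.1, 0.1), (0.2, 0.25), (0.8, 0.9), (0.5, 0.5)]
    print(gaze_proportions(samples, aois))
    # → {'connector': 0.5, 'drawing': 0.25, 'other': 0.25}
```

In practice a mobile eye tracker reports gaze in scene-camera coordinates, so gaze points would first need to be mapped onto the workspace (e.g., via fiducial markers) before an AOI test like this applies.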



Published In

Construction Research Congress 2024, pp. 856–865.


Authors

Xiaoyun Liang, S.M.ASCE, Ph.D. Student, School of Civil & Environmental Engineering, and Construction Management, Univ. of Texas at San Antonio, San Antonio, TX. Email: [email protected]
Jiannan Cai, Ph.D., A.M.ASCE, Assistant Professor, School of Civil & Environmental Engineering, and Construction Management, Univ. of Texas at San Antonio, San Antonio, TX. Email: [email protected]
Yuqing Hu, Ph.D., A.M.ASCE, Assistant Professor, Dept. of Architectural Engineering, Pennsylvania State Univ., University Park, PA. Email: [email protected]
