Enhanced Situational Awareness in Worker-Robot Interaction in Construction: Assessing the Role of Visual Cues
Publication: Construction Research Congress 2022
ABSTRACT
Human-robot collaboration (HRC) in construction primarily aims to support human workers in performing demanding tasks. In human-robot teams, maintaining proper situational awareness is key to successful collaboration. In this context, one effective approach is to incorporate visual cues (i.e., explicit visual aids) into HRC through novel visualization technologies. To that end, this study investigates the impact of visual cues on workers’ situational awareness during HRC at construction jobsites. Visual cues, mainly related to the safe working distance from autonomous robots, were embedded into an immersive virtual environment. An experiment was then conducted in which participants performed a bricklaying task in collaboration with an autonomous material-lift-enhancer robot, with and without the provided visual cues. Participants’ situational awareness in each condition was measured using electroencephalogram (EEG) signals recorded during task performance. Beta-band activity in specific brain regions served as an indicator of situational awareness. A Mann-Whitney U test revealed a statistically significant difference in Beta activity between the two experimental conditions, indicating higher situational awareness during HRC with visual cues. These findings can pave the way toward effective use of visual cues in future human-robot collaboration in construction.
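The comparison described above — testing whether Beta-band activity differs between the cue and no-cue conditions — can be sketched with a nonparametric Mann-Whitney U test. This is a minimal illustration with hypothetical per-participant Beta-power values, not the study’s actual data:

```python
from scipy.stats import mannwhitneyu

# Hypothetical Beta-band power values (arbitrary units), one per participant.
# In the study, these would come from spectral analysis of EEG recorded
# during the bricklaying task in each condition.
beta_with_cues = [14.2, 15.1, 13.8, 16.0, 14.9, 15.6, 13.5, 14.7]
beta_without_cues = [11.3, 12.0, 10.8, 12.5, 11.9, 10.5, 12.2, 11.1]

# Two-sided Mann-Whitney U test: does Beta activity differ between conditions?
u_stat, p_value = mannwhitneyu(beta_with_cues, beta_without_cues,
                               alternative="two-sided")
print(f"U = {u_stat}, p = {p_value:.5f}")
```

The Mann-Whitney U test is appropriate here because it makes no normality assumption about the Beta-power distributions and works with small, independent samples.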
Published online: Mar 7, 2022