Technical Papers
Apr 28, 2023

Effects of Spatial Characteristics on the Human–Robot Communication Using Deictic Gesture in Construction

Publication: Journal of Construction Engineering and Management
Volume 149, Issue 7

Abstract

Construction robots are expected to communicate frequently with human workers about in situ improvisations so they can adapt their workflows and methods. One way to achieve this is through deictic (pointing) gestures, which are among the most effective forms of human–robot interaction (HRI) for conveying spatial information. Nevertheless, the limited coverage of deictic gestures in large-scale environments poses challenges for both humans and robots in leveraging such techniques for HRI in construction. To assess the feasibility of deictic gestures in the construction domain and identify applicable ways to improve performance, this study extends current knowledge of communicating positional information through deictic gestures by investigating the effects of spatial characteristics on spatial referencing, focusing on target configuration, target distance, and the relative positions of the human and the robot. We observed that the recognition and estimation of deictic gestures were affected by the target plane, target position, and target layout, and that robot performance was significantly reduced as the distance between the human and the robot increased. The findings demonstrate the challenges of spatial referencing within a large-scale environment and highlight the need for bidirectional communication in HRI.
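The spatial-referencing problem the abstract describes is often modeled in the pointing-gesture literature as a ray-casting task: a robot estimates a line from two body keypoints (e.g., shoulder and wrist) and intersects it with a target plane such as the floor. The sketch below is purely illustrative of that general technique, not the authors' implementation; the function name, keypoint choice, and coordinates are hypothetical. It also shows why distance degrades accuracy: the intersection point moves multiplicatively with the ray parameter, so small angular errors grow with range.

```python
import numpy as np

def pointing_target(shoulder, wrist, plane_point, plane_normal):
    """Estimate a pointed-at location as the intersection of the
    shoulder->wrist ray with a target plane (e.g., the floor).
    Returns None when the ray is parallel to the plane or the
    intersection lies behind the pointing arm."""
    direction = wrist - shoulder
    denom = np.dot(plane_normal, direction)
    if abs(denom) < 1e-9:
        return None  # ray parallel to the plane
    t = np.dot(plane_normal, plane_point - shoulder) / denom
    if t < 0:
        return None  # plane is behind the pointer
    return shoulder + t * direction

# Hypothetical example: shoulder at 1.4 m height, wrist slightly
# lowered and forward, pointing at a spot on the floor (z = 0 plane).
shoulder = np.array([0.0, 0.0, 1.4])
wrist = np.array([0.4, 0.0, 1.2])
target = pointing_target(
    shoulder, wrist,
    plane_point=np.array([0.0, 0.0, 0.0]),
    plane_normal=np.array([0.0, 0.0, 1.0]),
)
# The intersection lands on the floor a few meters ahead; note that
# the arm's 0.4 m horizontal offset is amplified by the ray parameter,
# which is how angular noise turns into large positional error at range.
```

The amplification factor (the ray parameter `t`) grows with target distance, which is consistent with the abstract's observation that performance drops as human–robot separation increases.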


Data Availability Statement

All data, models, or code that support the findings of this study are available from the corresponding author upon reasonable request.

Acknowledgments

This study was supported by the New Faculty Startup Fund and the Institute of Construction and Environmental Engineering (ICEE) at Seoul National University (SNU). Any opinions, findings, conclusions, or recommendations expressed in this paper are those of the authors and do not necessarily reflect the views of ICEE or SNU.



History

Received: Aug 5, 2022
Accepted: Mar 2, 2023
Published online: Apr 28, 2023
Published in print: Jul 1, 2023
Discussion open until: Sep 28, 2023

Authors

Affiliations

Ph.D. Student, Dept. of Architecture and Architectural Engineering, Seoul National Univ., Seoul 08826, Republic of Korea. ORCID: https://orcid.org/0000-0003-4997-5792. Email: [email protected]
Ph.D. Student, School of Civil and Environmental Engineering, Georgia Institute of Technology, Atlanta, GA 30332. ORCID: https://orcid.org/0000-0001-7953-4176. Email: [email protected]
Moonseo Park, M.ASCE [email protected]
Professor, Dept. of Architecture and Architectural Engineering, Seoul National Univ., Seoul 08826, Republic of Korea. Email: [email protected]
Associate Professor, Dept. of Architecture and Architectural Engineering, Institute of Construction and Environmental Engineering, Seoul National Univ., Seoul 08826, Republic of Korea (corresponding author). ORCID: https://orcid.org/0000-0002-6733-2216. Email: [email protected]

Cited by

  • LaserDex: Improvising Spatial Tasks Using Deictic Gestures and Laser Pointing for Human–Robot Collaboration in Construction, Journal of Computing in Civil Engineering, 10.1061/JCCEE5.CPENG-5715, 38, 3, (2024).
