Development of an Intuitive Mixed Reality Human Robot Interaction Interface for Construction Applications


  • Damith Tennakoon Lassonde School of Engineering, York University, CA
  • Mojgan Jadidi Lassonde School of Engineering, York University, CA
  • Seyedreza Razavialavi Northumbria University, Newcastle, UK


Mixed Reality; Human Robot Interaction; Robotics; Stereo Vision; Hand Tracking; Construction Robotics


The main challenges faced in the construction industry are low productivity rates and insufficient priority given to worker safety. Robotics solutions have been proven to improve task efficiency, product quality, and workplace safety in many industries, though integrating them into the dynamic and unstructured construction environment remains challenging. This research aims to provide a means of adopting robotics in the construction industry through an intuitive Human-Robot Interaction (HRI) system that integrates the precision, strength, and deployability of robots while leveraging the decision-making, experience, and creativity of on-site workers. The methodology consists of mounting the ZED Mini stereo vision camera, attached to a 2-axis servo-actuated gimbal, on the end effector of a 6-axis robot arm. The stereo images are streamed to the Meta Quest 2 virtual reality (VR) head-mounted display (HMD), providing a First-Person View (FPV), with human-like vision, of the end effector. Hand-tracking libraries are used to track the operator’s hands and map them into the FPV. Localizing the tracked hand models relative to the end effector enables omnidirectional motion control of the manipulator and, via gesture recognition, control of the gripper. In-situ data is spatially mapped onto the stereoscopic view, constructing a mixed-reality (MR) HRI interface. The results demonstrate higher depth and situational awareness of the robot’s workspace, with increased efficiency, relative to traditional keyboard teleoperation methods. This MR HRI interface facilitates the use of robotics in construction, enabling workers to complete hazardous tasks safely and remotely using HMDs and hand gestures.




Conference Proceedings Volume


Academic Papers