Enhancing Built Environment Management: A Vision-Based Approach Unified with Fiducial Markers and Omnidirectional Camera Pose Estimation

Authors

  • Gelare Taherian Toronto Metropolitan University
  • Ehsan Rezazadeh Azar Toronto Metropolitan University

Abstract

Construction jobsites are characterized by dynamically changing environments, driving the need for improved managerial methods such as automated progress tracking and quality inspection. These applications rely on detecting physical changes, which necessitates accurate data-capturing technologies and analysis of the as-is against the as-planned status. Vision-based methods have emerged as promising tools for localizing the real-time query, thanks to the availability of low-cost systems for capturing, transferring, storing, and processing data. However, they must overcome certain challenges to reliably detect and locate construction resources and building elements in frames, especially in complex indoor environments. Despite significant advancements, recent approaches still require considerable effort to perform reliably, such as creating comprehensive 3D point clouds through frequent 3D reconstruction from overlapping images. This paper proposes a new approach, inspired by recent innovations in computer science, that can reduce the data-capturing effort required by other approaches. The method uses omnidirectional images and detects fiducial markers in the generated visualizations to capture the as-is status, then retrieves the relative pose in the as-planned status through Building Information Modeling (BIM). This approach facilitates comparative analysis through state-of-the-art computer vision-based object detection and classification methods for change detection between the as-is query and the as-planned status across a wide field of view. Additionally, it moderates camera pose estimation efforts, enhancing efficiency for various built environment management applications, including construction progress tracking.
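A geometric step that pipelines of this kind typically need, before marker-based pose estimation can proceed, is mapping a pixel in an omnidirectional (equirectangular) image to a viewing ray in the camera frame. The sketch below is illustrative only and is not taken from the paper; the function name, the axis convention (+z forward, +x right, +y up), and the equirectangular projection model are all assumptions.

```python
import math

def equirect_pixel_to_ray(u, v, width, height):
    """Map a pixel (u, v) in an equirectangular image of size
    width x height to a unit viewing ray in the camera frame.

    Assumed convention: longitude spans [-pi, pi) left to right,
    latitude spans [pi/2, -pi/2] top to bottom.
    """
    lon = (u / width - 0.5) * 2.0 * math.pi   # horizontal angle
    lat = (0.5 - v / height) * math.pi        # vertical angle
    # Spherical-to-Cartesian conversion; the result has unit length,
    # so no normalization is needed.
    x = math.cos(lat) * math.sin(lon)
    y = math.sin(lat)
    z = math.cos(lat) * math.cos(lon)
    return (x, y, z)

# The image center maps to the forward optical axis (0, 0, 1):
ray = equirect_pixel_to_ray(2048, 1024, 4096, 2048)
```

Rays obtained this way for detected marker corners can then be intersected with the corresponding BIM geometry to recover the camera pose relative to the as-planned model.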

Published

2024-06-26

Conference Proceedings Volume

Section

Academic Papers