Performance Assessment of Deep Neural Networks for Classification of IFC Objects From Point Cloud With Limited Labeled Data
DOI:
https://doi.org/10.57922/tcrc.619

Keywords:
Deep Neural Networks (DNN), Industry Foundation Classes (IFC), Building Information Modeling (BIM), Object classification, Scan-to-BIM

Abstract
Point cloud (PC) processing using deep neural networks (DNNs) has attracted increasing attention due to its high performance in tasks such as Industry Foundation Classes (IFC)-based object classification. DNNs typically rely on large numbers of labeled samples for training; however, annotating 3D IFC objects is time-consuming, and existing datasets contain only small labeled sets. In this study, we conduct a set of experiments to assess the training progress and generalization capacity of two state-of-the-art DNNs for IFC object classification. The results show that limited training samples can lead to inefficient learning by DNNs, even when a large portion of the annotated samples is allocated to training. The experiments indicate that overfitting on the training set and slow convergence during training are among the main issues in classifying 3D IFC objects with DNNs. To address these issues, future studies could take advantage of unsupervised and semi-supervised learning methods to exploit a greater volume of unlabeled samples for training and reduce the reliance of DNNs on annotated data.
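The overfitting pattern described above, where training loss keeps falling while validation loss stalls or rises on a small labeled set, can be detected directly from the loss curves. The following is a minimal illustrative sketch, not code from the study; the function names, the `patience` threshold, and the example loss values are all assumptions chosen for illustration.

```python
# Illustrative sketch (not from the paper): flag the epoch at which a model
# trained on few labeled samples starts to overfit, based on its loss curves.

def generalization_gap(train_losses, val_losses):
    """Per-epoch gap between validation and training loss; a widening gap
    is a common symptom of overfitting."""
    return [v - t for t, v in zip(train_losses, val_losses)]

def overfitting_epoch(train_losses, val_losses, patience=3):
    """Return the epoch after which validation loss failed to improve for
    `patience` consecutive epochs while training loss kept decreasing,
    or None if no such point exists. `patience` is an assumed threshold."""
    best_val = float("inf")
    stale = 0
    for epoch, v in enumerate(val_losses):
        if v < best_val:
            best_val = v
            stale = 0
        else:
            stale += 1
        if (stale >= patience and epoch >= patience
                and train_losses[epoch] < train_losses[epoch - patience]):
            return epoch - patience
    return None

# Hypothetical loss curves typical of training on a small labeled set:
# training loss decreases steadily while validation loss turns upward.
train = [2.0, 1.5, 1.1, 0.8, 0.6, 0.45, 0.33, 0.25]
val   = [2.1, 1.7, 1.4, 1.35, 1.4, 1.5, 1.6, 1.7]
```

With these example curves, the helper identifies the epoch where validation loss last improved, which is the natural early-stopping point when labeled data are too scarce for further supervised gains.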
License
Copyright (c) 2022 University of New Brunswick
This work is licensed under a Creative Commons Attribution 4.0 International License.
Authors/employers retain all proprietary rights in any process, procedure or article of manufacture described in the Work. Authors/employers may reproduce or authorize others to reproduce the Work, material extracted verbatim from the Work, or derivative works for the author’s personal use or for company use, provided that the source is indicated.