Effects of spatial augmented reality assistance on the efficiency of prefabricating timber frame walls

Bibliographic Details
Published in: Wood Material Science and Engineering, Vol. 18, No. 3, pp. 860-869
Main Authors: Bartuska, Birger; Teischinger, Alfred; Riegler, Martin
Format: Journal Article
Language: English
Published: Abingdon: Taylor & Francis, 04.05.2023

Summary: Assistance systems have the potential to support workers in various assembly tasks. Especially in mass-customized prefabrication, the correct objects need to be selected and positioned accurately. With conventional paper plans as assembly guidance, skilled workers must spend precious time carefully reading the plans for each individually designed element. Augmented reality (AR) systems can integrate assembly information directly into the work environment, which should improve completion times and reduce error rates. However, AR systems are often not ergonomic or accurate enough (e.g. smartglasses) or too expensive (e.g. laser projectors). In the present study, the effect of a simple projection-based spatial augmented reality (SAR) system on building prefabricated wall elements was examined under laboratory conditions, with printed CAD drawings serving as the reference condition. Each subject built a timber frame wall element with near-realistic dimensions. With the SAR system, completion times were reduced by nearly 50%, the error rate of subjects with little experience decreased, and the task load reported on the NASA raw task load index (raw TLX) was significantly lower.
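
The raw task load index mentioned in the summary (raw TLX) is the unweighted variant of the NASA-TLX questionnaire: the score is simply the mean of the six subscale ratings, without the pairwise weighting step of the full TLX. A minimal sketch in Python, assuming ratings on the usual 0-100 scale; the function name and the example values are illustrative only, not data from the study:

    def raw_tlx(mental, physical, temporal, performance, effort, frustration):
        """Raw NASA-TLX: unweighted mean of the six subscale ratings (0-100)."""
        ratings = [mental, physical, temporal, performance, effort, frustration]
        return sum(ratings) / len(ratings)

    # Hypothetical ratings for one subject after the assembly task
    print(raw_tlx(40, 55, 35, 30, 50, 25))  # -> 39.166..., the mean of the six ratings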
ISSN: 1748-0272, 1748-0280
DOI: 10.1080/17480272.2022.2085528