Smart seru production system for Industry 4.0: a conceptual model based on deep learning for real-time monitoring and controlling
Published in: International Journal of Computer Integrated Manufacturing, Vol. 37, No. 4, pp. 385-407
Main Authors: , ,
Format: Journal Article
Language: English
Published: Taylor & Francis, 02.04.2024
Summary: The seru production system is an innovative assembly system that combines the flexibility of job-shop production with the high efficiency of assembly lines. It adapts easily to product changeovers, reduces product waste by eliminating work-in-process inventories and delay times, and provides enterprises with a competitive advantage by lowering operating costs, workforce requirements, and floor space. Against these advantages, the disadvantage of seru-type production is that, whereas in assembly lines specific tasks are completed at dedicated stations, in seru production all tasks required to assemble a product are completed in a yatai by a single cross-trained worker, which results in a higher risk of production errors. Accordingly, this research proposes a conceptual model that monitors and controls multiple factors of the production process, such as the worker, environment, assembly tools, ergonomics, storage, and inventory, and issues warnings to prevent process and quality errors in seru production through advanced analytics based on deep learning. In addition to supporting the worker, the proposed Smart Seru Production System Model will assist system participants in obtaining and understanding data about the production processes and reacting quickly to this information.
ISSN: 0951-192X, 1362-3052
DOI: 10.1080/0951192X.2022.2078514
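
The abstract describes a deep-learning layer that watches signals about the worker, environment, tools, ergonomics, and inventory and issues warnings when an error is likely. As an illustration only, and not the authors' implementation, a minimal sketch of such a warning scorer could look like the following Python snippet; the feature names, thresholds, and network shape are all assumptions made for this example.

```python
# Illustrative sketch only: a tiny deep-learning "warning" scorer for seru
# process signals. Feature names, thresholds, and the network architecture
# are assumptions for this example, not the model proposed in the article.
import torch
import torch.nn as nn

# Hypothetical per-cycle features gathered in a yatai: worker fatigue index,
# ambient temperature, tool torque deviation, posture score, WIP level.
FEATURES = ["fatigue", "temperature", "torque_dev", "posture", "wip_level"]

class ErrorRiskScorer(nn.Module):
    """Small MLP mapping a feature vector to a process-error risk in [0, 1]."""
    def __init__(self, n_features: int = len(FEATURES)):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_features, 16),
            nn.ReLU(),
            nn.Linear(16, 1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)

def monitor_cycle(model: nn.Module, reading: dict, threshold: float = 0.7) -> None:
    """Score one assembly cycle and issue a warning if the predicted risk is high."""
    x = torch.tensor([[reading[name] for name in FEATURES]], dtype=torch.float32)
    with torch.no_grad():
        risk = model(x).item()
    if risk >= threshold:
        print(f"WARNING: predicted error risk {risk:.2f} - check worker/tools/ergonomics")
    else:
        print(f"OK: predicted error risk {risk:.2f}")

if __name__ == "__main__":
    scorer = ErrorRiskScorer()  # untrained here; in practice trained on labelled cycles
    monitor_cycle(scorer, {"fatigue": 0.4, "temperature": 22.0,
                           "torque_dev": 0.1, "posture": 0.8, "wip_level": 3.0})
```

In a real deployment the scorer would be trained on labelled assembly-cycle data and fed by the monitoring infrastructure the article envisions; the snippet only shows the shape of the inference-and-warn loop.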