Task scheduling based on deep reinforcement learning in a cloud manufacturing environment
Published in | Concurrency and Computation, Vol. 32, No. 11 |
Main Authors | , , , |
Format | Journal Article |
Language | English |
Published | Hoboken: Wiley Subscription Services, Inc., 10.06.2020 |
Subjects | |
Summary: | Cloud manufacturing promotes the intelligent transformation of the traditional manufacturing mode, and task scheduling plays an important role in a cloud manufacturing environment. However, as the number of problem instances grows, solution quality and computation time tend to conflict: existing task scheduling algorithms often return locally optimal solutions at a high computational cost, especially for large problem instances. To tackle this problem, a task scheduling algorithm based on a deep reinforcement learning architecture (RLTS) is proposed to dynamically schedule tasks with precedence relationships onto cloud servers so as to minimize the total task execution time. A Deep Q-Network, a deep reinforcement learning algorithm, is employed to cope with the complexity and high dimensionality of the problem. In the simulation, the performance of the proposed algorithm is compared with that of four other heuristic algorithms. The experimental results show that RLTS is effective for task scheduling in a cloud manufacturing environment. |
ISSN: | 1532-0626, 1532-0634 |
DOI: | 10.1002/cpe.5654 |
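The summary above describes RLTS only at a high level: an agent trained with a Deep Q-Network assigns tasks with precedence constraints to cloud servers while trying to minimize total execution time. The sketch below illustrates that general idea; it is not the authors' RLTS implementation. The toy workload, state encoding (task duration, earliest ready time, server loads), reward (negative growth of the makespan), network size, and simplified training loop (one-step Q-targets without a replay buffer or target network) are all illustrative assumptions.

```python
import random
import numpy as np
import torch
import torch.nn as nn

# Toy workload (illustrative assumption, not taken from the paper):
NUM_TASKS, NUM_SERVERS = 6, 3
durations = np.array([4.0, 2.0, 3.0, 5.0, 2.0, 4.0])       # base task times
preds = {0: [], 1: [0], 2: [0], 3: [1], 4: [2], 5: [3, 4]}  # precedence DAG
speeds = np.array([1.0, 1.5, 2.0])                          # server speed-ups

STATE_DIM = 2 + NUM_SERVERS   # [task duration, earliest ready time, server loads]
GAMMA = 0.99

qnet = nn.Sequential(nn.Linear(STATE_DIM, 64), nn.ReLU(),
                     nn.Linear(64, NUM_SERVERS))
opt = torch.optim.Adam(qnet.parameters(), lr=1e-3)

def run_episode(eps):
    """Assign tasks (in topological order) to servers; return the trajectory."""
    server_free = np.zeros(NUM_SERVERS)   # time at which each server becomes free
    finish = np.zeros(NUM_TASKS)          # finish time of each task
    states, actions, rewards = [], [], []
    for t in range(NUM_TASKS):
        ready = max((finish[p] for p in preds[t]), default=0.0)
        state = np.concatenate(([durations[t], ready], server_free)).astype(np.float32)
        if random.random() < eps:                            # epsilon-greedy choice
            a = random.randrange(NUM_SERVERS)
        else:
            with torch.no_grad():
                a = int(qnet(torch.from_numpy(state)).argmax())
        start = max(ready, server_free[a])
        old_makespan = server_free.max()
        finish[t] = start + durations[t] / speeds[a]
        server_free[a] = finish[t]
        rewards.append(-(server_free.max() - old_makespan))  # penalise makespan growth
        states.append(state)
        actions.append(a)
    return states, actions, rewards, server_free.max()

for ep in range(300):
    eps = max(0.05, 1.0 - ep / 200)                          # decaying exploration
    states, actions, rewards, makespan = run_episode(eps)
    s = torch.tensor(np.stack(states))
    a = torch.tensor(actions)
    r = torch.tensor(rewards, dtype=torch.float32)
    q = qnet(s).gather(1, a.unsqueeze(1)).squeeze(1)         # Q(s, a) of taken actions
    with torch.no_grad():
        target = r.clone()
        target[:-1] += GAMMA * qnet(s[1:]).max(1).values     # last task is terminal
    loss = nn.functional.mse_loss(q, target)
    opt.zero_grad()
    loss.backward()
    opt.step()
    if ep % 50 == 0:
        print(f"episode {ep:3d}  makespan {makespan:5.2f}  loss {loss.item():.4f}")
```

The paper itself specifies the actual state and reward design, network architecture, and training details; this sketch only fixes the overall control flow of assigning precedence-constrained tasks to servers one at a time and learning from makespan feedback.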