Machine Learning for Edge-Aware Resource Orchestration for IoT Applications


Bibliographic Details
Published in: 2021 IEEE Global Conference on Artificial Intelligence and Internet of Things (GCAIoT), pp. 37-44
Main Authors: Jammal, Manar; AbuSharkh, Mohamed
Format: Conference Proceeding
Language: English
Published: IEEE, 12.12.2021

Summary: The market is experiencing a surge in the number of Internet-of-Things (IoT) devices and in the data traffic they generate, from wearable personal devices to smart enterprise applications. This sensory data plays a crucial role in day-to-day life and in enterprises' products and business decision-making. Although the data provides promising business insights and can enhance application performance, it comes with challenges including connectivity, dynamic resource demands, and privacy. The infrastructure hosting IoT applications must therefore be well orchestrated and designed with intelligence in mind, so that it can scale, self-organize, and handle the huge data volume and transmission load. Such an intelligent platform is expected to explore workloads on its own and autonomously allocate computing resources at runtime, helping the IoT system achieve its best intrinsic value. Hence, this paper introduces a novel platform that combines several machine learning (ML) techniques with an optimization model to forecast IoT applications' behavior and dynamically deploy those applications on the edge, meeting and enhancing end-to-end application performance. A comparative analysis shows that the Random Forest model yields promising results for resource forecasting. The proposed deployment optimization model also demonstrates the importance of trading off computing capacity, delay/transmission rate, and computational-offloading constraints.
DOI: 10.1109/GCAIoT53516.2021.9692940
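
This record does not include the paper's implementation, so the following is only a minimal sketch of how Random-Forest-based resource forecasting of the kind summarized above is commonly set up, assuming scikit-learn; the feature names (CPU utilization, memory utilization, request rate), the lag-window construction, and the synthetic trace are all hypothetical, not the authors' data or code.

```python
# Minimal sketch of Random Forest resource forecasting, assuming scikit-learn.
# Feature names (cpu_util, mem_util, request_rate) and the lag-window setup are
# hypothetical stand-ins; the paper's actual features are not given in this record.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_absolute_error

rng = np.random.default_rng(0)

# Synthetic stand-in for an IoT workload trace: each row is one time step with
# CPU utilization (%), memory utilization (%), and request rate (req/s).
T = 2000
trace = np.column_stack([
    50 + 10 * np.sin(np.arange(T) / 50) + rng.normal(0, 2, T),   # cpu_util
    30 + 5 * np.cos(np.arange(T) / 80) + rng.normal(0, 1, T),    # mem_util
    100 + 20 * np.sin(np.arange(T) / 30) + rng.normal(0, 5, T),  # request_rate
])

# Build lagged features: use the previous `lags` time steps to predict the
# next step's CPU utilization (the resource demand being forecast).
lags = 5
X = np.array([trace[t - lags:t].ravel() for t in range(lags, T)])
y = trace[lags:, 0]

# Keep temporal order when splitting so the test set is a future segment.
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, shuffle=False)

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

pred = model.predict(X_test)
print(f"MAE on held-out trace: {mean_absolute_error(y_test, pred):.2f}")
```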
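
Likewise, the abstract only names the tradeoff in the deployment optimization model (computing capacity vs. delay/transmission rate vs. offloading) without giving the formulation. A toy integer program in that spirit might look like the sketch below, assuming the PuLP library; the applications, edge nodes, capacities, and delays are invented for illustration and do not reproduce the paper's model.

```python
# Toy edge-placement integer program illustrating the described tradeoff,
# assuming PuLP. All demands, capacities, and delays are invented values.
import pulp

apps = ["app1", "app2", "app3"]
nodes = ["edge1", "edge2", "cloud"]

cpu_demand = {"app1": 2, "app2": 4, "app3": 3}         # required vCPUs
cpu_capacity = {"edge1": 4, "edge2": 6, "cloud": 100}  # available vCPUs
delay = {                                              # per-placement latency (ms)
    ("app1", "edge1"): 5, ("app1", "edge2"): 8, ("app1", "cloud"): 40,
    ("app2", "edge1"): 6, ("app2", "edge2"): 4, ("app2", "cloud"): 35,
    ("app3", "edge1"): 7, ("app3", "edge2"): 5, ("app3", "cloud"): 30,
}

prob = pulp.LpProblem("edge_placement", pulp.LpMinimize)

# x[a][n] = 1 if application a is deployed on node n.
x = pulp.LpVariable.dicts("x", (apps, nodes), cat="Binary")

# Objective: minimize total placement delay (the delay/transmission-rate
# side of the tradeoff).
prob += pulp.lpSum(delay[a, n] * x[a][n] for a in apps for n in nodes)

# Each application is placed exactly once, on an edge node or offloaded
# to the cloud.
for a in apps:
    prob += pulp.lpSum(x[a][n] for n in nodes) == 1

# Computing constraint: placements must respect each node's CPU capacity,
# which is what pushes some applications off the scarce edge nodes.
for n in nodes:
    prob += pulp.lpSum(cpu_demand[a] * x[a][n] for a in apps) <= cpu_capacity[n]

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for a in apps:
    for n in nodes:
        if pulp.value(x[a][n]) == 1:
            print(f"{a} -> {n}")
```

With these made-up numbers, the tight edge capacities force at least one application to the higher-delay cloud node, which is the offloading-vs-delay tension the abstract describes.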