Moving Deep Learning to the Edge


Bibliographic Details
Published in: Algorithms, Vol. 13, No. 5, p. 125
Main Authors: Véstias, Mário P.; Duarte, Rui Policarpo; de Sousa, José T.; Neto, Horácio C.
Format: Journal Article
Language: English
Published: MDPI AG, 01.05.2020

Summary: Deep learning is now present in a wide range of services and applications, replacing and complementing other machine learning algorithms. Performing training and inference of deep neural networks using the cloud computing model is not viable for applications where low latency is required. Furthermore, the rapid proliferation of the Internet of Things will generate a large volume of data to be processed, which will soon overload the capacity of cloud servers. One solution is to process the data at the edge devices themselves, in order to alleviate cloud server workloads and improve latency. However, edge devices are less powerful than cloud servers, and many are subject to energy constraints. Hence, new resource and energy-oriented deep learning models are required, as well as new computing platforms. This paper reviews the main research directions for edge computing deep learning algorithms.
ISSN: 1999-4893
DOI: 10.3390/a13050125