A NARX Neural Network Algorithm for Video Traffic Prediction

Bibliographic Details
Published in: Journal of Electrical and Electronics Engineering, Vol. 4, No. 1, p. 179
Main Authors: Pilka, Filip; Oravec, Milos
Format: Journal Article
Language: English
Published: Oradea: University of Oradea, 01.01.2011

Summary: Multimedia services such as video on demand, video broadcasting, and videoconferencing have become a major part of Internet network traffic. The bursty nature of video traffic makes it difficult to meet the Quality of Service (QoS) requirements of specific multimedia applications, so congestion control procedures are important; one such procedure is traffic prediction combined with dynamic bandwidth allocation. Neural networks are among the most widely used methods for traffic prediction. In this paper, we present results of a Nonlinear AutoRegressive model with eXogenous inputs (NARX) neural network for video traffic prediction and propose a new neural-network-based prediction algorithm built on the separation of the different frame types. We first briefly describe the characteristics of video traffic and then introduce the theoretical fundamentals of the NARX neural network. In the last section we present the results of video traffic prediction using the NARX network and the new algorithm; for comparison, predictions obtained with a multilayer perceptron and an adaptive autoregressive integrated moving average (ARIMA) model are included.
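
In a NARX model the next value of the target series is computed from delayed values of the series itself and of an exogenous input, y(t) = f(y(t-1), ..., y(t-d), x(t-1), ..., x(t-d)), where the nonlinear map f is realized by a feedforward network. The following Python sketch (scikit-learn) is not the authors' implementation; it only illustrates the idea on a synthetic MPEG-like trace, using the I/P/B frame-type sequence as the exogenous input. The GOP pattern, frame sizes, and network parameters are all illustrative assumptions.

# NARX-style one-step prediction of video frame sizes (illustrative sketch).
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Synthetic trace: 12-frame GOP "IBBPBBPBBPBB"; mean sizes in kbit are assumed values.
gop = "IBBPBBPBBPBB"
mean_size = {"I": 8.0, "P": 3.0, "B": 1.2}
frame_types = [gop[i % len(gop)] for i in range(2000)]
sizes = np.array([mean_size[t] * (1.0 + 0.2 * rng.standard_normal()) for t in frame_types])
exo = np.array([{"I": 2.0, "P": 1.0, "B": 0.0}[t] for t in frame_types])  # frame-type code

d = 12  # number of input delays: one full GOP

# Regression matrix for y(t) = f(y(t-1..t-d), x(t-1..t-d)).
X = np.array([np.concatenate([sizes[t - d:t], exo[t - d:t]]) for t in range(d, len(sizes))])
y = sizes[d:]

split = int(0.8 * len(y))
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X[:split], y[:split])    # train on the first 80% of the trace
pred = model.predict(X[split:])    # one-step-ahead prediction on the rest
print("mean relative error:", np.mean(np.abs(pred - y[split:]) / y[split:]))

Feeding the frame type in as the exogenous input exploits the strong GOP periodicity of MPEG traffic; per the abstract, the paper's frame-separation algorithm goes further by treating the different frame types separately.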
ISSN: 1844-6035, 2067-2128