No-reference screen content video quality assessment
Published in | Displays Vol. 69; p. 102030 |
---|---|
Format | Journal Article |
Language | English |
Published | Elsevier B.V, 01.09.2021 |
Summary: | •A no-reference VQA model designed specifically for screen content videos is proposed.•In this model, video quality is aggregated from a multi-scale approach with extraction of several groups of features.•Through training with labeled videos, the model maps the frame feature vectors to quality scores via SVR.•Experiments show that our proposed model outperforms the existing full- and no-reference quality evaluation metrics.
How to effectively and accurately measure the degradation of media content is an important research topic in the field of image and video processing. Application scenarios such as online meetings, distance learning, and live game streaming have made screen content video a hot spot in Video Quality Assessment (VQA) research. However, to the best of our knowledge, there is currently no no-reference VQA model designed specifically for screen content videos. In this paper, we propose a blind VQA model for screen content videos. This model first uses a multi-scale approach to extract several groups of features, including gradient features, relative standard deviation features, compression features, frequency domain features, and inter-frame features. Through training with labeled videos, the model then uses a support vector regressor (SVR) to map the frame feature vectors to video quality scores. We validate the model on the CSCVQ database. Experiments show that our proposed model outperforms the existing full- and no-reference quality evaluation metrics and is also competitive in terms of stability and computational efficiency. |
---|---|
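The abstract's pipeline (extract per-frame features, then map feature vectors to quality scores with a trained SVR) can be sketched roughly as follows. This is a minimal, standard-library-only illustration of two of the named feature groups, mean gradient magnitude and relative standard deviation, not the authors' implementation; the `frame_features` helper is hypothetical.

```python
# Illustrative sketch (not the paper's code): compute two of the
# per-frame feature groups named in the abstract. In the proposed
# model, vectors like these would be fed to an SVR trained on
# labeled videos to predict a quality score.
from statistics import mean, pstdev

def frame_features(frame):
    """frame: 2-D list of grayscale pixel intensities."""
    h, w = len(frame), len(frame[0])
    # Mean absolute gradient via simple forward differences:
    # large values indicate strong edges, common in screen content.
    grads = []
    for y in range(h):
        for x in range(w):
            if x + 1 < w:
                grads.append(abs(frame[y][x + 1] - frame[y][x]))
            if y + 1 < h:
                grads.append(abs(frame[y + 1][x] - frame[y][x]))
    pixels = [p for row in frame for p in row]
    mu = mean(pixels)
    # Relative standard deviation: dispersion normalized by the mean.
    rsd = pstdev(pixels) / mu if mu else 0.0
    return [mean(grads), rsd]

flat = [[100, 100], [100, 100]]  # uniform patch: no edges
edge = [[0, 255], [0, 255]]      # strong vertical edge
print(frame_features(flat))  # [0, 0.0]
print(frame_features(edge))  # [127.5, 1.0]
```

In the full model, feature vectors from multiple scales and multiple frames would be aggregated before the SVR regression step.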
ISSN: | 0141-9382 1872-7387 |
DOI: | 10.1016/j.displa.2021.102030 |