FR-UGC: A Novel Full-reference Quality Metric for UGC Transcoding

Zihao Qi

Shan Liu

Xiaozhong Xu

University of Bristol

Abstract

Unlike conventional video coding of pristine professional content, the delivery pipeline of User Generated Content (UGC) involves transcoding, where unpristine reference videos must be compressed repeatedly. In this work, we find that existing full-/no-reference quality metrics often assume that reference sequences are free of coding-induced distortions. These approaches therefore cannot accurately predict the perceptual quality difference between transcoded UGC content and its unpristine reference, and are thus unsuited to guiding the rate-distortion optimisation process in transcoding systems. In this context, we propose a bespoke full-reference deep video quality metric for UGC transcoding. The proposed method features a transcoding-specific weakly supervised training strategy employing a quality ranking-based Siamese structure. The proposed method is evaluated on the YouTube-UGC VP9 subset and the LIVE-Wild database, achieving state-of-the-art performance compared with existing VQA methods.
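The quality ranking-based Siamese training mentioned above is commonly realised as a pairwise margin ranking loss: two weight-sharing branches each score one transcoded variant of the same reference, and the loss pushes the predicted score of the variant known (from weak labels, e.g. relative compression levels) to be of higher quality above the score of the lower-quality one by a margin. A minimal sketch in plain Python; the margin value, variable names, and loss form are illustrative assumptions, not necessarily the authors' exact formulation:

```python
def margin_ranking_loss(score_hi, score_lo, margin=1.0):
    """Hinge loss encouraging score_hi > score_lo + margin.

    score_hi / score_lo: scalar quality scores predicted by the two
    Siamese branches for the variant weakly labelled as higher /
    lower quality. Loss is zero once the ranking holds by `margin`.
    """
    return max(0.0, margin - (score_hi - score_lo))

# Example batch of three ranked pairs (scores are placeholders;
# in training they would come from the shared-weight network).
scores_hi = [0.9, 0.4, 0.7]
scores_lo = [0.2, 0.5, 0.1]
losses = [margin_ranking_loss(h, l) for h, l in zip(scores_hi, scores_lo)]
mean_loss = sum(losses) / len(losses)
```

Note that the second pair is mis-ranked (0.4 < 0.5), so it incurs a loss above the margin, while correctly ordered pairs contribute less or nothing.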


Generation of FR-UGC training data.

The training dataset is generated in two steps. First, pristine source sequences are compressed with H.264 to obtain distorted references. Then, each distorted reference is further compressed by 4 popular codecs into 12 transcoded variants.
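The two-step generation above can be sketched as follows. The specific codec set, quality levels, and ffmpeg options are illustrative assumptions (the text names H.264 for step one but does not specify the four transcoding codecs or their settings here); the snippet only builds the command lines rather than executing them:

```python
from pathlib import Path

def build_transcoding_commands(source, out_dir):
    """Build ffmpeg command lines for the two-step pipeline:
    step 1 compresses the pristine source with H.264 into a
    distorted reference; step 2 transcodes that reference with
    4 codecs x 3 quality levels = 12 variants.
    """
    source, out_dir = Path(source), Path(out_dir)
    reference = out_dir / f"{source.stem}_ref.mp4"
    # Step 1: pristine source -> distorted H.264 reference (assumed CRF).
    step1 = ["ffmpeg", "-i", str(source), "-c:v", "libx264",
             "-crf", "23", str(reference)]
    # Step 2: distorted reference -> 12 transcoded variants.
    codecs = ["libx264", "libx265", "libvpx-vp9", "libaom-av1"]  # assumed set
    crfs = ["28", "35", "42"]                                    # assumed levels
    step2 = []
    for codec in codecs:
        for crf in crfs:
            out = out_dir / f"{source.stem}_{codec}_crf{crf}.mp4"
            # Note: some encoders (e.g. libvpx-vp9) need extra flags
            # such as "-b:v 0" for CRF mode; omitted in this sketch.
            step2.append(["ffmpeg", "-i", str(reference), "-c:v", codec,
                          "-crf", crf, str(out)])
    return step1, step2

step1, step2 = build_transcoding_commands("clip.mp4", "out")
```

Running the returned commands (e.g. via `subprocess.run`) would materialise one distorted reference and its 12 transcoded variants per source clip.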