IEEE Access (Jan 2019)

3D No-Reference Image Quality Assessment via Transfer Learning and Saliency-Guided Feature Consolidation

  • Xiaogang Xu,
  • Bufan Shi,
  • Zijin Gu,
  • Ruizhe Deng,
  • Xiaodong Chen,
  • Andrey S. Krylov,
  • Yong Ding

DOI
https://doi.org/10.1109/ACCESS.2019.2925084
Journal volume & issue
Vol. 7
pp. 85286 – 85297

Abstract


Motivated by the success of convolutional neural networks (CNNs) in image-related applications, in this paper we design an effective method for no-reference 3D image quality assessment (3D IQA) through a CNN-based feature extraction and consolidation strategy. In the first and most vital stage, quality-aware features, which reflect the inherent quality of images, are extracted by a fine-tuned CNN model exploiting the concept of transfer learning. This fine-tuning strategy alleviates the dependence on large-scale training data that limits current deep-learning-based IQA algorithms. In the second stage, features from the left and right views are consolidated by linear weighted fusion, where the weight for each image is obtained from its saliency map. In addition, multi-scale statistical characteristics of the disparity map are included as additional features. In the final stage, quality mapping, the objective score for each stereoscopic pair is obtained by support vector regression. Experimental results on public databases show that our approach outperforms many existing no-reference and even full-reference methods.
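The sketch below illustrates the second and third stages described in the abstract: saliency-weighted linear fusion of per-view CNN features, concatenation with multi-scale disparity statistics, and quality mapping via support vector regression. The function names, the use of mean saliency as the per-view weight, the specific disparity statistics, and the RBF-kernel SVR settings are illustrative assumptions rather than the authors' exact configuration.

```python
# A minimal sketch, assuming mean saliency as the fusion weight and
# simple subsampling for the multi-scale disparity statistics.
import numpy as np
from sklearn.svm import SVR

def fuse_view_features(feat_left, feat_right, sal_left, sal_right):
    """Linearly fuse left/right CNN feature vectors, weighting each view
    by the normalized mean of its saliency map (illustrative choice)."""
    w_left, w_right = sal_left.mean(), sal_right.mean()
    return (w_left * feat_left + w_right * feat_right) / (w_left + w_right)

def disparity_statistics(disparity, scales=(1, 2, 4)):
    """Multi-scale statistics (mean, std, third central moment) of the
    disparity map; the exact statistics used in the paper may differ."""
    stats = []
    for s in scales:
        d = disparity[::s, ::s]  # coarse multi-scale via subsampling
        stats.extend([d.mean(), d.std(), np.mean((d - d.mean()) ** 3)])
    return np.asarray(stats)

# Toy usage with random arrays standing in for real CNN features,
# saliency maps, and disparity maps.
rng = np.random.default_rng(0)
n_pairs, feat_dim = 40, 128
X, y = [], rng.uniform(0, 100, n_pairs)  # y: subjective quality scores
for _ in range(n_pairs):
    fl, fr = rng.normal(size=feat_dim), rng.normal(size=feat_dim)
    sl, sr = rng.random((32, 32)), rng.random((32, 32))
    disp = rng.normal(size=(64, 64))
    fused = fuse_view_features(fl, fr, sl, sr)
    X.append(np.concatenate([fused, disparity_statistics(disp)]))
X = np.stack(X)

# Quality mapping: SVR from consolidated features to an objective score.
model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X, y)
predicted = model.predict(X[:3])
```

In practice the per-pair feature vectors would come from the fine-tuned CNN and a disparity estimation step, and the SVR hyperparameters would be selected by cross-validation on the target IQA database.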

Keywords