IEEE Access (Jan 2020)

Dual Head Network for No-Reference Quality Assessment Towards Realistic Night-Time Images

  • Bowen Li,
  • Xianpei Wang,
  • Weixia Zhang,
  • Meng Tian,
  • Hongtai Yao

DOI
https://doi.org/10.1109/ACCESS.2020.3020750
Journal volume & issue
Vol. 8
pp. 158585–158599

Abstract


No-reference image quality assessment (NR-IQA), which aims to predict image quality without relying on a pristine reference counterpart, has developed rapidly in recent years. However, little work has been dedicated to the quality assessment of realistic night-time images. Existing NR-IQA algorithms struggle with this night-time scenario because such images usually suffer from complicated authentic distortions such as low contrast, blurred details, and reduced visibility. In this paper, we propose an end-to-end NR-IQA model based on a multi-stream deep convolutional neural network (DCNN) to meet this challenge. Two streams, a brightness-aware CNN and a naturalness-aware CNN, are constructed to improve quality-aware initializations: the former through a brightness-altered image identification task on a self-established dataset, and the latter through a quality-prediction regression task on an existing authentically-distorted IQA dataset. Given the quick convergence and small changes in the lower layers, a shallow-layer-shared architecture is explored to reduce computational cost. Finally, the features of the two pipelines are aggregated by an effective pooling method and concatenated to form the image representation used for fine-tuning. The effectiveness and efficiency of the proposed method are verified by several experiments on the NNID, CCRIQ, and LIVE Challenge databases. Furthermore, its applicability to wider scenarios, such as contrast-distorted images and driving scenes, is demonstrated on the CID2013, CCID2014, and BDD-100k databases.
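To make the described architecture concrete, the following is a minimal sketch of a two-stream network with shared shallow layers, stream-wise pooling, feature concatenation, and a regression head. It assumes a generic PyTorch implementation; the class name DualStreamNRIQA, all layer widths and depths, and the pooling choice are illustrative assumptions, not the paper's exact design.

```python
import torch
import torch.nn as nn


class DualStreamNRIQA(nn.Module):
    """Illustrative two-stream NR-IQA network with shared shallow layers.

    Layer sizes are hypothetical; the paper's exact architecture may differ.
    """

    def __init__(self):
        super().__init__()
        # Shallow layers shared by both streams (reduces computation, since
        # early filters change little between the two pre-training tasks).
        self.shared = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(32, 32, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
        )
        # Brightness-aware stream (pre-trained on brightness-altered image
        # identification in the paper).
        self.brightness_stream = self._make_stream()
        # Naturalness-aware stream (pre-trained on quality regression over an
        # authentically-distorted IQA dataset in the paper).
        self.naturalness_stream = self._make_stream()
        # Regression head on the concatenated pooled features.
        self.regressor = nn.Sequential(
            nn.Linear(2 * 128, 64), nn.ReLU(inplace=True),
            nn.Linear(64, 1),
        )

    @staticmethod
    def _make_stream():
        return nn.Sequential(
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, kernel_size=3, padding=1), nn.ReLU(inplace=True),
            nn.AdaptiveAvgPool2d(1),  # global average pooling per stream
        )

    def forward(self, x):
        shared = self.shared(x)
        fb = self.brightness_stream(shared).flatten(1)   # (N, 128)
        fn = self.naturalness_stream(shared).flatten(1)  # (N, 128)
        feats = torch.cat([fb, fn], dim=1)               # concatenated image representation
        return self.regressor(feats).squeeze(1)          # predicted quality score


if __name__ == "__main__":
    model = DualStreamNRIQA()
    scores = model(torch.randn(4, 3, 224, 224))
    print(scores.shape)  # torch.Size([4])
```

In such a setup, each stream would first be trained on its auxiliary task, after which the whole network (shared layers, both streams, and the regression head) is fine-tuned end-to-end on night-time IQA data.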

Keywords