Utilizing binocular vision to facilitate completely blind 3D image quality measurement

Cited by: 22
Authors
Zhou, Wujie [1 ,2 ]
Yu, Lu [2 ]
Qiu, Weiwei [1 ]
Luo, Ting [3 ]
Wang, Zhongpeng [1 ]
Wu, Ming-Wei [1 ]
Affiliations
[1] Zhejiang Univ Sci & Technol, Sch Informat & Elect Engn, Hangzhou 310023, Zhejiang, Peoples R China
[2] Zhejiang Univ, Coll Informat Sci & Elect Engn, Hangzhou 310027, Zhejiang, Peoples R China
[3] Ningbo Univ, Coll Sci & Technol, Ningbo 315211, Zhejiang, Peoples R China
Funding
National Natural Science Foundation of China; China Postdoctoral Science Foundation;
Keywords
3D image quality measurement; Binocular vision; Completely blind; Pristine multivariate Gaussian model; NATURAL IMAGES; FUSION; EMERGENCE; MODELS;
DOI
10.1016/j.sigpro.2016.06.005
Chinese Library Classification (CLC)
TM [Electrical engineering]; TN [Electronic and communication technology];
Discipline classification codes
0808 ; 0809 ;
Abstract
In the field of practical three-dimensional (3D) applications, blind measurement of the perceptual quality of distorted 3D images remains a challenging research topic. In this paper, we propose a completely blind 3D image quality measurement (IQM) metric that exploits a binocular vision mechanism to better align with human perception. The study draws its inspiration from visual processing in the primary visual cortex (V1) and the higher visual area V2 of binocular vision to facilitate blind 3D-IQM. Furthermore, the proposed metric requires neither distorted samples nor human subjective opinion scores for training. More specifically, binocular quality-predictive features for areas V1 and V2 are first extracted from a corpus of pristine natural 3D images. A pristine multivariate Gaussian (MVG) model is then fitted to the extracted features. Finally, the quality of a distorted 3D image is measured as the Mahalanobis distance between its features and the trained MVG model. Experimental results on two public benchmark 3D databases show that, compared with current state-of-the-art IQM metrics, the proposed metric achieves excellent prediction performance. (C) 2016 Elsevier B.V. All rights reserved.
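To make the scoring pipeline concrete, the following minimal Python sketch illustrates the last two steps described in the abstract: fitting a pristine MVG to a set of feature vectors and scoring a distorted image via a Mahalanobis-type distance. It is not the authors' implementation; the V1/V2 binocular feature extraction is omitted and replaced by random stand-in arrays, and the function names (fit_mvg, mahalanobis_quality) as well as the NIQE-style pooled-covariance form of the distance are assumptions made for illustration.

```python
import numpy as np

def fit_mvg(features):
    """Fit a multivariate Gaussian (mean vector, covariance matrix)
    to feature vectors, one row per sample/patch."""
    mu = features.mean(axis=0)
    sigma = np.cov(features, rowvar=False)
    return mu, sigma

def mahalanobis_quality(mu_pristine, sigma_pristine, mu_test, sigma_test):
    """Mahalanobis-type distance between the pristine MVG and the MVG
    fitted to a test image's features (NIQE-style pooled covariance,
    assumed here). A larger distance means lower predicted quality."""
    diff = mu_pristine - mu_test
    pooled = (sigma_pristine + sigma_test) / 2.0
    # Pseudo-inverse guards against a singular pooled covariance.
    return float(np.sqrt(diff @ np.linalg.pinv(pooled) @ diff))

# Toy usage with hypothetical stand-in features (real V1/V2 binocular
# features from pristine and distorted stereopairs would go here).
rng = np.random.default_rng(0)
pristine_feats = rng.normal(size=(500, 36))      # pristine corpus features
test_feats = rng.normal(loc=0.3, size=(80, 36))  # one distorted image's features

mu_p, sig_p = fit_mvg(pristine_feats)
mu_t, sig_t = fit_mvg(test_feats)
print("quality score:", mahalanobis_quality(mu_p, sig_p, mu_t, sig_t))
```

In completely blind metrics of this kind, only the pristine corpus is needed for training, which matches the claim above that no distorted samples or subjective opinion scores are required.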
Pages: 130-136
Page count: 7