
Author:

Gu, Ke (顾锞) | Qiao, Junfei (乔俊飞) | Lee, Sanghoon | Liu, Hantao | Lin, Weisi | Le Callet, Patrick

Indexed by:

EI Scopus SCIE

Abstract:

This paper proposes to blindly evaluate the quality of images synthesized via a depth image-based rendering (DIBR) procedure. As a significant branch of virtual reality (VR), superior DIBR techniques provide free viewpoints in many real applications, including remote surveillance and education; however, limited effort has been devoted to measuring the performance of DIBR techniques, or equivalently the quality of DIBR-synthesized views, especially when references are unavailable. To this end, we develop a novel blind image quality assessment (IQA) method based on multiscale natural scene statistical analysis (MNSS). The design of the proposed MNSS metric rests on two new natural scene statistics (NSS) models specific to DIBR-synthesized IQA. First, the DIBR-introduced geometric distortions damage the local self-similarity characteristic of natural images, and the degree of this damage varies in a characteristic way across scales. Systematically combining measurements of these variations gauges the naturalness of the input image and thus indirectly reflects the quality changes among images generated by different DIBR methods. Second, it was found that the degradation of main structures in natural images remains almost the same across scales, whereas this statistical regularity is destroyed in DIBR-synthesized views. Estimating the deviation, at different scales, between the main-structure degradation of a DIBR-synthesized image and a statistical model built from a large number of natural images quantifies how strongly a DIBR method damages the main structures and thus infers the image quality. Experiments show that the two NSS-based features extracted above predict the quality of DIBR-synthesized images well. Furthermore, since the two features capture distinct points of view, we integrate them via a straightforward multiplication to derive the proposed blind MNSS metric, which achieves better performance than each component alone and than state-of-the-art quality metrics.
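
The sketch below illustrates, in simplified form, the two ideas described in the abstract: measuring how geometric distortions degrade local self-similarity across scales, and comparing cross-scale main-structure degradation against a natural-image statistic, with the two features fused by multiplication. It is not the authors' MNSS implementation; every function name, the scale set, the patch size, and the placeholder constant natural_structure_ratio are assumptions made purely for illustration.

    # Minimal illustrative sketch (not the authors' MNSS implementation).
    import numpy as np

    def downsample(img, factor):
        """Naive block-average downsampling by an integer factor."""
        h, w = img.shape
        h2, w2 = h // factor * factor, w // factor * factor
        img = img[:h2, :w2]
        return img.reshape(h2 // factor, factor, w2 // factor, factor).mean(axis=(1, 3))

    def self_similarity_score(img, patch=8):
        """Crude local self-similarity: mean correlation between each patch and its
        right-hand neighbour. Natural images tend to score high; DIBR geometric
        distortions (stretching, disoccluded holes) lower the score."""
        h, w = img.shape
        scores = []
        for y in range(0, h - patch, patch):
            for x in range(0, w - 2 * patch, patch):
                a = img[y:y + patch, x:x + patch].ravel()
                b = img[y:y + patch, x + patch:x + 2 * patch].ravel()
                if a.std() > 1e-6 and b.std() > 1e-6:
                    scores.append(np.corrcoef(a, b)[0, 1])
        return float(np.mean(scores)) if scores else 0.0

    def gradient_energy(img):
        """Proxy for main-structure strength: mean gradient magnitude."""
        gy, gx = np.gradient(img)
        return float(np.mean(np.hypot(gx, gy)))

    def mnss_like_score(img, scales=(1, 2, 4), natural_structure_ratio=1.0):
        """Hypothetical blind score in the spirit of MNSS:
        feature 1 gauges cross-scale self-similarity naturalness;
        feature 2 gauges how far cross-scale structure degradation deviates from a
        statistic of natural images (here a placeholder constant, not a model
        fitted on real natural-image data); the two are fused by multiplication."""
        sims = [self_similarity_score(downsample(img, s)) for s in scales]
        f1 = float(np.mean(sims))                       # naturalness of self-similarity

        energies = [gradient_energy(downsample(img, s)) for s in scales]
        ratios = np.array(energies[1:]) / (energies[0] + 1e-6)
        f2 = 1.0 / (1.0 + float(np.mean(np.abs(ratios - natural_structure_ratio))))

        return f1 * f2                                  # straightforward multiplicative fusion

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        img = rng.random((256, 256))    # stand-in for a DIBR-synthesized view
        print("blind quality score:", mnss_like_score(img))

In the actual MNSS metric the second feature is computed against a statistical model constructed from a large corpus of natural images; the constant used above merely marks where that model would plug in.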

Keyword:

Feature extraction; no-reference (NR); statistical analysis; image quality assessment (IQA); distortion; image quality; multiscale natural scene statistical analysis; rendering (computer graphics); measurement; depth image-based rendering (DIBR); degradation; blind

Author Community:

  • [ 1 ] [Gu, Ke]Beijing Univ Technol, Beijing Key Lab Computat Intelligence & Intellige, Fac Informat Technol, Beijing Adv Innovat Ctr Future Internet Technol, Beijing 100124, Peoples R China
  • [ 2 ] [Qiao, Junfei]Beijing Univ Technol, Beijing Key Lab Computat Intelligence & Intellige, Fac Informat Technol, Beijing Adv Innovat Ctr Future Internet Technol, Beijing 100124, Peoples R China
  • [ 3 ] [Lee, Sanghoon]Yonsei Univ, Dept Elect & Elect Engn, Seoul 03722, South Korea
  • [ 4 ] [Liu, Hantao]Cardiff Univ, Sch Comp Sci & Informat, Cardiff CF24 3AA, Wales
  • [ 5 ] [Lin, Weisi]Nanyang Technol Univ, Sch Comp Sci & Engn, Singapore 639798, Singapore
  • [ 6 ] [Le Callet, Patrick]Univ Nantes, LUNAM Univ, Polytech Nantes, CNRS, IRCCyN, UMR 6597, Nantes, France

Reprint Author's Address:

  • Gu, Ke (顾锞)

    [Gu, Ke]Beijing Univ Technol, Beijing Key Lab Computat Intelligence & Intellige, Fac Informat Technol, Beijing Adv Innovat Ctr Future Internet Technol, Beijing 100124, Peoples R China


Source:

IEEE TRANSACTIONS ON BROADCASTING

ISSN: 0018-9316

Year: 2020

Issue: 1

Volume: 66

Page: 127-139

Impact Factor: 4.500 (JCR@2022)

ESI Discipline: COMPUTER SCIENCE;

ESI HC Threshold: 132

Cited Count:

WoS CC Cited Count: 62

SCOPUS Cited Count: 73

ESI Highly Cited Papers on the List: 0

