
Author:

Hu, Runze | Liu, Yutao | Gu, Ke (Scholars: Gu Ke) | Min, Xiongkuo | Zhai, Guangtao

Indexed by:

EI Scopus SCIE

Abstract:

Existing no-reference (NR) image quality assessment (IQA) metrics remain unconvincing for evaluating the quality of camera-captured images. To tackle this issue, in this article we establish a novel NR quality metric that reliably quantifies the quality of camera-captured images. Since image quality is perceived hierarchically in the human brain, from low-level preliminary visual perception to high-level semantic comprehension, the proposed metric characterizes image quality by exploiting both low-level image properties and high-level image semantics. Specifically, we extract a series of low-level features that characterize fundamental image properties, including brightness, saturation, contrast, noiseness, sharpness, and naturalness, all of which are highly indicative of camera-captured image quality. Correspondingly, high-level features are designed to characterize the semantics of the image. The low-level and high-level perceptual features play complementary roles in measuring image quality. To infer the final score, we employ support vector regression (SVR) to map all of the informative features to a single quality score. Thorough tests on two standard camera-captured image databases demonstrate the effectiveness of the proposed metric in assessing image quality and its superiority over state-of-the-art NR quality metrics. The source code of the proposed metric for camera-captured images is released at https://github.com/YT2015?tab=repositories.
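The pipeline the abstract describes, hand-crafted low-level features regressed to a single quality score by SVR, can be sketched as follows. The feature definitions, the synthetic images, and the stand-in MOS labels below are illustrative assumptions, not the authors' exact implementation:

```python
# Sketch of the abstract's pipeline: extract low-level image properties
# (brightness, contrast, saturation, a sharpness proxy) and map them to a
# quality score with support vector regression (SVR).
import numpy as np
from sklearn.svm import SVR

def low_level_features(img):
    """img: HxWx3 float array in [0, 1]. Returns a 4-element feature vector."""
    r, g, b = img[..., 0], img[..., 1], img[..., 2]
    luma = 0.299 * r + 0.587 * g + 0.114 * b
    brightness = luma.mean()
    contrast = luma.std()
    saturation = (img.max(axis=-1) - img.min(axis=-1)).mean()
    # Sharpness proxy: variance of a discrete Laplacian of the luma channel.
    lap = (np.roll(luma, 1, 0) + np.roll(luma, -1, 0)
           + np.roll(luma, 1, 1) + np.roll(luma, -1, 1) - 4.0 * luma)
    sharpness = lap.var()
    return np.array([brightness, contrast, saturation, sharpness])

rng = np.random.default_rng(0)
imgs = rng.random((20, 32, 32, 3))      # stand-in "camera-captured" images
X = np.stack([low_level_features(im) for im in imgs])
y = rng.random(20) * 5.0                # stand-in mean-opinion-score labels
model = SVR(kernel="rbf").fit(X, y)     # map features -> quality score
score = model.predict(X[:1])[0]         # predicted quality for one image
```

In the paper these hand-crafted features are complemented by high-level semantic features from a deep network before the SVR stage; the sketch covers only the low-level half of that design.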

Keyword:

Predictive models; Blind; Measurement; Semantics; Image quality; Camera-captured image; No-reference (NR); Feature extraction; Visual perception; Visualization; Image quality assessment (IQA); Deep neural network (DNN)

Author Community:

  • [ 1 ] [Hu, Runze]Tsinghua Univ, Tsinghua Shenzhen Int Grad Sch, Dept Informat Sci & Technol, Shenzhen 518055, Peoples R China
  • [ 2 ] [Liu, Yutao]Ocean Univ China, Sch Comp Sci & Technol, Qingdao 266100, Peoples R China
  • [ 3 ] [Gu, Ke]Beijing Univ Technol, Fac Informat Technol, Beijing 100124, Peoples R China
  • [ 4 ] [Min, Xiongkuo]Shanghai Jiao Tong Univ, Inst Image Commun & Network Engn, Shanghai 200240, Peoples R China
  • [ 5 ] [Zhai, Guangtao]Shanghai Jiao Tong Univ, Inst Image Commun & Network Engn, Shanghai 200240, Peoples R China
  • [ 6 ] [Zhai, Guangtao]Shanghai Jiao Tong Univ, MoE Key Lab Artificial Intelligence, AI Inst, Shanghai 200240, Peoples R China


Source:

IEEE TRANSACTIONS ON CYBERNETICS

ISSN: 2168-2267

Year: 2021

Issue: 6

Volume: 53

Page: 3651-3664

Impact Factor: 11.800 (JCR@2022)

ESI Discipline: COMPUTER SCIENCE

ESI HC Threshold: 87

JCR Journal Grade: 1

Cited Count:

WoS CC Cited Count: 30

SCOPUS Cited Count: 21

ESI Highly Cited Papers on the List: 0

