Author:

Cheng, Bo | Zhuo, Li | Zhang, Jing

Indexed by:

CPCI-S; EI; Scopus

Abstract:

Dimensionality reduction plays a significant role in the performance of large-scale image retrieval. In this paper, various dimensionality reduction methods are compared to evaluate their performance in image retrieval. For this purpose, first, Scale Invariant Feature Transform (SIFT) features and an HSV (Hue, Saturation, Value) histogram are extracted as image features. Second, Principal Component Analysis (PCA), Fisher Linear Discriminant Analysis (FLDA), Local Fisher Discriminant Analysis (LFDA), Isometric Mapping (ISOMAP), Locally Linear Embedding (LLE), and Locality Preserving Projections (LPP) are each applied to reduce the dimensions of the SIFT feature descriptors and the color information, which are then used to generate vocabulary trees. Finally, by setting the match weights of the vocabulary trees, a large-scale image retrieval scheme is implemented. Comparison of multiple sets of experimental data from several platforms shows that the LLE and LPP dimensionality reduction methods effectively reduce the computational cost of the image features while maintaining high retrieval performance.
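
To make the described pipeline concrete, the following is a minimal Python sketch (OpenCV, NumPy, scikit-learn) of one configuration: SIFT extraction, PCA standing in for the six dimensionality reduction methods compared, visual-word quantization, and TF-IDF-weighted ranking. This is not the authors' implementation: a flat MiniBatchKMeans vocabulary replaces the hierarchical vocabulary tree, the weighted fusion of the color and SIFT vocabularies is omitted, and all parameter values (64 PCA dimensions, 1000 visual words, HSV bin counts) are illustrative assumptions.

import cv2
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import MiniBatchKMeans


def sift_descriptors(image_path):
    """Extract 128-D SIFT descriptors from a grayscale image."""
    img = cv2.imread(image_path, cv2.IMREAD_GRAYSCALE)
    sift = cv2.SIFT_create()
    _, desc = sift.detectAndCompute(img, None)
    return desc if desc is not None else np.empty((0, 128), np.float32)


def hsv_histogram(image_path, bins=(8, 4, 4)):
    """Global HSV color histogram (the complementary color feature).
    In the full scheme the color feature would feed its own vocabulary
    with a separately set match weight; that fusion is omitted here."""
    img = cv2.cvtColor(cv2.imread(image_path), cv2.COLOR_BGR2HSV)
    hist = cv2.calcHist([img], [0, 1, 2], None, list(bins),
                        [0, 180, 0, 256, 0, 256])
    return cv2.normalize(hist, hist).flatten()


def build_index(image_paths, pca_dim=64, n_words=1000):
    """Reduce SIFT descriptors with PCA, quantize them into visual words
    (a flat k-means vocabulary standing in for the vocabulary tree), and
    return TF-IDF-weighted bag-of-words vectors for every image."""
    per_image = [sift_descriptors(p) for p in image_paths]
    all_desc = np.vstack([d for d in per_image if len(d)])

    pca = PCA(n_components=pca_dim).fit(all_desc)         # dimensionality reduction
    kmeans = MiniBatchKMeans(n_clusters=n_words, n_init=3).fit(pca.transform(all_desc))

    bows = np.zeros((len(image_paths), n_words))
    for i, desc in enumerate(per_image):
        if len(desc):
            words = kmeans.predict(pca.transform(desc))
            bows[i] = np.bincount(words, minlength=n_words)

    idf = np.log((1 + len(image_paths)) / (1 + (bows > 0).sum(axis=0)))
    tfidf = bows * idf
    tfidf /= np.linalg.norm(tfidf, axis=1, keepdims=True) + 1e-12
    return pca, kmeans, idf, tfidf


def query(image_path, pca, kmeans, idf, tfidf):
    """Rank database images by cosine similarity to the query image."""
    desc = sift_descriptors(image_path)
    words = kmeans.predict(pca.transform(desc))
    q = np.bincount(words, minlength=len(idf)) * idf
    q = q / (np.linalg.norm(q) + 1e-12)
    return np.argsort(-tfidf @ q)   # best-matching database indices first

Substituting another reducer with fit/transform methods (e.g. sklearn.manifold.LocallyLinearEmbedding for LLE) in place of PCA would reproduce the kind of comparison reported in the paper; FLDA, LFDA, and LPP would require third-party implementations.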

Keyword:

vocabulary tree; dimensionality reduction; Scale Invariant Feature Transform; HSV histogram; large-scale image retrieval

Author Community:

  • [ 1 ] [Cheng, Bo]Beijing Univ Technol, Signal & Informat Proc Lab, Beijing, Peoples R China
  • [ 2 ] [Zhuo, Li]Beijing Univ Technol, Signal & Informat Proc Lab, Beijing, Peoples R China
  • [ 3 ] [Zhang, Jing]Beijing Univ Technol, Signal & Informat Proc Lab, Beijing, Peoples R China

Reprint Author's Address:

  • [Cheng, Bo]Beijing Univ Technol, Signal & Informat Proc Lab, Beijing, Peoples R China

Source:

2013 IEEE INTERNATIONAL SYMPOSIUM ON MULTIMEDIA (ISM)

Year: 2013

Page: 445-450

Language: English

Cited Count:

WoS CC Cited Count: 6

SCOPUS Cited Count: 11

ESI Highly Cited Papers on the List: 0
