
Author:

Luo, Huizhang (Luo, Huizhang.) | Huang, Dan (Huang, Dan.) | Liu, Qing (Liu, Qing.) | Qiao, Zhenbo (Qiao, Zhenbo.) | Jiang, Hong (Jiang, Hong.) | Bi, Jing (Bi, Jing.) | Yuan, Haitao (Yuan, Haitao.) | Zhou, Mengchu (Zhou, Mengchu.) | Wang, Jinzhen (Wang, Jinzhen.) | Qin, Zhenlu (Qin, Zhenlu.)

Indexed by:

CPCI-S EI Scopus

Abstract:

With the high volume and velocity of scientific data produced on high-performance computing systems, it has become increasingly critical to improve compression performance. Leveraging the general tolerance of reduced accuracy in applications, lossy compressors can achieve much higher compression ratios under a user-prescribed error bound. However, they still fall far short of the reduction requirements of applications. In this paper, we propose and evaluate the idea that data need to be preconditioned prior to compression, such that they better match the design philosophies of a compressor. In particular, we aim to identify a reduced model that can be utilized to transform the original data into a more compressible form. We begin with a case study of Heat3d as a proof of concept, in which we demonstrate that a reduced model can indeed reside in the full model output and can be utilized to improve compression ratios. We further explore more general dimension reduction techniques to extract the reduced model, including principal component analysis, singular value decomposition, and discrete wavelet transform. After preconditioning, the reduced model, together with the difference between the reduced model and the full model, is stored, which results in higher compression ratios. We evaluate the reduced models on nine scientific datasets, and the results show the effectiveness of our approaches.
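The preconditioning idea described in the abstract can be illustrated with a minimal sketch in Python/NumPy: a truncated SVD acts as the reduced model, and the residual (full model minus reduced model) is what would be handed to an error-bounded lossy compressor. The rank k, the function name, and the synthetic field below are illustrative assumptions, not the paper's implementation or datasets.

import numpy as np

def svd_precondition(field, k):
    # Keep only the top-k singular components as the "reduced model";
    # the residual is the part left for the lossy compressor, and it
    # typically has a much smaller dynamic range than the original field.
    U, s, Vt = np.linalg.svd(field, full_matrices=False)
    U_k, s_k, Vt_k = U[:, :k], s[:k], Vt[:k, :]
    reduced_model = (U_k * s_k) @ Vt_k      # rank-k approximation
    residual = field - reduced_model        # difference stored alongside it
    return (U_k, s_k, Vt_k), residual

# Illustrative smooth field standing in for simulation output (not a real dataset).
x = np.linspace(0.0, 1.0, 512)
field = np.outer(np.sin(2 * np.pi * x), np.cos(4 * np.pi * x))
field += 1e-3 * np.random.default_rng(0).standard_normal(field.shape)

factors, residual = svd_precondition(field, k=8)
print("original max |value|:", np.abs(field).max())
print("residual max |value|:", np.abs(residual).max())
# Both the small factors and the low-magnitude residual would then be passed
# through an error-bounded lossy compressor, which is where the higher
# compression ratios reported in the abstract come from.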

Keyword:

High-performance computing; data precondition; data reduction; reduced model

Author Community:

  • [ 1 ] [Luo, Huizhang]New Jersey Inst Technol, Dept Elect & Comp Engn, Newark, NJ 07102 USA
  • [ 2 ] [Huang, Dan]New Jersey Inst Technol, Dept Elect & Comp Engn, Newark, NJ 07102 USA
  • [ 3 ] [Liu, Qing]New Jersey Inst Technol, Dept Elect & Comp Engn, Newark, NJ 07102 USA
  • [ 4 ] [Qiao, Zhenbo]New Jersey Inst Technol, Dept Elect & Comp Engn, Newark, NJ 07102 USA
  • [ 5 ] [Zhou, Mengchu]New Jersey Inst Technol, Dept Elect & Comp Engn, Newark, NJ 07102 USA
  • [ 6 ] [Wang, Jinzhen]New Jersey Inst Technol, Dept Elect & Comp Engn, Newark, NJ 07102 USA
  • [ 7 ] [Qin, Zhenlu]New Jersey Inst Technol, Dept Elect & Comp Engn, Newark, NJ 07102 USA
  • [ 8 ] [Jiang, Hong]Univ Texas Arlington, Dept Comp Sci & Engn, Arlington, TX 76019 USA
  • [ 9 ] [Bi, Jing]Beijing Univ Technol, Fac Informat Technol, Beijing, Peoples R China
  • [ 10 ] [Yuan, Haitao]Beijing Jiaotong Univ, Sch Software Engn, Beijing, Peoples R China

Reprint Author's Address:

  • [Luo, Huizhang]New Jersey Inst Technol, Dept Elect & Comp Engn, Newark, NJ 07102 USA


Source:

2019 IEEE 33RD INTERNATIONAL PARALLEL AND DISTRIBUTED PROCESSING SYMPOSIUM (IPDPS 2019)

ISSN: 1530-2075

Year: 2019

Page: 293-302

Language: English

Cited Count:

WoS CC Cited Count: 7

SCOPUS Cited Count: 14

ESI Highly Cited Papers on the List: 0
