Author:

Bi, J. | Ma, H. | Yuan, H. | Buyya, R. | Yang, J. | Zhang, J. | Zhou, M.

Indexed by:

EI; Scopus; SCIE

Abstract:

Resource usage prediction in cloud data centers is critically important: it can improve providers' service quality and avoid both resource wastage and insufficiency. However, time series of resource usage in cloud environments are multidimensional, nonlinear, and highly volatile, and achieving high-accuracy prediction for such series is necessary but difficult. Traditional prediction methods based on regression algorithms and recurrent neural networks cannot effectively extract nonlinear features from datasets. In addition, many deep learning models suffer from exploding or vanishing gradients during training, and commonly used prediction methods fail to uncover vital frequency-domain features of the time series. To resolve these challenges, we design a forecasting method named FISFA that integrates a Savitzky-Golay (SG) filter, a Frequency Enhanced Decomposed Transformer (FEDformer) model, and a frequency-enhanced channel attention mechanism. FISFA adopts the SG filter to reduce noise and smooth the raw resource-usage sequences. We then develop a hybrid Transformer-based model that combines FEDformer with the frequency-enhanced channel attention mechanism to effectively capture frequency-domain patterns. In addition, a metaheuristic optimization algorithm, a genetic simulated annealing-based particle swarm optimizer, is proposed to optimize key hyperparameters of FISFA. FISFA then predicts future multidimensional resource demand from highly fluctuating traces in real-life cloud environments. Experimental results on realistic datasets collected from Alibaba and Google cluster traces demonstrate that FISFA achieves higher accuracy and more efficient prediction than several benchmark forecasting methods, improving prediction accuracy on average by 32.14%, 25.49%, and 27.71% over vanilla LSTM, Transformer, and Informer methods, respectively.
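A minimal sketch (not the authors' code) of the SG-filter preprocessing step described in the abstract, assuming a Python/SciPy setting; the synthetic trace, window length, and polynomial order are illustrative assumptions:

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic, highly fluctuating CPU-usage trace standing in for a real
# cluster trace (e.g., Alibaba or Google); values are illustrative only.
rng = np.random.default_rng(0)
t = np.arange(2000)
cpu_usage = 0.5 + 0.2 * np.sin(2 * np.pi * t / 288) + 0.1 * rng.standard_normal(t.size)

# Savitzky-Golay filter: fit a low-order polynomial within a sliding window
# to suppress noise while preserving the shape of the workload pattern.
# window_length must be odd and greater than polyorder.
smoothed = savgol_filter(cpu_usage, window_length=25, polyorder=3)

# The smoothed sequence would then be fed to the FEDformer-based predictor
# with frequency-enhanced channel attention.
```

The window length and polynomial order control the trade-off between noise suppression and fidelity to sharp workload changes; in the paper, key hyperparameters of FISFA are tuned by the genetic simulated annealing-based particle swarm optimizer.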

Keyword:

Cloud computing; deep learning; Predictive models; Time series analysis; Long short term memory; Transformers; SG filter; frequency enhancement; time series prediction; Feature extraction; Forecasting

Author Community:

  • [ 1 ] [Bi J.] School of Software Engineering in the Faculty of Information Technology, Beijing University of Technology, Beijing, China
  • [ 2 ] [Ma H.] School of Software Engineering in the Faculty of Information Technology, Beijing University of Technology, Beijing, China
  • [ 3 ] [Yuan H.] School of Automation Science and Electrical Engineering, Beihang University, Beijing, China
  • [ 4 ] [Buyya R.] School of Computing and Information Systems, Cloud Computing and Distributed Systems (CLOUDS) Lab, University of Melbourne, Melbourne, VIC, Australia
  • [ 5 ] [Yang J.] CSSC Systems Engineering Research Institute, Beijing, China
  • [ 6 ] [Zhang J.] Department of Computer Science, Southern Methodist University, Dallas, TX, USA
  • [ 7 ] [Zhou M.] Department of Electrical and Computer Engineering, New Jersey Institute of Technology, Newark, NJ, USA

Source:

IEEE Internet of Things Journal

ISSN: 2327-4662

Year: 2024

Issue: 15

Volume: 11

Page: 1-1

Impact Factor: 10.600 (JCR@2022)

Cited Count:

SCOPUS Cited Count: 6

ESI Highly Cited Papers on the List: 0
