Abstract:
Since 2022, AIGC, represented by foundation models such as ChatGPT, GPT-4, Sora, and GPT-4o, has been advancing rapidly. Generative AI foundation model technology is iterating quickly and continuously evolving, becoming a revolutionary tool for content generation, knowledge production, and human-computer interaction. As the number of parameters in foundation models grows and deep learning algorithms become more complex, the demand for computing power increases further. Massive data and large-scale parameters produce extremely large computational loads that exceed the storage and compute capacity of a single server. Training a foundation model with billions of parameters requires tens of thousands of GPU cards computing synchronously, and high-performance computing power networks have become the primary means of meeting this demand. This paper reviews the development of intelligent computing, examines the demand for computing power in the era of large models, and on this basis analyzes computing power, computing power networks, and the technology ecosystem, along with their related technologies. © 2024 IEEE.
Year: 2024
Page: 245-252
Language: English
ESI Highly Cited Papers on the List: 0