
Author:

WANG Xiaoxi | WU Wenjun | YANG Feng | SI Pengbo | ZHANG Xuanyi | ZHANG Yanhua

Abstract:

With the emergence of various intelligent applications, machine learning technologies face many practical challenges, including large-scale models, application-oriented real-time datasets, and the limited capabilities of individual nodes. Therefore, distributed machine learning (DML) and semi-supervised learning methods, which help address these problems, have attracted attention in both academia and industry. In this paper, the semi-supervised learning method is combined with the data-parallel DML framework. The pseudo-label based local loss function for each distributed node is studied, and the stochastic gradient descent (SGD) based distributed parameter update principle is derived. A demo that implements pseudo-label based semi-supervised learning in the DML framework is conducted, and the CIFAR-10 dataset for target classification is used to evaluate the performance. Experimental results confirm the convergence and the accuracy of the model trained with pseudo-label based semi-supervised learning in the DML framework. When the proportion of the pseudo-label dataset is 20%, the accuracy of the model is over 90% as long as the number of local parameter update steps between two global aggregations is less than 5. Moreover, with the global aggregation interval fixed to 3, the model converges with acceptable performance degradation when the proportion of the pseudo-label dataset varies from 20% to 80%.
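The training loop described above can be sketched as follows. This is a minimal illustration rather than the authors' implementation: it assumes PyTorch, hard pseudo-labels taken as the argmax of the current model's predictions, an unlabeled-loss weight `alpha`, equal-weight parameter averaging at every global aggregation, and per-node iterators that yield labeled and unlabeled mini-batches; the exact SGD-based update rule derived in the paper may differ. Setting `tau=3` mirrors the aggregation interval used in the reported experiments.

```python
# Sketch only (not the paper's code): pseudo-label semi-supervised learning
# combined with data-parallel DML via local SGD steps and periodic parameter
# averaging. The model, data iterators, `alpha`, and the averaging scheme are
# illustrative assumptions.
import copy
import torch
import torch.nn.functional as F


def local_loss(model, labeled_batch, unlabeled_batch, alpha=1.0):
    """Per-node loss: supervised CE plus CE against hard pseudo-labels."""
    x_l, y_l = labeled_batch
    sup = F.cross_entropy(model(x_l), y_l)

    x_u = unlabeled_batch
    with torch.no_grad():
        pseudo = model(x_u).argmax(dim=1)        # hard pseudo-labels
    unsup = F.cross_entropy(model(x_u), pseudo)

    return sup + alpha * unsup


def global_aggregate(models):
    """Equal-weight averaging of all local model parameters."""
    avg = copy.deepcopy(models[0].state_dict())
    for key in avg:
        avg[key] = torch.stack(
            [m.state_dict()[key].float() for m in models]
        ).mean(dim=0)
    for m in models:
        m.load_state_dict(avg)


def train(models, loaders, rounds=100, tau=3, lr=0.01, alpha=1.0):
    """Run `tau` local SGD steps on every node between two global aggregations.

    `loaders` is a list of (labeled_iter, unlabeled_iter) pairs, one per node;
    each iterator yields mini-batches via next().
    """
    opts = [torch.optim.SGD(m.parameters(), lr=lr) for m in models]
    for _ in range(rounds):
        for m, opt, (labeled, unlabeled) in zip(models, opts, loaders):
            for _ in range(tau):
                opt.zero_grad()
                loss = local_loss(m, next(labeled), next(unlabeled), alpha)
                loss.backward()
                opt.step()
        global_aggregate(models)
```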

Author Community:

  • [ 1 ] [WANG Xiaoxi]Beijing University of Technology
  • [ 2 ] [ZHANG Xuanyi]Beijing Capital International Airport Co.,Ltd.,Beijing 101317,P.R.China
  • [ 3 ] [WU Wenjun]Beijing University of Technology
  • [ 4 ] [YANG Feng]Beijing University of Technology
  • [ 5 ] [ZHANG Yanhua]Beijing University of Technology
  • [ 6 ] [SI Pengbo]Beijing University of Technology



Source :

High Technology Letters

ISSN: 1006-6748

Year: 2022

Issue: 2

Volume: 28

Page: 172-180

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count:

ESI Highly Cited Papers on the List: 0

WanFang Cited Count: -1

Chinese Cited Count:


