Abstract:
This paper proposes a learning algorithm called Semi-supervised Online Sequential ELM, denoted SOS-ELM. It targets streaming-data applications by learning only from the newly arrived observations, called a chunk. In addition, SOS-ELM can utilize both labeled and unlabeled training data by combining the advantages of two existing algorithms: Online Sequential ELM (OS-ELM) and Semi-Supervised ELM (SS-ELM). The rationale behind the algorithm is to exploit an optimality condition that reduces both the empirical risk and the structural risk used by SS-ELM, in combination with block-wise matrix calculation similar to OS-ELM. An efficient implementation of SOS-ELM is made viable by the additional assumption that the structural relationship between chunks arriving at different times is negligible. Experiments have been performed on standard benchmark problems for regression, balanced binary classification, unbalanced binary classification and multi-class classification, comparing the proposed SOS-ELM with OS-ELM and SS-ELM. The experimental results show that SOS-ELM outperforms OS-ELM in generalization performance with similar training speed, and in addition outperforms SS-ELM with much lower supervision overhead. (C) 2015 Elsevier B.V. All rights reserved.
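To make the block-wise updating idea mentioned in the abstract concrete, below is a minimal, illustrative Python/NumPy sketch of a standard OS-ELM-style recursive least-squares learner. It is not the paper's SOS-ELM: the class name, sigmoid activation and regularization value are assumptions for illustration, and the per-chunk structural-risk (graph-Laplacian) term that SS-ELM/SOS-ELM add to these updates is intentionally omitted.

import numpy as np

class OSELMSketch:
    """Minimal OS-ELM-style sequential learner (illustrative only).

    Hidden layer: fixed random weights W and biases b with a sigmoid
    activation. Output weights beta are refined chunk by chunk with a
    recursive least-squares update; the paper's SOS-ELM would fold an
    additional per-chunk structural-risk term into these updates.
    """

    def __init__(self, n_inputs, n_hidden, n_outputs, seed=0):
        rng = np.random.default_rng(seed)
        self.W = rng.standard_normal((n_inputs, n_hidden))
        self.b = rng.standard_normal(n_hidden)
        self.beta = np.zeros((n_hidden, n_outputs))
        self.P = None  # running inverse of (H^T H + reg * I)

    def _hidden(self, X):
        # Random-feature hidden-layer output H for input chunk X.
        return 1.0 / (1.0 + np.exp(-(X @ self.W + self.b)))

    def fit_initial_chunk(self, X0, T0, reg=1e-3):
        # Batch solution on the first chunk:
        # beta = (H0^T H0 + reg * I)^-1 H0^T T0
        H0 = self._hidden(X0)
        self.P = np.linalg.inv(H0.T @ H0 + reg * np.eye(H0.shape[1]))
        self.beta = self.P @ H0.T @ T0

    def update_chunk(self, X, T):
        # Recursive least-squares update using only the new chunk (X, T).
        H = self._hidden(X)
        K = np.linalg.inv(np.eye(H.shape[0]) + H @ self.P @ H.T)
        self.P = self.P - self.P @ H.T @ K @ H @ self.P
        self.beta = self.beta + self.P @ H.T @ (T - H @ self.beta)

    def predict(self, X):
        return self._hidden(X) @ self.beta

Used sequentially: call fit_initial_chunk on the first labeled chunk, then update_chunk for each subsequent chunk as it arrives; predict can be called at any point without revisiting earlier data.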
Source: NEUROCOMPUTING
ISSN: 0925-2312
Year: 2016
Volume: 174
Pages: 168-178
Impact Factor: 6.000 (JCR@2022)
ESI Discipline: COMPUTER SCIENCE
ESI HC Threshold: 167
CAS Journal Grade: 3
Cited Count:
WoS CC Cited Count: 40
SCOPUS Cited Count: 42
ESI Highly Cited Papers on the List: 0