Abstract:
Boosting has been extensively used in image processing. Much prior work focuses on the design or application of boosting, but training boosting on large-scale datasets tends to be ignored. To handle the large-scale problem, we present stochastic boosting (StocBoost), which relies on stochastic gradient descent (SGD) and uses one sample at each iteration. To understand the efficacy of StocBoost, the convergence of the training algorithm is analyzed theoretically. Experimental results show that StocBoost is faster than batch variants and is comparable with state-of-the-art methods. © 2013 IEEE.
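The abstract does not give the algorithm's details, but the core idea it describes (a boosting-style additive model whose combination weights are trained by SGD, touching one sample per update instead of the full batch) can be illustrated with a minimal sketch. Everything below is an illustrative assumption, not the paper's actual StocBoost: the stump construction, loss, function names, and hyperparameters are all hypothetical.

```python
# Hypothetical sketch of SGD-trained boosting: a weighted combination of
# decision stumps, with per-sample (one sample per iteration) SGD updates
# on the combination weights under logistic loss. Not the paper's method.
import numpy as np

def make_stumps(X, n_stumps, rng):
    """Randomly chosen decision stumps h(x) = sign(x[f] - threshold)."""
    feats = rng.integers(0, X.shape[1], size=n_stumps)
    thrs = np.array([rng.choice(X[:, f]) for f in feats])
    return feats, thrs

def stump_outputs(X, feats, thrs):
    """Evaluate every stump on every sample; shape (n_samples, n_stumps)."""
    return np.where(X[:, feats] > thrs, 1.0, -1.0)

def stoc_boost(X, y, n_stumps=50, lr=0.1, epochs=20, seed=0):
    """SGD over stump weights alpha: one sample per update, logistic loss."""
    rng = np.random.default_rng(seed)
    feats, thrs = make_stumps(X, n_stumps, rng)
    H = stump_outputs(X, feats, thrs)   # precomputed weak-learner outputs
    alpha = np.zeros(n_stumps)          # combination weights of the ensemble
    n = X.shape[0]
    for _ in range(epochs):
        for i in rng.permutation(n):    # one sample per iteration
            margin = y[i] * (H[i] @ alpha)
            # gradient of log(1 + exp(-margin)) w.r.t. alpha
            grad = -y[i] * H[i] / (1.0 + np.exp(margin))
            alpha -= lr * grad
    return lambda Xq: np.sign(stump_outputs(Xq, feats, thrs) @ alpha)

# Toy usage on synthetic data with labels y in {-1, +1}
rng = np.random.default_rng(1)
X = rng.normal(size=(200, 5))
y = np.sign(X[:, 0] + 0.5 * X[:, 1])
predict = stoc_boost(X, y)
print("train accuracy:", np.mean(predict(X) == y))
```

Each update costs O(n_stumps) regardless of dataset size, which is the property the abstract attributes to SGD-based training: a batch boosting round would instead scan all n samples before making a single update.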
Year: 2013
Page: 3274-3277
Language: English
ESI Highly Cited Papers on the List: 0