Abstract:
The sparse subspace clustering problem is to group a set of data points into their underlying subspaces while simultaneously correcting the underlying noise. It was shown in the recent literature that this clustering task can be characterized as a block diagonal matrix regularized nonconvex minimization problem. However, this problem is not easy to solve because it contains a nonconvex bilinear function. The earliest method, block diagonal regularization (BDR), solved only a penalized model rather than the original problem itself. The more recent accelerated block coordinate gradient descent (ABCGD) algorithm can solve the original problem efficiently, but its convergence has not been established. In this paper, we apply an accelerated gradient method (AGM) and establish its convergence to a critical point under a certain stepsize policy. We show that each subproblem admits a closed-form solution by fully exploiting the structure of the constraints, so the algorithm is easy to implement. Finally, we conduct numerical experiments on two real datasets. The numerical results show that the proposed AGM clearly outperforms BDR and ABCGD.
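The abstract refers to an accelerated gradient method with a fixed stepsize policy. As a point of reference only, the sketch below shows the generic Nesterov-style accelerated gradient template (extrapolation followed by a gradient step with a constant stepsize); it is not the paper's AGM for the block diagonal regularized model, and the function accelerated_gradient, the toy least-squares usage, and the stepsize choice 1/L are illustrative assumptions.

# A minimal, generic sketch of a Nesterov-type accelerated gradient iteration.
# NOT the paper's AGM for block diagonal regularized sparse subspace clustering;
# it only illustrates the accelerated-gradient template such methods build on.
import numpy as np

def accelerated_gradient(grad, x0, stepsize, iters=200):
    """Minimize a smooth function given its gradient `grad`, starting from `x0`,
    with a constant `stepsize` (e.g. 1/L for an L-Lipschitz gradient)."""
    x_prev = x0.copy()
    x = x0.copy()
    t_prev, t = 1.0, 1.0
    for _ in range(iters):
        # extrapolation (momentum) point
        y = x + ((t_prev - 1.0) / t) * (x - x_prev)
        x_prev = x
        # gradient step taken at the extrapolated point
        x = y - stepsize * grad(y)
        # standard momentum-parameter update
        t_prev, t = t, 0.5 * (1.0 + np.sqrt(1.0 + 4.0 * t * t))
    return x

# Toy usage on a smooth problem: minimize 0.5*||A v - b||^2 (hypothetical data).
rng = np.random.default_rng(0)
A = rng.standard_normal((20, 5))
b = rng.standard_normal(20)
L = np.linalg.norm(A, 2) ** 2  # Lipschitz constant of the gradient
v = accelerated_gradient(lambda v: A.T @ (A @ v - b), np.zeros(5), 1.0 / L)

In the paper's setting, the gradient step would additionally be combined with the closed-form subproblem solutions mentioned in the abstract, which are not reproduced here.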
Source: PACIFIC JOURNAL OF OPTIMIZATION
ISSN: 1348-9151
Year: 2022
Issue: 2
Volume: 18
Page: 265-280
JCR@2022: 0.2 / 0.200
ESI Discipline: ENGINEERING
ESI HC Threshold: 49
JCR Journal Grade:4
CAS Journal Grade:4
ESI Highly Cited Papers on the List: 0