Abstract:
Many dimensionality reduction methods ultimately reduce to solving a generalized eigenvector problem. Optimization techniques are a promising way to handle parameter selection in these methods. The key step in such optimization is to differentiate the objective function with respect to the parameter, which in turn requires the gradient and Hessian of the resulting eigenvectors and eigenvalues. In this paper, we propose a novel method for computing the gradients of the eigenvalues and apply them to tune the parameter in kernel principal component analysis. Experimental results on UCI data sets show that the new method outperforms the original algorithm, especially in time complexity. © 2011 IEEE.
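
The central computation described in the abstract is the gradient of an eigenvalue with respect to a kernel parameter. The abstract does not spell out the formula, so the following is a minimal sketch in Python, assuming an RBF kernel with bandwidth sigma as the tunable parameter and using the standard first-order perturbation identity d(lambda_i)/d(theta) = v_i^T (dK/dtheta) v_i for a symmetric matrix with unit-norm eigenvectors and simple eigenvalues; it illustrates the general technique, not necessarily the authors' exact method, and the helper names rbf_kernel_and_grad and eigenvalue_gradients are hypothetical. Centering of the kernel matrix, which kernel PCA normally requires, is omitted for brevity.

import numpy as np

def rbf_kernel_and_grad(X, sigma):
    # RBF kernel K_ij = exp(-||x_i - x_j||^2 / (2 sigma^2)) and its
    # elementwise derivative dK/dsigma = K * ||x_i - x_j||^2 / sigma^3.
    sq = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-sq / (2.0 * sigma ** 2))
    dK = K * sq / sigma ** 3
    return K, dK

def eigenvalue_gradients(K, dK):
    # First-order perturbation theory for a symmetric matrix: with
    # unit-norm eigenvectors v_i and simple eigenvalues lambda_i,
    # d(lambda_i)/d(theta) = v_i^T (dK/dtheta) v_i.
    w, V = np.linalg.eigh(K)              # ascending eigenvalues
    dw = np.sum(V * (dK @ V), axis=0)     # diag(V^T dK V)
    return w, dw

# Usage: gradient information for tuning sigma, e.g. by following
# d(lambda_max)/d(sigma) in a gradient-based parameter search.
X = np.random.default_rng(0).normal(size=(50, 3))
K, dK = rbf_kernel_and_grad(X, sigma=1.0)
w, dw = eigenvalue_gradients(K, dK)
print("lambda_max:", w[-1], " d(lambda_max)/d(sigma):", dw[-1])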
Year: 2011
Language: English