Abstract:
Mathematical derivatives can be approximated or calculated by techniques including symbolic differentiation, divided differences, and automatic differentiation. Automatic differentiation (AD) computes fast and accurate derivatives of a function, such as the Jacobian, the Hessian matrix, and higher-order derivative tensors. One of its most important applications is to improve optimization algorithms by computing the relevant derivative information efficiently. In this paper, AD algorithms for computing the Hessian and tensor terms are given, and their computational complexity is investigated. Furthermore, they are applied to Chebyshev's method, which requires the evaluation of tensor terms. The experimental results show that AD can be used efficiently in optimization methods.
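A minimal sketch of the idea the abstract describes, assuming JAX as the AD tool (the paper's own implementation is not named here) and a hypothetical quartic test objective f: nested differentiation yields the gradient g, the Hessian H, and the third-derivative tensor T, which are combined in the standard Chebyshev update x+ = x + s - (1/2) H^{-1} T[s, s], where s = -H^{-1} g is the Newton step.

import jax
import jax.numpy as jnp

def f(x):
    # Hypothetical test objective (not from the paper): a strictly convex quartic.
    return jnp.sum(x**4) + jnp.sum((x - 1.0)**2)

grad_f = jax.grad(f)                             # gradient, reverse mode
hess_f = jax.jacfwd(jax.grad(f))                 # Hessian, forward over reverse
tensor_f = jax.jacfwd(jax.jacfwd(jax.grad(f)))   # third-derivative tensor, shape (n, n, n)

def chebyshev_step(x):
    """One Chebyshev iteration for minimizing f: solves grad f(x) = 0
    using Hessian and tensor terms obtained by automatic differentiation."""
    g = grad_f(x)
    H = hess_f(x)
    T = tensor_f(x)
    s = -jnp.linalg.solve(H, g)                  # Newton step
    Tss = jnp.einsum('ijk,j,k->i', T, s, s)      # tensor term T[s, s]
    return x + s - 0.5 * jnp.linalg.solve(H, Tss)

x = jnp.array([2.0, -1.5])
for _ in range(6):
    x = chebyshev_step(x)
print(x, grad_f(x))   # gradient should approach zero at a stationary point

Forming the full (n, n, n) tensor is only feasible for small n and is kept here purely for clarity; in practice the contraction T[s, s] would be evaluated directly as a directional second derivative of the gradient, which is the kind of cost the paper's complexity analysis concerns.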
Source:
ICNC 2008: FOURTH INTERNATIONAL CONFERENCE ON NATURAL COMPUTATION, VOL 1, PROCEEDINGS
Year: 2008
Page: 304-
Language: English
Cited Count:
WoS CC Cited Count: 1
SCOPUS Cited Count: 2
ESI Highly Cited Papers on the List: 0