Abstract:
Activation functions play an important role in deep learning, and the choice of activation function has a significant effect on the training and performance of a model. In this study, a new variant of the Exponential Linear Unit (ELU) activation, called the Transformed Exponential Linear Unit (TELU), is proposed. An empirical evaluation is conducted to determine the effectiveness of the new activation function using state-of-the-art deep learning architectures. In the experiments, the TELU activation function tends to outperform conventional activation functions on deep models across a number of benchmark datasets. TELU achieves superior classification accuracy on the CIFAR-10, SVHN, and Caltech-101 datasets with state-of-the-art deep learning models. Additionally, it achieves superior AUROC, MCC, and F1-score on the STL-10 dataset. This demonstrates that TELU can be successfully applied in deep learning for image classification.
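For reference, the baseline ELU that TELU builds on is the standard f(x) = x for x > 0 and f(x) = α(e^x − 1) otherwise. This record does not give TELU's exact transformation, so the sketch below implements only the conventional ELU baseline in NumPy; the default α = 1.0 follows the original ELU formulation and is an assumption here, not a parameter taken from this paper.

```python
import numpy as np

def elu(x, alpha=1.0):
    """Conventional ELU (Clevert et al., 2015): identity for positive inputs,
    alpha * (exp(x) - 1) for negative inputs, saturating toward -alpha."""
    x = np.asarray(x, dtype=float)
    return np.where(x > 0, x, alpha * np.expm1(x))

# Example: smooth near zero, bounded below by -alpha for large negative inputs.
print(elu([-3.0, -1.0, 0.0, 1.0, 3.0]))  # ~[-0.950, -0.632, 0.0, 1.0, 3.0]
```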
Source: PROCEEDINGS OF 2019 INTERNATIONAL CONFERENCE ON IMAGE, VIDEO AND SIGNAL PROCESSING (IVSP 2019)
Year: 2019
Page: 55-62
Language: English
Cited Count:
WoS CC Cited Count: 1
SCOPUS Cited Count: 3
ESI Highly Cited Papers on the List: 0