
Author:

Salem, M.H. | Li, Y. | Liu, Z.

Indexed by:

EI; Scopus

Abstract:

The state of the art in computer vision has improved markedly in recent years, driven by rapid advances in deep learning techniques, the availability of large labeled datasets such as ImageNet, and progress in GPU acceleration. With the fast advancement of pre-trained models, transfer learning and fine-tuning have become popular strategies for saving training time and mitigating data shortages in image processing. For ship recognition, however, obtaining a large labeled dataset comparable to ImageNet remains a major challenge. In this paper, we classify the Marvel vessel dataset with the training samples reduced by half, using transfer learning and fine-tuning strategies based on pre-trained EfficientNet (B0-B5) backbone networks to reduce training time and complexity. We also trained two well-known deep convolutional neural network architectures, ResNet-152 and InceptionV3, and compared the classification accuracy of the different architectures on the selected samples of the Marvel dataset. Using the EfficientNet-B5 architecture, we achieved a significant improvement over the previous state-of-the-art results on the Maritime Vessel (Marvel) dataset, with a highest accuracy of 91.60%. © 2022 IEEE.

Keyword:

Image classification; Marvel dataset; EfficientNet; deep learning; transfer learning

Author Community:

  • [ 1 ] [Salem M.H.] Beijing University of Technology, Faculty of Information Technology, Beijing, China
  • [ 2 ] [Li Y.] Guilin University of Electronic Technology, School of Artificial Intelligence, Guilin, China
  • [ 3 ] [Liu Z.] Beijing University of Technology, Faculty of Information Technology, Beijing, China



Year: 2022

Page: 514-520

Language: English

Cited Count:

WoS CC Cited Count: 0

SCOPUS Cited Count: 2

ESI Highly Cited Papers on the List: 0

