Abstract:
Arbitrary style transfer means that stylized images can be generated from arbitrary input pairs of content and style images. Recent arbitrary style transfer algorithms often distort the content or transfer the style incompletely, because the network must balance content structure against style. In this paper, we introduce a dual attention network based on style attention and channel attention, which flexibly transfers local styles while paying more attention to the content structure, keeping it intact and reducing unnecessary style transfer. Experimental results show that the network can synthesize high-quality stylized images while maintaining real-time performance. © 2021 IEEE.
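The abstract only names the two attention paths; the sketch below (not the authors' released code) illustrates one plausible reading: a SANet-style spatial "style attention" over feature positions plus a squeeze-and-excitation style "channel attention", fused by addition. Module names, the 512-channel width, and the fusion scheme are assumptions for illustration.

```python
import torch
import torch.nn as nn


class StyleAttention(nn.Module):
    """Spatial attention: each content position attends over all style positions."""
    def __init__(self, channels: int = 512):
        super().__init__()
        self.f = nn.Conv2d(channels, channels, 1)  # query from content features
        self.g = nn.Conv2d(channels, channels, 1)  # key from style features
        self.h = nn.Conv2d(channels, channels, 1)  # value from style features
        self.out = nn.Conv2d(channels, channels, 1)

    def forward(self, content: torch.Tensor, style: torch.Tensor) -> torch.Tensor:
        b, c, hc, wc = content.shape
        q = self.f(content).flatten(2).transpose(1, 2)   # B x Nc x C
        k = self.g(style).flatten(2)                     # B x C x Ns
        v = self.h(style).flatten(2).transpose(1, 2)     # B x Ns x C
        attn = torch.softmax(torch.bmm(q, k), dim=-1)    # B x Nc x Ns
        o = torch.bmm(attn, v).transpose(1, 2).view(b, c, hc, wc)
        return content + self.out(o)                     # residual helps keep content structure


class ChannelAttention(nn.Module):
    """Channel reweighting driven by global style statistics (SE-style gating)."""
    def __init__(self, channels: int = 512, reduction: int = 16):
        super().__init__()
        self.fc = nn.Sequential(
            nn.Linear(channels, channels // reduction), nn.ReLU(inplace=True),
            nn.Linear(channels // reduction, channels), nn.Sigmoid(),
        )

    def forward(self, content: torch.Tensor, style: torch.Tensor) -> torch.Tensor:
        w = self.fc(style.mean(dim=(2, 3)))              # B x C gate from style features
        return content * w.unsqueeze(-1).unsqueeze(-1)


class DualAttentionFusion(nn.Module):
    """Combine both attention paths; a decoder (not shown) would map the result to an image."""
    def __init__(self, channels: int = 512):
        super().__init__()
        self.style_attn = StyleAttention(channels)
        self.chan_attn = ChannelAttention(channels)

    def forward(self, content: torch.Tensor, style: torch.Tensor) -> torch.Tensor:
        return self.style_attn(content, style) + self.chan_attn(content, style)


if __name__ == "__main__":
    # Feature maps as they might come from a VGG relu4_1 encoder (assumed).
    fc, fs = torch.randn(1, 512, 32, 32), torch.randn(1, 512, 32, 32)
    print(DualAttentionFusion()(fc, fs).shape)  # torch.Size([1, 512, 32, 32])
```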
ISSN: 2693-2814
Year: 2021
Page: 1221-1226
Language: English
SCOPUS Cited Count: 3
ESI Highly Cited Papers on the List: 0