A Novel Attention Fully Convolutional Network Method for Synthetic Aperture Radar Image Segmentation

Zhenyu Yue, Fei Gao*, Qingxu Xiong, Jun Wang, Amir Hussain, Huiyu Zhou

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review



As an important step of synthetic aperture radar image interpretation, synthetic aperture radar image segmentation aims at partitioning an image into regions in terms of homogeneity. Because of the scarcity of labeled samples and the presence of speckle noise, synthetic aperture radar image segmentation is a challenging task. We present a new method for synthetic aperture radar image segmentation in this article. Because the original synthetic aperture radar image is large, we first divide the input image into small slices. The image slices are then fed to an attention-based fully convolutional network to obtain the segmentation results. Finally, a fully connected conditional random field is adopted to improve the segmentation performance of the network. The innovations of our method are as follows: 1) The attention-based fully convolutional network is embedded with a multiscale attention network, which enhances the extraction of image features through three strategies, namely, multiscale feature extraction, channel attention extraction, and spatial attention extraction. 2) We design a new loss function for the attention fully convolutional network by combining the Lovász-Softmax and cross-entropy losses. The new loss allows us to simultaneously optimize the intersection over union and the pixel classification accuracy of the segmentation results. Experiments are performed on two airborne synthetic aperture radar image databases. The results demonstrate that our method is superior to other state-of-the-art image segmentation approaches.
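The abstract describes a loss that combines the Lovász-Softmax loss (which directly optimizes the intersection over union) with the standard cross-entropy loss (which optimizes pixel classification accuracy). As an illustration only, not the authors' implementation, the following minimal NumPy sketch shows one way such a combination can be formed; the mixing weight `alpha` is a hypothetical parameter introduced here for the example:

```python
import numpy as np

def lovasz_grad(gt_sorted):
    """Gradient of the Lovasz extension of the Jaccard loss
    with respect to the sorted per-pixel errors."""
    gts = gt_sorted.sum()
    intersection = gts - np.cumsum(gt_sorted)
    union = gts + np.cumsum(1.0 - gt_sorted)
    jaccard = 1.0 - intersection / union
    jaccard[1:] = jaccard[1:] - jaccard[:-1]  # discrete differences
    return jaccard

def lovasz_softmax(probs, labels):
    """probs: (N, C) softmax probabilities; labels: (N,) integer labels.
    Returns the Lovasz-Softmax loss averaged over classes."""
    n, c = probs.shape
    losses = []
    for cls in range(c):
        fg = (labels == cls).astype(np.float64)   # binary ground truth for this class
        errors = np.abs(fg - probs[:, cls])       # per-pixel prediction error
        order = np.argsort(-errors)               # sort errors in descending order
        losses.append(np.dot(errors[order], lovasz_grad(fg[order])))
    return float(np.mean(losses))

def cross_entropy(probs, labels, eps=1e-12):
    """Mean pixel-wise cross-entropy."""
    return float(-np.mean(np.log(probs[np.arange(len(labels)), labels] + eps)))

def combined_loss(probs, labels, alpha=0.5):
    """Weighted sum of the two terms; alpha is a hypothetical mixing weight."""
    return alpha * lovasz_softmax(probs, labels) + (1.0 - alpha) * cross_entropy(probs, labels)
```

In practice the two terms would be balanced on a validation set; the sketch simply makes explicit that the combined objective penalizes both region-overlap error (via the Lovász extension of the Jaccard index) and per-pixel misclassification (via cross-entropy).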

Original language: English
Pages (from-to): 4585-4598
Number of pages: 14
Journal: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing
Publication status: Published - 12 Aug 2020
Externally published: Yes

Bibliographical note

Funding Information:
Manuscript received May 9, 2020; revised June 21, 2020 and July 25, 2020; accepted August 9, 2020. Date of publication August 12, 2020; date of current version August 24, 2020. This work was supported in part by the National Natural Science Foundation of China under Grant 61771027, Grant 61071139, Grant 61471019, Grant 61501011, and Grant 61171122. The work of Amir Hussain was supported in part by the U.K. Engineering and Physical Sciences Research Council (EPSRC) under Grant EP/M026981/1. The work of Huiyu Zhou was supported in part by the Royal Society-Newton Advanced Fellowship under Grant NA160342. (Corresponding author: Fei Gao.) Zhenyu Yue, Fei Gao, Qingxu Xiong, and Jun Wang are with the School of Electronic and Information Engineering, Beihang University, Beijing 100191, China (e-mail: yuezhenyu@buaa.edu.cn; 08060@buaa.edu.cn; qxxiong@buaa.edu.cn; wangj203@buaa.edu.cn).

Publisher Copyright:
© 2008-2012 IEEE.



Keywords

  • Attention mechanism
  • conditional random field (CRF)
  • fully convolutional network (FCN)
  • image segmentation
  • synthetic aperture radar (SAR)

ASJC Scopus subject areas

  • Computers in Earth Sciences
  • Atmospheric Science


