Synthetic Aperture Radar (SAR) target recognition is an important research direction in SAR image interpretation. In recent years, most machine learning methods applied to SAR target recognition have been supervised, requiring a large number of labeled SAR images; however, labeling SAR images is expensive and time-consuming. We propose an end-to-end semi-supervised recognition method based on an attention mechanism and bias-variance decomposition, which focuses on screening unlabeled data and assigning pseudo-labels. Unlike other learning methods, the training set in each iteration is determined by a module that we propose here, called the dataset attention module (DAM). Through DAM, contributing unlabeled samples are more likely to be added to the training set, while non-contributing and hard-to-learn unlabeled samples receive less attention. During training, each unlabeled sample is fed into the network for prediction. The pseudo-label of an unlabeled sample is taken to be the most probable class over multiple predictions, which reduces the risk of relying on a single prediction. We compute the prediction bias and variance of all unlabeled samples and use the result as the criterion for screening unlabeled data in DAM. In this paper, we carry out semi-supervised learning experiments under different unlabeled rates on the Moving and Stationary Target Acquisition and Recognition (MSTAR) dataset. The recognition accuracy of our method surpasses that of several state-of-the-art semi-supervised learning algorithms.
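The pseudo-labeling and screening steps described above can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes the class histories of each unlabeled sample are collected across training iterations, takes the pseudo-label by majority vote (the "most probable classification in the multiple predictions"), and uses the variance of the softmax outputs across iterations as a simplified stand-in for the paper's bias-variance screening criterion. The function names and the threshold-based selection rule are hypothetical.

```python
import numpy as np

def assign_pseudo_labels(pred_histories):
    """Majority vote over multiple predictions.

    pred_histories: (n_samples, n_iterations) array of predicted class ids
    collected across training iterations. Voting over several predictions
    reduces the risk of trusting a single (possibly wrong) prediction.
    """
    return np.array([np.bincount(row).argmax() for row in pred_histories])

def prediction_variance(prob_histories):
    """Per-sample variance of softmax outputs across iterations.

    prob_histories: (n_samples, n_iterations, n_classes) array.
    Low variance suggests a stable, contributing sample; high variance
    suggests a hard-to-learn sample that should receive less attention.
    """
    return prob_histories.var(axis=1).mean(axis=1)

def dam_screen(prob_histories, var_threshold):
    """Simplified DAM-style screening (hypothetical rule): keep the indices
    of unlabeled samples whose prediction variance is below a threshold."""
    return np.where(prediction_variance(prob_histories) <= var_threshold)[0]
```

A stable sample (near-identical softmax outputs every iteration) passes the screen, while a sample whose predictions flip between classes is filtered out until later iterations.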
Bibliographical note
The MSTAR dataset was collected by the Sandia National Laboratory SAR sensor platform, which was jointly sponsored by the Defense Advanced Research Projects Agency and the Air Force Research Laboratory. The MSTAR data consists of X-band SAR images with 1-foot by 1-foot resolution. Ten classes of vehicle targets in the MSTAR dataset are chosen for our experiments and grouped into three categories: artillery, truck, and tank. The artillery classes are 2S1 and ZSU234; the truck classes are BMP2, BRDM2, BTR60, BTR70, D7, and ZIL131; the tank classes are T62 and T72. The SAR image and the corresponding optical image of each class are shown in Fig. 5.
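The ten-class-to-three-category grouping above can be captured in a small lookup structure; the dictionary and helper below are a hypothetical convenience, not part of the MSTAR distribution.

```python
# Grouping of the ten MSTAR vehicle classes into the three coarse
# categories described in the text (class names as given there).
MSTAR_CATEGORIES = {
    "artillery": ["2S1", "ZSU234"],
    "truck": ["BMP2", "BRDM2", "BTR60", "BTR70", "D7", "ZIL131"],
    "tank": ["T62", "T72"],
}

def category_of(target):
    """Return the coarse category for an MSTAR class name."""
    for category, members in MSTAR_CATEGORIES.items():
        if target in members:
            return category
    raise KeyError(f"unknown MSTAR class: {target}")
```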
Funding Information:
This work was supported in part by the National Natural Science Foundation of China under Grant 61771027, Grant 61071139, Grant 61471019, Grant 61501011, and Grant 61171122. The work of A. Hussain was supported in part by the U.K. Engineering and Physical Sciences Research Council (EPSRC) under Grant EP/M026981/1. The work of H. Zhou was supported in part by the U.K. EPSRC under Grant EP/N508664/1, Grant EP/R007187/1, and Grant EP/N011074/1, and in part by the Royal Society-Newton Advanced Fellowship under Grant NA160342.
© 2013 IEEE.
Keywords
- Attention mechanism
- Bias-variance decomposition
- SAR target recognition
- Semi-supervised learning
ASJC Scopus subject areas
- Computer Science (all)
- Materials Science (all)