The softmax loss is arguably the most widely used loss function in CNNs. In recent years, several softmax variants have been proposed that enhance the discriminative ability of the learned features by adding margin constraints, significantly improving the state of the art in face recognition. However, the ‘margin’ referenced in these losses does not correspond to the real margin between classes in the training set. Furthermore, they impose a margin on every possible pair of classes, which is unnecessary. In this paper, we propose the Precise Adjacent Margin loss (PAM loss), which gives an accurate definition of ‘margin’ and applies precise operations appropriate to each case. PAM loss has a clearer geometric interpretation than existing margin-based losses. Extensive experiments on the LFW, YTF, MegaFace and FaceScrub datasets show that the proposed method achieves state-of-the-art performance.
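As background for the margin-based softmax variants the abstract refers to, the following is a minimal NumPy sketch of a generic additive-margin softmax loss (in the style of AM-Softmax/CosFace), not the proposed PAM loss; all names and the margin/scale values are illustrative assumptions.

```python
import numpy as np

def am_softmax_loss(features, weights, labels, s=30.0, m=0.35):
    """Additive-margin softmax loss sketch (AM-Softmax / CosFace style).

    features: (N, d) embeddings; weights: (d, C) class weight vectors;
    labels: (N,) integer class ids. s is the scale, m the cosine margin.
    This illustrates generic margin-based softmax variants, NOT PAM loss.
    """
    # L2-normalize embeddings and class weights so logits become cosines.
    f = features / np.linalg.norm(features, axis=1, keepdims=True)
    w = weights / np.linalg.norm(weights, axis=0, keepdims=True)
    cos = f @ w                                   # (N, C) cosine similarities
    # Subtract the margin m from the target-class cosine only, then scale.
    rows = np.arange(len(labels))
    target = cos[rows, labels] - m
    logits = s * cos
    logits[rows, labels] = s * target
    # Numerically stable cross-entropy over the scaled logits.
    logits -= logits.max(axis=1, keepdims=True)
    log_probs = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -log_probs[rows, labels].mean()
```

Because the margin is subtracted from the target-class cosine uniformly, this family of losses penalizes every class pair alike, regardless of whether those classes are actually adjacent in feature space, which is the behavior the paper argues is unnecessary.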
2019 IEEE International Conference on Image Processing (ICIP); conference dates: 22–25 September 2019