Abstract
The use of natural phenomena as inspiration for addressing real-life problems has become an increasingly popular research approach. In the medical domain, generative adversarial networks (GANs) have shown promising results. However, GANs often struggle with complex medical images, particularly in histopathology, where capturing the boundaries of nuclei and contours is challenging. To address these challenges, we propose SaltGAN, a three-phase model inspired by the natural properties of salt: preservation, disinfection, and flavoring. The three phases comprise an adversarial neural network with a generator and a discriminator, a convolutional neural network (CNN) for feature extraction, and layer-wise skip connections positioned between the CNN and the generator. We also propose an improved loss function and a ranking system based on a look-back-look-up (LBLU) and multi-metric approach (MMA) for preserving checkpoints and evaluating performance. SaltGAN infuses discriminant features into the generator and applies a novel loss function with provision for monitoring checkpoints during training. We evaluated SaltGAN on the publicly available BreakHis dataset using feature-based, reference-based, and non-reference-based metrics, including the Frechet Inception Distance (FID), the natural image quality evaluator (NIQE), and the feature similarity indexing method (FSIM). Our results demonstrate that SaltGAN outperforms other state-of-the-art models such as EOSA-GAN, confirming the applicability of the natural properties of salt to addressing GAN challenges in histopathology images. Our study demonstrates the potential of natural inspiration in the design of computational solutions for real-life problems. Source code for SaltGAN can be accessed from https://github.com/NathanielOy/SaltGAN.
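The abstract cites the Frechet Inception Distance (FID) as one of the evaluation metrics. As a point of reference for how such a feature-based metric is computed, here is a minimal sketch of FID from two sets of feature vectors (e.g. Inception embeddings of real and generated histopathology patches). The function name and the use of an eigenvalue-based matrix square root are illustrative choices, not taken from the SaltGAN paper.

```python
import numpy as np

def frechet_inception_distance(feats_real, feats_fake):
    """FID between two sets of feature vectors (rows = samples).

    FID = ||mu1 - mu2||^2 + Tr(C1 + C2 - 2 * (C1 C2)^(1/2))
    where mu, C are the mean and covariance of each feature set.
    """
    mu1, mu2 = feats_real.mean(axis=0), feats_fake.mean(axis=0)
    c1 = np.cov(feats_real, rowvar=False)
    c2 = np.cov(feats_fake, rowvar=False)
    # Tr((C1 C2)^(1/2)) equals the sum of square roots of the
    # eigenvalues of C1 @ C2; clip tiny negatives / imaginary
    # parts caused by floating-point noise.
    eigvals = np.linalg.eigvals(c1 @ c2)
    tr_sqrt = np.sum(np.sqrt(np.maximum(eigvals.real, 0.0)))
    diff = mu1 - mu2
    return float(diff @ diff + np.trace(c1 + c2) - 2.0 * tr_sqrt)
```

Identical feature distributions give an FID near zero, and the score grows as the generated distribution drifts from the real one, which is why lower FID indicates a better generator.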
| Original language | English |
|---|---|
| Article number | 106467 |
| Number of pages | 17 |
| Journal | Biomedical Signal Processing and Control |
| Volume | 95 |
| Issue number | Part B |
| Early online date | 22 May 2024 |
| DOIs | |
| Publication status | Published - Sept 2024 |