RGB-2-hyper-spectral image reconstruction for food science using encoder/decoder neural architectures

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution


Hyper-spectral imaging captures both spatial and spectral information about a subject, and is used for identifying substances within a scene and for food analysis. We present an investigation into the capabilities of encoder/decoder deep learning architectures for hyper-spectral image reconstruction from RGB images. For this analysis, state-of-the-art (SOTA) techniques for hyper-spectral image reconstruction, as well as architectures from other fields, were used. Our approach examines a food science case study, using a CPU-based server and different accelerators. An in-house multi-sensor setup was used to capture the dataset, which contains hyper-spectral images of twenty slices of different Spanish ham in the range of 400-1000 nm and their analogous RGB images. The results show no degradation in the output when moving outside the visual range. This study shows that the SOTA methods for reconstructing from RGB do not produce the most accurate reconstruction of the spectral domain within the range of 400-1000 nm.
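The core task described above maps a 3-channel RGB image to a many-band hyper-spectral cube via an encoder/decoder. The following is a minimal, hedged sketch of that shape transformation only; it is not the authors' model. The bottleneck width, the per-pixel MLP formulation, the untrained random weights, and the assumption of 61 bands (400-1000 nm sampled every 10 nm) are all illustrative choices, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

H, W = 8, 8    # toy image size (assumption)
LATENT = 16    # hypothetical bottleneck width
BANDS = 61     # 400-1000 nm at 10 nm steps (an assumption, not from the paper)

# Hypothetical per-pixel encoder/decoder weights (untrained, random).
W_enc = rng.normal(size=(3, LATENT))
W_dec = rng.normal(size=(LATENT, BANDS))

def reconstruct(rgb):
    """Map an (H, W, 3) RGB image to an (H, W, BANDS) hyper-spectral cube."""
    latent = np.maximum(rgb @ W_enc, 0.0)  # encoder: 3 -> LATENT, ReLU
    return latent @ W_dec                  # decoder: LATENT -> BANDS

rgb = rng.random((H, W, 3))
cube = reconstruct(rgb)
print(cube.shape)  # (8, 8, 61)
```

A trained model of this kind would fit `W_enc`/`W_dec` (and typically convolutional layers with spatial context) against paired RGB/hyper-spectral data such as the ham dataset described in the abstract.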

Original language: English
Title of host publication: Proceedings of the 2023 IEEE Symposium on Computers and Communications (ISCC)
Publisher: Institute of Electrical and Electronics Engineers Inc.
ISBN (Electronic): 9798350300482
ISBN (Print): 9798350300499
Publication status: Published - 28 Aug 2023
Event: 28th IEEE Symposium on Computers and Communications 2023 - Tunis, Tunisia
Duration: 09 Jul 2023 - 12 Jul 2023

Publication series

Name: IEEE Symposium on Computers and Communications: Proceedings
ISSN (Print): 1530-1346
ISSN (Electronic): 2642-7389


Conference: 28th IEEE Symposium on Computers and Communications 2023
Abbreviated title: ISCC


