Three-dimensional complex permittivity prediction from spatio-temporal electric field distributions

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution

Abstract

Quantitative imaging is a critical task in various applications, including medical diagnostics, non-destructive evaluation and material characterisation. This work addresses the problem of three-dimensional (3D) permittivity profile reconstruction of a region of interest (RoI) by leveraging a deep learning model. The model is trained to establish a mapping between spatio-temporal electric field data and the corresponding permittivity profile of the RoI. It is designed to handle the time-varying electric field data efficiently, using a combination of convolutional layers to extract spatial features and Long Short-Term Memory (LSTM) networks to model temporal dependencies. The effectiveness of this approach is demonstrated through comprehensive validation in different scenarios, offering a powerful tool for quantitative permittivity prediction in a wide range of applications.
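The abstract describes the network only at a high level: per-snapshot convolutional feature extraction followed by an LSTM over the time dimension. The listing below is a minimal, hypothetical PyTorch sketch of that kind of CNN + LSTM mapping from a sequence of 3D electric field snapshots to a complex permittivity volume. The class name PermittivityPredictor, the number of field components, the RoI grid size and all layer widths are illustrative assumptions, not the authors' implementation.

# Minimal sketch of a CNN + LSTM permittivity predictor (assumed details).
import torch
import torch.nn as nn


class PermittivityPredictor(nn.Module):
    """Maps a time sequence of 3D E-field snapshots to a complex permittivity
    volume, returned as real and imaginary parts per voxel."""

    def __init__(self, field_channels=3, hidden_size=256, roi_shape=(16, 16, 16)):
        super().__init__()
        self.roi_shape = roi_shape
        # Spatial feature extractor applied to each time step independently.
        self.encoder = nn.Sequential(
            nn.Conv3d(field_channels, 16, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.Conv3d(16, 32, kernel_size=3, stride=2, padding=1),
            nn.ReLU(),
            nn.AdaptiveAvgPool3d(1),  # global pooling -> one feature vector per step
            nn.Flatten(),
        )
        # LSTM models the temporal evolution of the spatial features.
        self.lstm = nn.LSTM(input_size=32, hidden_size=hidden_size, batch_first=True)
        # Head predicts real and imaginary permittivity for every voxel in the RoI.
        n_voxels = roi_shape[0] * roi_shape[1] * roi_shape[2]
        self.head = nn.Linear(hidden_size, 2 * n_voxels)

    def forward(self, fields):
        # fields: (batch, time, field_channels, D, H, W)
        b, t = fields.shape[:2]
        feats = self.encoder(fields.flatten(0, 1))  # (b*t, 32)
        feats = feats.view(b, t, -1)                # (b, t, 32)
        _, (h_n, _) = self.lstm(feats)              # final hidden state
        eps = self.head(h_n[-1])                    # (b, 2 * n_voxels)
        return eps.view(b, 2, *self.roi_shape)      # (b, 2, D, H, W)


if __name__ == "__main__":
    model = PermittivityPredictor()
    e_field = torch.randn(4, 20, 3, 16, 16, 16)  # 4 samples, 20 time steps
    print(model(e_field).shape)  # torch.Size([4, 2, 16, 16, 16])

In this sketch each field snapshot is encoded independently before the LSTM, so the recurrent stage only has to reason about how the compressed spatial features evolve in time; the paper's actual training setup, loss and data generation are not described in the abstract.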

Original language: English
Title of host publication: Proceedings of the 19th European Conference on Antennas and Propagation, EuCAP 2025
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 5
Publication status: Accepted - 31 Dec 2024
Event: 19th European Conference on Antennas and Propagation 2025 - Stockholm, Sweden
Duration: 30 Mar 2025 - 04 Apr 2025
https://eucap.org/

Publication series

Name: EuCAP Proceedings
ISSN (Print): 2164-3342

Conference

Conference: 19th European Conference on Antennas and Propagation 2025
Abbreviated title: EuCAP 2025
Country/Territory: Sweden
City: Stockholm
Period: 30/03/2025 - 04/04/2025
Internet address: https://eucap.org/

Keywords

  • quantitative imaging
  • permittivity
  • deep learning
  • inverse problem
