Optimal classifier for an ML-assisted resource allocation in wireless communications

Rashika Raina, David E. Simmons, Nidhi Simmons, Michel Daoud Yacoub

Research output: Contribution to journal › Article › peer-review


Abstract

This letter advances the outage probability (OP) performance of a machine learning (ML)-assisted single-user multi-resource system. We focus on OP optimality and the trade-off between outage improvement and the mean number of resources scanned until a suitable resource is captured. We first present expressions for the OP of this system, complemented by an outage loss function (OLF) for its minimization. We then derive: (i) the necessary and sufficient properties of an optimal model (OpM) and (ii) expressions for the average number of resources scanned by both the OpM and non-OpMs. Here, non-OpMs refer to models trained with the OLF and binary cross-entropy (BCE) loss functions. We establish that optimal performance requires a channel that exhibits no time decorrelation. For very high decorrelation values, we find that models trained using the OLF and BCE perform similarly. For intermediate (practical) decorrelation values, the OLF outperforms BCE, and both approach the OpM as decorrelation tends to zero. Our analysis further reveals that, to capture a suitable resource, models trained with the OLF scan a slightly higher number of resources than the OpM and those trained with BCE. This increase in the mean number of scanned resources is offset by a significant enhancement in the OP compared to BCE.
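The abstract contrasts training with the standard BCE loss against an outage-oriented loss, and evaluates models by the system's outage probability. The paper's OLF is not given here, so the sketch below only illustrates the two standard quantities involved: the BCE loss for a suitability classifier, and a hypothetical empirical outage probability (the fraction of trials in which no resource flagged by the model is actually suitable). The function names and the array layout are illustrative assumptions, not the paper's formulation.

```python
import numpy as np

def bce_loss(y_true, p_pred, eps=1e-12):
    """Standard binary cross-entropy for a resource-suitability classifier.

    y_true: 1 if the resource is suitable, 0 otherwise.
    p_pred: model's predicted probability of suitability.
    """
    p = np.clip(p_pred, eps, 1 - eps)  # guard against log(0)
    return -np.mean(y_true * np.log(p) + (1 - y_true) * np.log(1 - p))

def empirical_outage(suitable, flagged):
    """Hypothetical empirical outage probability (illustrative only).

    suitable, flagged: boolean arrays of shape (trials, resources).
    An outage occurs when no resource the model flags is truly suitable.
    """
    captured = np.any(suitable & flagged, axis=1)
    return 1.0 - captured.mean()

# Toy example: four labeled resources and the model's predictions.
y = np.array([1, 0, 1, 1])
p = np.array([0.9, 0.2, 0.8, 0.7])
print(f"BCE loss: {bce_loss(y, p):.4f}")
```

Under this framing, the paper's trade-off is between minimizing a loss such as `bce_loss` (or the OLF) and the resulting `empirical_outage`, weighed against how many resources must be scanned before one is captured.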
Original language: English
Journal: IEEE Networking Letters
Early online date: 04 Oct 2024
Publication status: Early online date - 04 Oct 2024
