ROOM: adversarial machine learning attacks under real-time constraints

Amira Guesmi, Khaled N. Khasawneh, Nael Abu-Ghazaleh, Ihsen Alouani

Research output: Chapter in Book/Report/Conference proceeding - Conference contribution



Advances in deep learning have enabled a wide range of promising applications. However, these systems are vulnerable to adversarial attacks: adversarially crafted perturbations to their inputs can cause them to misclassify. Most state-of-the-art adversarial attack generation algorithms focus primarily on controlling the noise magnitude to make it undetectable; execution time is a secondary consideration, and the underlying assumption is that there are no time constraints. However, just-in-time adversarial attacks, where an attacker opportunistically generates adversarial examples on the fly, represent an even more critical threat, especially against real-time applications. This paper therefore introduces a new problem: how can adversarial noise be systematically generated under real-time constraints? Understanding this problem improves our understanding of the threat these attacks pose to real-time systems and provides security evaluation benchmarks for future defenses. We first conduct a run-time analysis of adversarial generation algorithms. Our analysis shows that universal attacks produce a general attack offline, with no online overhead, but their success rate is limited by their generality. In contrast, online algorithms, which target a specific input, are computationally expensive, making them inappropriate under time constraints. Thus, we propose ROOM, a novel Real-time Online-Offline attack construction Model in which an offline component warms up the online algorithm, making it possible to generate highly successful attacks under time constraints. Our results show that ROOM achieves high attack success rates under real-time constraints with up to 90x faster adversarial attack generation than state-of-the-art methods. For example, ROOM achieves a 100% adversarial attack success rate on MNIST with a throughput of up to 1250 frames per second (FPS), more than 60% success rate at 200 FPS on CIFAR-10, and 60% at 16 FPS.
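The offline/online split described in the abstract can be illustrated with a minimal sketch: an iterative gradient attack (PGD-style) is warm-started from a perturbation computed ahead of time, so that only a few refinement steps remain for the online phase. This is an illustration of the warm-start idea only, not the paper's actual algorithm; the toy linear model, the surrogate-input "universal" perturbation, and all hyperparameters here are assumptions for demonstration.

```python
# Illustrative sketch of warm-starting an online attack from an offline
# perturbation, in the spirit of ROOM's offline/online decomposition.
# The model, loss, and constants are toys, not the paper's setup.
import numpy as np

rng = np.random.default_rng(0)

# Toy linear "classifier": logits = W @ x over 3 classes, 8 features.
W = rng.normal(size=(3, 8))

def loss_and_grad(x, label):
    """Cross-entropy loss of the toy model and its gradient w.r.t. the input."""
    logits = W @ x
    z = np.exp(logits - logits.max())          # stable softmax
    p = z / z.sum()
    loss = -np.log(p[label] + 1e-12)
    grad = W.T @ (p - np.eye(3)[label])        # d(loss)/dx
    return loss, grad

def pgd(x, label, eps, steps, alpha, init_delta=None):
    """L-inf PGD ascent on the loss, optionally warm-started by init_delta."""
    delta = np.zeros_like(x) if init_delta is None else np.clip(init_delta, -eps, eps)
    for _ in range(steps):
        _, g = loss_and_grad(x + delta, label)
        delta = np.clip(delta + alpha * np.sign(g), -eps, eps)
    return delta

x = rng.normal(size=8)
label = int(np.argmax(W @ x))                  # model's clean prediction

# Offline phase (done ahead of time): a generic perturbation computed on a
# surrogate input; stands in for a universal perturbation.
x_sur = rng.normal(size=8)
_, g0 = loss_and_grad(x_sur, int(np.argmax(W @ x_sur)))
universal = 0.3 * np.sign(g0)

# Online phase (time-constrained): only a couple of refinement steps,
# starting either from the offline perturbation or from scratch.
d_warm = pgd(x, label, eps=0.3, steps=2, alpha=0.1, init_delta=universal)
d_cold = pgd(x, label, eps=0.3, steps=2, alpha=0.1)
```

The design point this mirrors is that the expensive part of an input-specific attack is amortized offline, leaving only a fixed, small number of online iterations within the real-time budget.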

Original language: English
Title of host publication: Proceedings of the International Joint Conference on Neural Networks, IJCNN 2022
Publisher: Institute of Electrical and Electronics Engineers Inc.
Number of pages: 10
ISBN (Electronic): 9781728186719
ISBN (Print): 9781665495264
Publication status: Published - 30 Sep 2022
Externally published: Yes
Event: International Joint Conference on Neural Networks - Padua, Italy
Duration: 18 Jul 2022 - 23 Jul 2022

Publication series

Name: International Joint Conference on Neural Networks: Proceedings
ISSN (Print): 2161-4393
ISSN (Electronic): 2161-4407


Conference: International Joint Conference on Neural Networks
Abbreviated title: IJCNN


