On the Evaluation of the Total-Cost-of-Ownership Trade-offs in Edge vs Cloud deployments: A Wireless-Denial-of-Service Case Study

Panagiota Nikolaou, Yiannakis Sazeides, Alejandro Lampropulos, Denis Guilhot, Andrea Bartoli, George Papadimitriou, Athanasios Chatzidimitriou, Dimitris Gizopoulos, Konstantinos Tovletoglou, Lev Mukhanov, Georgios Karakonstantis

Research output: Contribution to journal › Article › peer-review



We are witnessing explosive growth in the number of Internet-connected devices and the emergence of several new classes of Internet of Things (IoT) applications that require rapid processing of an abundance of data. To meet the resulting need for more network bandwidth and lower network latency, a new paradigm has emerged that promotes offering Cloud services at the Edge, closer to users. However, the Edge is a highly constrained environment with a limited power budget for servers per Edge installation, which, in turn, limits the number of Internet-connected devices, such as sensors, that an installation can service. Consequently, the limited number of sensors reduces the area coverage they provide and calls into question the effectiveness of deploying IoT applications at the Edge. In this paper, we investigate the benefits of running an emerging security-focused IoT application (jamming detection) at the Edge vs. the Cloud by developing a Total Cost of Ownership (TCO) model that considers the application's requirements as well as the Edge's constraints. For the first time, we build such a model based on realistic performance and energy-efficiency measurements obtained from commodity 64-bit ARM-based micro-servers, which are excellent candidates for supporting Cloud services at the Edge. Such servers represent the type of devices that can provide the right balance between power and performance without requiring the complicated cooling and power-supply infrastructure that is unavailable at decentralized deployments. Aiming to improve energy efficiency, we exploit the pessimistic design margins conventionally adopted in such devices and investigate their operation under lower-than-nominal supply voltage and memory refresh rate. Our results show that the jamming-detection application deployed in an Edge environment is superior to a Cloud-based solution by up to 2.13 times in terms of TCO. Moreover, when servers operate below nominal conditions, we can achieve up to 9% power savings, which in several situations enables 100% gains in the TCO/area-coverage metric, i.e., double the area can be served with the same TCO.
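The last claim rests on a threshold effect: when the per-installation power budget caps how many micro-servers fit, even a modest per-server power saving can admit an extra server and sharply increase the sensors (and hence area) served at a similar TCO. The following is a minimal illustrative sketch of that effect, not the authors' model; the `sensors_supported` helper and all numbers (budget, server power, sensors per server) are hypothetical assumptions chosen only to show how a small saving can cross a capacity threshold.

```python
def sensors_supported(power_budget_w, server_power_w, sensors_per_server):
    """Number of sensors an Edge installation can service, assuming servers
    must fit under a fixed power budget and each server handles a fixed
    number of sensors. Purely illustrative; not the paper's TCO model."""
    servers = power_budget_w // server_power_w  # whole servers that fit the budget
    return servers * sensors_per_server

# Hypothetical installation: 20 W budget, 8 sensors per micro-server.
nominal = sensors_supported(20, 11, 8)  # 11 W/server at nominal voltage -> 1 server, 8 sensors
reduced = sensors_supported(20, 10, 8)  # ~9% lower power -> 2 servers, 16 sensors
```

Here the ~9% power reduction lets a second server fit under the same budget, doubling the sensors served; when TCO stays roughly flat, the TCO/area-coverage metric improves by 100%, mirroring the abstract's claim.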
Original language: English
Number of pages: 13
Journal: IEEE Transactions on Sustainable Computing
Early online date: 18 Jan 2019
Publication status: Early online date - 18 Jan 2019


