On the Evaluation of the Total-Cost-of-Ownership Trade-offs in Edge vs Cloud deployments: A Wireless-Denial-of-Service Case Study

    Research output: Contribution to journal › Article



    We are witnessing explosive growth in the number of Internet-connected devices and the emergence of several new
    classes of Internet of Things (IoT) applications that require rapid processing of an abundance of data. To meet the resulting demand
    for more network bandwidth and lower network latency, a new paradigm has emerged that promotes offering Cloud services at
    the Edge, closer to users. However, the Edge is a highly constrained environment with a limited power budget for servers per Edge
    installation, which, in turn, limits the number of Internet-connected devices, such as sensors, that an installation can service.
    Consequently, the limited number of sensors reduces the area coverage they provide and calls into question the
    effectiveness of deploying IoT applications at the Edge. In this paper, we investigate the benefits of running an emerging security-focused
    IoT application (jamming detection) at the Edge vs. the Cloud by developing a Total Cost of Ownership (TCO) model that
    considers the application's requirements as well as the Edge's constraints. For the first time, we build such a model based on realistic
    performance and energy-efficiency measurements obtained from commodity 64-bit ARM-based micro-servers, which are excellent
    candidates for supporting Cloud services at the Edge. Such servers provide the right balance
    between power and performance without requiring the complex cooling and power-supply infrastructure that is unavailable
    at decentralized deployments. To improve energy efficiency, we exploit the pessimistic design margins conventionally
    adopted in such devices and investigate their operation below nominal supply voltage and memory refresh rate. Our
    results show that the jamming detection application deployed in an Edge environment is superior to a Cloud-based solution by up to
    2.13 times in terms of TCO. Moreover, when servers operate below nominal conditions, we achieve up to 9% power savings,
    which in several situations enables 100% gains in the TCO/area-coverage metric, i.e., double the area can be served for the same TCO.
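
    To make the TCO/area-coverage result concrete, the following is a minimal sketch, in Python, of the threshold effect behind it:
    because each Edge installation has a fixed power budget, even a modest per-server power saving can let one more micro-server
    (and the sensors it services) fit within the budget, in the best case doubling the area served for roughly the same TCO. All
    names and numbers below are illustrative assumptions, not parameters or measurements from the paper.

        # Minimal sketch of the TCO/area-coverage threshold effect.
        # Every constant here is a hypothetical placeholder, not a value
        # taken from the paper's model or measurements.

        def servers_per_installation(power_budget_w: float, server_power_w: float) -> int:
            # A fixed per-installation power budget caps the server count;
            # fewer watts per server means more servers fit.
            return int(power_budget_w // server_power_w)

        def area_coverage_m2(n_servers: int, sensors_per_server: int,
                             area_per_sensor_m2: float) -> float:
            # Coverage scales with the number of sensors the servers can service.
            return n_servers * sensors_per_server * area_per_sensor_m2

        # Hypothetical installation: the nominal server power sits just above
        # half of the power budget, so a ~9% saving doubles the server count.
        POWER_BUDGET_W = 21.0
        NOMINAL_SERVER_W = 11.0
        UNDERVOLTED_SERVER_W = NOMINAL_SERVER_W * 0.91   # ~9% power savings
        SENSORS_PER_SERVER = 8
        AREA_PER_SENSOR_M2 = 100.0
        TCO = 10_000.0   # assume TCO is dominated by fixed installation costs

        for label, server_w in [("nominal", NOMINAL_SERVER_W),
                                ("undervolted", UNDERVOLTED_SERVER_W)]:
            n = servers_per_installation(POWER_BUDGET_W, server_w)
            coverage = area_coverage_m2(n, SENSORS_PER_SERVER, AREA_PER_SENSOR_M2)
            print(f"{label}: {n} server(s), {coverage:.0f} m^2 covered, "
                  f"TCO/area = {TCO / coverage:.2f}")

    With these placeholder numbers, undervolting fits two servers instead of one into the same power budget, so the covered area
    doubles and TCO/area-coverage halves, mirroring the 100% gain reported for several of the paper's configurations.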



    Original language: English
    Number of pages: 13
    Journal: IEEE Transactions on Sustainable Computing
    Journal publication date: 18 Jan 2019
    Early online date: 18 Jan 2019
    Publication status: Early online - 18 Jan 2019
