DeFog: Fog Computing Benchmarks

Jonathan McChesney, Nan Wang, Ashish Tanwer, Eyal de Lara, Blesson Varghese

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution



Fog computing envisions that deploying the services of an application across resources in the cloud and resources located at the edge of the network may improve the overall performance of the application compared to running it on the cloud alone. However, there are currently no benchmarks that can directly compare the performance of an application across cloud-only, edge-only and cloud-edge deployment platforms to obtain any insight into performance improvement. This paper proposes DeFog, a first Fog benchmarking suite, to: (i) alleviate the burden of Fog benchmarking by using a standard methodology, and (ii) facilitate understanding of the target platform by collecting a catalogue of relevant metrics for a set of benchmarks. The current portfolio of DeFog benchmarks comprises six relevant applications conducive to using the edge. Experimental studies are carried out on multiple target platforms to demonstrate the use of DeFog for collecting metrics related to application latencies (communication and computation), for understanding the impact of stress and concurrent users on application latencies, and for understanding the performance of deploying different combinations of services of an application across the cloud and edge. DeFog is available for public download.
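The abstract's central measurement idea is splitting an application's end-to-end latency into a communication part (moving data to the target resource) and a computation part (running the service there), so that cloud-only, edge-only and cloud-edge deployments can be compared on the same metrics. The sketch below illustrates that decomposition only; the function names, the toy `compute`/`transfer` stand-ins, and the returned metric names are illustrative assumptions, not DeFog's actual API.

```python
import time

def measure_latencies(compute, transfer, payload):
    """Split end-to-end latency into communication and computation legs.

    compute  -- stand-in for the service's computation on the target resource
    transfer -- stand-in for shipping the payload to/from that resource
    (Both are hypothetical placeholders, not part of DeFog.)
    """
    t0 = time.perf_counter()
    data = transfer(payload)      # communication leg
    t1 = time.perf_counter()
    compute(data)                 # computation leg
    t2 = time.perf_counter()
    return {
        "communication": t1 - t0,
        "computation": t2 - t1,
        "total": t2 - t0,
    }

# Toy run: an edge deployment would typically show a smaller communication
# component but possibly a larger computation component than the cloud.
latencies = measure_latencies(
    compute=lambda d: sum(d),
    transfer=lambda d: d,
    payload=list(range(1000)),
)
```

Recording the two legs separately, per deployment mode, is what allows the kind of cloud-versus-edge comparison the paper describes.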
Original language: English
Title of host publication: SEC 2019: The Fourth ACM/IEEE Symposium on Edge Computing
Publisher: Association for Computing Machinery
Number of pages: 12
Publication status: Published - 01 Nov 2019


