Abstract

Current computing techniques that use the Cloud as a centralised server will become untenable as billions of devices are connected to the Internet, degrading the Quality-of-Service (QoS) of Cloud-hosted applications. Recently, Edge computing has been proposed as a potential solution, leveraging computing on nodes at the edge of the network, such as routers, base stations and switches, alongside the Cloud. To improve the QoS of Cloud applications using Edge computing, the challenge of managing compute resources at both the Cloud and the Edge needs to be addressed.
This thesis demonstrates how the QoS of Cloud applications can be improved by utilising three enabling techniques for Edge computing, namely Edge computing as a service, flexible application deployment, and post-deployment resource management in a Cloud-Edge system.
Firstly, this thesis presents a Cloud-Edge system comprising three tiers, namely the Cloud, the Edge, and the End Device. The interactions between these tiers are implemented through an Edge service workflow, which covers the creation of Edge applications, the establishment of Edge services, and the termination of Edge services. This technique makes it possible to offload computation from the Cloud to the Edge, and leads to the question of how much of an application should be offloaded in a Cloud-Edge system.
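The three-stage workflow above can be sketched as a simple service lifecycle. This is an illustrative sketch only; the class and state names are assumptions, not the thesis implementation.

```python
# Hypothetical sketch of the Edge service workflow: an Edge application
# is created, established as a service, and finally terminated.
from enum import Enum, auto

class ServiceState(Enum):
    CREATED = auto()
    ESTABLISHED = auto()
    TERMINATED = auto()

class EdgeService:
    """Tracks one Edge application through the service workflow."""
    def __init__(self, app_name):
        self.app_name = app_name
        self.state = ServiceState.CREATED   # stage 1: creation

    def establish(self):
        # Stage 2: the Edge node provisions resources and starts serving.
        assert self.state is ServiceState.CREATED
        self.state = ServiceState.ESTABLISHED

    def terminate(self):
        # Stage 3: the service is stopped and Edge resources are released.
        assert self.state is ServiceState.ESTABLISHED
        self.state = ServiceState.TERMINATED

service = EdgeService("face-detection")
service.establish()
service.terminate()
```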
Secondly, this thesis demonstrates a decision-making technique that dynamically decides on the appropriate amount of computation to be offloaded in the Cloud-Edge system. This is realised through a dynamic distribution mechanism that takes the context of the Edge environment (e.g. the resource availability of the Edge node) into consideration before selecting a number of modules of a modular Cloud application to offload from the Cloud to the Edge. This technique explores the trade-offs between the QoS of an application and the running cost of utilising Edge services, which is useful for managing individual applications in a Cloud-Edge system.
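A context-aware offloading decision of this kind can be sketched as follows. The module sizes and the greedy smallest-first policy are assumptions for illustration, not the mechanism proposed in the thesis.

```python
# Illustrative sketch: given the modules of a modular Cloud application
# and the free memory reported by an Edge node (the environment context),
# decide which modules to offload from the Cloud to the Edge.

def select_modules_to_offload(module_sizes_mb, edge_free_mb):
    """Greedily offload modules (smallest first) while the Edge node
    has spare capacity; returns the indices of modules to offload."""
    chosen = []
    remaining = edge_free_mb
    # Consider the cheapest modules first so more of them fit at the Edge.
    for idx in sorted(range(len(module_sizes_mb)),
                      key=lambda i: module_sizes_mb[i]):
        if module_sizes_mb[idx] <= remaining:
            chosen.append(idx)
            remaining -= module_sizes_mb[idx]
    return sorted(chosen)

# With 512 MB free at the Edge, only the two smallest modules fit;
# the 400 MB module stays in the Cloud.
offloaded = select_modules_to_offload([300, 150, 400], 512)  # [0, 1]
```

Re-running the decision as `edge_free_mb` changes over time is what makes the distribution dynamic rather than a one-off deployment choice.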
Finally, to extend the resource management solutions in this thesis to manage multiple applications in a Cloud-Edge system, this thesis presents a priority-based dynamic vertical scaling mechanism to share Edge resources among multiple Edge applications. To enable dynamic vertical scaling, one static and three dynamic priority management approaches, which are workload-aware, community-aware and system-aware, respectively, are proposed.
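The core of priority-based vertical scaling can be sketched as dividing an Edge node's capacity among co-located applications in proportion to their priorities. Static priorities are shown here; the thesis's workload-aware, community-aware and system-aware approaches would recompute these weights at runtime. The function and application names are illustrative assumptions.

```python
# Minimal sketch: share a fixed pool of CPU shares on an Edge node among
# multiple Edge applications in proportion to their priorities.

def scale_vertically(priorities, total_cpu_shares):
    """Return per-application CPU shares proportional to priority."""
    total_priority = sum(priorities.values())
    return {app: total_cpu_shares * p / total_priority
            for app, p in priorities.items()}

# A higher-priority game receives twice the CPU of the detection app.
shares = scale_vertically({"game": 2, "face-detect": 1},
                          total_cpu_shares=1024)
```

Under a dynamic approach, a change in any application's priority triggers a recomputation, so the shares of all running applications are adjusted together rather than per-application in isolation.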
The benefits of the three enabling techniques proposed in this thesis are that: (i) they are lightweight and can be executed in a short time; and (ii) they effectively improve the QoS of the chosen use cases – a location-based mobile game and a real-time face detection application.
|Date of Award||Jul 2020|
|Sponsors||Queen's University Belfast|
|Supervisor||Blesson Varghese (Supervisor), Dimitrios Nikolopoulos (Supervisor) & Michalis Matthaiou (Supervisor)|
- Edge computing
- resource management
- distributed systems