"INTEGRATING SERVERLESS AND EDGE COMPUTING: A FRAMEWORK FOR IMPROVED QO" by Nazanin Sarrafzadehghadimi
Electronic Thesis and Dissertation Repository

Thesis Format

Monograph

Degree

Master of Science

Program

Computer Science

Supervisor

Lutfiyya, Hanan

Abstract

With the launch of Google App Engine in 2008, serverless computing emerged as a popular cloud computing paradigm, simplifying application deployment by delegating infrastructure management to cloud providers. These providers handle server and resource management through automated provisioning and scaling. In addition, the ephemeral nature of serverless functions allows resources to be de-provisioned when idle and enables a granular pay-per-use pricing model that charges users only for invocations, which facilitates cost savings. This research explores the intersection of serverless and edge computing, leveraging the lower latency, reduced resource consumption, and improved energy efficiency of edge environments to enhance the performance of serverless functions and maintain service continuity in a Multi-access Edge Computing (MEC) environment. We propose a framework that proactively spawns multiple instances of functions based on predicted user movements, increasing solution reliability. To further reduce function deployment and relocation times, we introduce server selection criteria, a caching mechanism, and a distributed image registry that improve the image pulling and layer sharing processes. Numerical results and experiments show that these strategies effectively reduce relocation times and frequency, lower energy consumption, and optimize network usage.
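The proactive-spawning idea in the abstract can be illustrated with a minimal sketch: given a mobility prediction assigning a probability to each neighbouring cell a user might enter, pre-deploy a function instance in every cell whose probability meets a threshold. All names, the threshold value, and the `deploy` callback are illustrative assumptions, not the thesis's actual framework or API.

```python
def select_target_cells(predictions, threshold=0.3):
    """Return the cells whose predicted transition probability meets the threshold."""
    return [cell for cell, prob in predictions.items() if prob >= threshold]

def proactively_spawn(function_name, predictions, deploy, threshold=0.3):
    """Pre-deploy an instance of `function_name` in each likely next cell.

    `deploy` is a placeholder callback standing in for the real deployment
    step (e.g. starting the function from a locally cached container image
    on the chosen edge server).
    """
    targets = select_target_cells(predictions, threshold)
    for cell in targets:
        deploy(function_name, cell)
    return targets

# Example: the user is predicted to most likely move north or east, so
# instances are spawned in those two cells before the handover occurs.
predictions = {"north": 0.5, "east": 0.35, "south": 0.1, "west": 0.05}
spawned = proactively_spawn("resize-image", predictions, deploy=lambda f, c: None)
print(spawned)  # ['north', 'east']
```

A threshold-based selection is just one plausible policy; spawning in the top-k most probable cells would follow the same structure.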

Summary for Lay Audience

In recent years, serverless computing has become a popular method for deploying applications in the cloud because it allows companies to run applications without worrying about managing the underlying infrastructure. Cloud providers take care of tasks like managing servers and scaling resources automatically. Additionally, serverless functions are temporary and only run when needed, allowing cloud providers to release resources when they are not in use. This means users are charged only for the time their code is actually running, which can significantly reduce costs. This research focuses on combining serverless computing with edge computing, which involves processing data closer to where it is generated, such as on smartphones or local servers, rather than relying solely on distant data centers. By doing this, we can achieve faster response times, use fewer resources, and improve energy efficiency. Our proposed framework predicts the directions in which users might move and creates multiple instances of serverless functions before a user leaves the coverage area of a cell. This approach helps ensure reliable service even when users change locations. Additionally, we introduce methods to optimize how serverless functions are deployed and relocated, such as choosing the best servers, caching data efficiently, and improving how container images are stored and shared among edge servers. Our experiments show that these strategies significantly reduce the time it takes to deploy and move functions, lower energy consumption, and make better use of network resources. This work helps improve the performance and reliability of applications running in environments where speed and efficiency are crucial.
