Abstract:
The combination of service virtualization and edge computing allows mobile users to enjoy low-latency services while keeping data storage and processing local. However, the network edge has limited resource availability, and when virtualized user applications and network functions must be supported concurrently, a natural conflict in resource usage arises. In this paper, we focus on computing and radio resources and develop a framework for resource orchestration at the edge that leverages a model-free reinforcement learning approach and a Pareto analysis, which we prove to yield fair and efficient decisions. Through our testbed, we demonstrate the effectiveness of our solution in resource-limited scenarios, showing an improvement of around 60% in the CPU budget violation rate with respect to a standard multi-agent reinforcement learning framework.