dc.description.abstract |
Edge computing is emerging as an enabler for meeting critical requirements such as low latency and fast response times in applications like AR/VR, autonomous vehicles, patient monitoring, and gaming. In the IoT domain, these requirements are met by integrating edge computing with the host network to provide computation at the edge of the network. Researchers are now focusing on optimizing task offloading in terms of cost and the speed of task completion. Mathematical models have been proposed in the literature to estimate the optimal offloading points based on parameters such as data size, link bandwidth, the processing speeds of the end device and the edge server, and network delay. In this paper, we propose a simulation-based mechanism to obtain optimal values for the offloading points. Using parametric analysis, we show that the offloading points calculated with the proposed mathematical models deviate considerably from the actual values. |
en_US |