Edge Resource Optimization refers to the process of efficiently managing and allocating computational resources at the edge of a network, closer to data sources, such as IoT devices or edge servers. By processing data locally rather than sending it to centralized cloud servers, edge resource optimization reduces latency, bandwidth usage, and energy consumption. It involves balancing workloads, minimizing resource wastage, and ensuring high performance in real-time applications like autonomous vehicles, smart cities, and industrial automation. This optimization improves system efficiency, reduces costs, and enhances overall user experience.

Posts

EdgeSync: Efficient Edge-Assisted Video Analytics via Network Contention-Aware Scheduling

With the advancement of 5G, edge-assisted video analytics has become increasingly popular, driven by the technology’s ability to support low-latency, high-bandwidth applications. However, in scenarios where multiple clients compete for network resources, network contention poses a significant challenge. In this paper, we propose a novel scheduling algorithm that intelligently batches and aligns the offloading of multiple video analytics clients to optimize both network and edge server resource utilization while meeting Service Level Objectives (SLOs). Experiments on a cellular network testbed show that our approach successfully processes 93% or more of the inference requests offloaded by 7 different clients to the edge server while meeting their SLOs, whereas other approaches achieve lower success rates, ranging from 65% to 85% under the same conditions.
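The paper's scheduler is only summarized above, but the core idea of aligning uploads so that clients' transfers do not overlap on a shared uplink, while checking each request against its SLO, can be sketched in a few lines. The sketch below is a hypothetical illustration, not the authors' implementation: the names (Request, schedule_uploads), the earliest-deadline-first ordering, and the single-shared-uplink model with fixed upload and inference times are all assumptions made for this example.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class Request:
    """One client's pending offload (illustrative model, not from the paper)."""
    deadline: float                            # absolute SLO deadline (seconds)
    client_id: int = field(compare=False)
    upload_time: float = field(compare=False)  # estimated frame upload time (seconds)

def schedule_uploads(requests, infer_time, now=0.0):
    """Serialize uploads earliest-deadline-first so at most one client
    transmits at a time (avoiding uplink contention), and drop any
    request whose SLO can no longer be met.

    Returns (start_times, dropped): a map from client_id to the time its
    upload may begin, and the ids of requests abandoned as hopeless.
    """
    heap = list(requests)
    heapq.heapify(heap)                        # ordered by deadline (order=True)
    t = now
    start_times, dropped = {}, []
    while heap:
        req = heapq.heappop(heap)
        if t + req.upload_time + infer_time > req.deadline:
            dropped.append(req.client_id)      # SLO miss is unavoidable: skip early
            continue
        start_times[req.client_id] = t         # upload starts once the link is free
        t += req.upload_time                   # link stays busy until the transfer ends
    return start_times, dropped

if __name__ == "__main__":
    # 7 clients with staggered deadlines, as a toy stand-in for the testbed setup.
    reqs = [Request(deadline=0.50 + 0.01 * c, client_id=c, upload_time=0.06)
            for c in range(7)]
    starts, dropped = schedule_uploads(reqs, infer_time=0.05)
    print(starts, dropped)
```

Serializing transfers sidesteps the throughput collapse that concurrent uploads cause on a contended cellular uplink; a scheduler like the paper's would additionally batch aligned requests for edge-server efficiency and adapt the upload-time estimates to measured bandwidth, both of which this sketch omits.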