
Optimizing storage on fog computing edge servers: A recent algorithm design with minimal interference.

Authors :
Zhao, Xumin
Xie, Guojie
Luo, Yi
Chen, Jingyuan
Liu, Fenghua
Bai, HongPeng
Source :
PLoS ONE. 7/10/2024, Vol. 19 Issue 7, p1-21. 21p.
Publication Year :
2024

Abstract

The burgeoning field of fog computing introduces a transformative computing paradigm with extensive applications across diverse sectors. At the heart of this paradigm lies the pivotal role of edge servers, which are entrusted with critical computing and storage functions. The optimization of these servers' storage capacities emerges as a crucial factor in augmenting the efficacy of fog computing infrastructures. This paper presents a novel storage optimization algorithm, dubbed LIRU (Low Interference Recently Used), which synthesizes the strengths of the LIRS (Low Interference Recency Set) and LRU (Least Recently Used) replacement algorithms. Set against the backdrop of constrained storage resources, this research endeavours to formulate an algorithm that optimizes storage space utilization, elevates data access efficiency, and diminishes access latencies. The investigation begins with a comprehensive analysis of the storage resources available on edge servers, pinpointing the essential considerations for optimization algorithms: storage resource utilization and data access frequency. The study then constructs an optimization model that harmonizes data frequency with cache capacity, employing optimization theory to discern the optimal solution for storage maximization. Subsequent experimental validation of the LIRU algorithm underscores its superiority over conventional replacement algorithms, showing significant improvements in storage utilization, data access efficiency, and access delays. Notably, the LIRU algorithm registers a 5% increment in one-hop hit ratio relative to the LFU algorithm, a 66% enhancement over the LRU algorithm, and a 14% elevation in system hit ratio against the LRU algorithm. Moreover, it curtails the average system response time by 2.4% and 16.5% compared to the LRU and LFU algorithms, respectively, particularly in scenarios involving large cache sizes. This research not only sheds light on the intricacies of edge server storage optimization but also significantly propels the performance and efficiency of the broader fog computing ecosystem. Through these insights, the study contributes a valuable framework for enhancing data management strategies within fog computing architectures, marking a noteworthy advancement in the field. [ABSTRACT FROM AUTHOR]
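The record does not reproduce the paper's actual LIRU procedure, so the Python sketch below is only an illustration of the general idea the abstract describes: layering an LIRS-style low inter-reference recency test on top of plain LRU ordering. The class name LIRULikeCache, the protected_ratio parameter, and the promotion rule based on reuse distance are all hypothetical choices made for this sketch, not details taken from the paper.

# Hypothetical sketch only: blends LRU recency with an LIRS-like
# "low inter-reference recency" promotion test; not the authors' LIRU algorithm.
from collections import OrderedDict

class LIRULikeCache:
    """Keeps a protected segment for keys whose recent reuse distance is small
    (LIRS-inspired) and a probationary LRU segment for everything else."""

    def __init__(self, capacity, protected_ratio=0.7):
        self.capacity = capacity
        self.protected_cap = max(1, int(capacity * protected_ratio))
        self.protected = OrderedDict()   # keys with proven short reuse distance
        self.probation = OrderedDict()   # recently seen but unproven keys
        self.clock = 0
        self.last_access = {}            # key -> time of last access

    def _evict_if_needed(self):
        # Evict from probation first; fall back to the oldest protected key.
        while len(self.protected) + len(self.probation) > self.capacity:
            if self.probation:
                victim, _ = self.probation.popitem(last=False)
            else:
                victim, _ = self.protected.popitem(last=False)
            self.last_access.pop(victim, None)

    def get(self, key):
        self.clock += 1
        if key in self.protected:
            self.protected.move_to_end(key)          # refresh LRU order
            self.last_access[key] = self.clock
            return self.protected[key]
        if key in self.probation:
            value = self.probation.pop(key)
            # Short reuse distance -> promote (low-interference behaviour).
            if self.clock - self.last_access.get(key, 0) <= self.protected_cap:
                self.protected[key] = value
                if len(self.protected) > self.protected_cap:
                    demoted, dval = self.protected.popitem(last=False)
                    self.probation[demoted] = dval
            else:
                self.probation[key] = value          # remains probationary
            self.last_access[key] = self.clock
            return value
        return None                                  # cache miss

    def put(self, key, value):
        self.clock += 1
        if key in self.protected:
            self.protected[key] = value
            self.protected.move_to_end(key)
        else:
            self.probation[key] = value
            self.probation.move_to_end(key)
        self.last_access[key] = self.clock
        self._evict_if_needed()

# Hypothetical usage on an edge server's content cache:
# cache = LIRULikeCache(capacity=100)
# cache.put("block-17", payload)
# data = cache.get("block-17")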

Details

Language :
English
ISSN :
1932-6203
Volume :
19
Issue :
7
Database :
Academic Search Index
Journal :
PLoS ONE
Publication Type :
Academic Journal
Accession number :
178360177
Full Text :
https://doi.org/10.1371/journal.pone.0304009