IEEE Access (Jan 2023)

Efficient Large-Capacity Caching in Cloud Storage Using Skip-Gram-Based File Correlation Analysis

  • Fang Xiao,
  • Siyuan Yu,
  • Yuze Li

DOI
https://doi.org/10.1109/ACCESS.2023.3322725
Journal volume & issue
Vol. 11
pp. 111265 – 111273

Abstract


Designing a large-capacity cache is an essential means of improving the accessibility of cloud storage. Compared with traditional data access, cloud storage access exhibits new patterns, and traditional caching strategies handle the prefetching and replacement of non-hot data poorly. Numerous studies have shown that file correlation can optimize cloud storage's caching and prefetching strategies. However, characterizing the correlation between files across multiple dimensions is complex, and the difficulty of optimizing cloud storage caching with file correlation increases accordingly. To address these shortcomings, this study designs a skip-gram-based file similarity strategy derived from an analysis of user access patterns. The strategy judges the correlation between files to prefetch related files and dynamically insert them into the large-capacity cache, and to select files for replacement. With this prefetching strategy, we significantly improve the cache hit rate in the simulation benchmark. In addition, the strategy can build an index table after each training run, so very little time is consumed during online operation. The time required to build the index during training is $O(N \log V)$, and each index lookup takes $O(1)$.
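The pipeline described in the abstract can be illustrated with a short sketch: treat each user's file-access history as a sequence of file identifiers, train a skip-gram model on those sequences, and precompute a per-file prefetch list so that online lookups are single table accesses. This is a minimal illustration of the general idea, not the authors' implementation; the access-log format, the use of gensim's Word2Vec, and the top_k and cache parameters are all assumptions introduced here.

from collections import deque
from gensim.models import Word2Vec

# Hypothetical access log: one "sentence" per user session, each token a file ID.
access_sequences = [
    ["f12", "f7", "f7", "f33", "f12"],
    ["f33", "f12", "f5"],
    ["f7", "f33", "f9", "f5"],
]

# Train a skip-gram model (sg=1) over the access sequences.
# With hierarchical softmax, training cost is roughly O(N log V):
# N = number of access events, V = number of distinct files.
model = Word2Vec(
    sentences=access_sequences,
    vector_size=64,   # embedding dimension (assumed)
    window=3,         # co-access context width (assumed)
    sg=1,             # skip-gram
    hs=1,             # hierarchical softmax -> log(V) per update
    negative=0,
    min_count=1,
)

# After each training run, build an index table mapping every file
# to its top-k most correlated files; an online lookup is then O(1).
top_k = 3
prefetch_index = {
    f: [g for g, _ in model.wv.most_similar(f, topn=top_k)]
    for f in model.wv.index_to_key
}

def on_access(file_id, cache, capacity=8):
    """On a cache access, prefetch the files most correlated with file_id."""
    for related in prefetch_index.get(file_id, []):
        if related not in cache:
            if len(cache) >= capacity:
                cache.popleft()        # simple FIFO eviction for the sketch
            cache.append(related)

cache = deque()
on_access("f12", cache)
print(cache)  # files correlated with f12, inserted into the cache

In practice the index table would be rebuilt periodically as new access logs accumulate, keeping the $O(N \log V)$ training cost offline while online prefetch decisions remain constant-time dictionary lookups.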

Keywords