Optimize caching performance with a hybrid data caching method
DOI: 10.31673/2412-9070.2025.027500
Abstract
Data caching plays a key role in improving the performance of information systems and the speed of access to frequently requested resources. Choosing appropriate caching types and methods is crucial for optimal system performance and reliability.
Data caching stores frequently used resources in RAM, on disk, or in hybrid systems that combine both approaches. RAM provides high-speed data access, while disk caching accommodates larger volumes of data. Hybrid systems combine the advantages of both methods, achieving a balance between speed and storage capacity.
Particular attention is paid to cache replacement algorithms, which determine how data in the cache is managed. Popular algorithms such as Least Recently Used (LRU), Least Frequently Used (LFU), First In, First Out (FIFO), Adaptive Replacement Cache (ARC), and Most Recently Used (MRU) are considered and analyzed in the context of optimizing the performance of caching systems.
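As a point of reference for the algorithms listed above, LRU can be sketched in a few lines of Python; the class name and capacity parameter below are illustrative, not taken from the paper's implementation:

```python
from collections import OrderedDict

class LRUCache:
    """Minimal LRU cache: evicts the least recently used entry on overflow."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.store = OrderedDict()  # insertion order doubles as recency order

    def get(self, key):
        if key not in self.store:
            return None  # cache miss
        self.store.move_to_end(key)  # mark as most recently used
        return self.store[key]

    def put(self, key, value):
        if key in self.store:
            self.store.move_to_end(key)
        self.store[key] = value
        if len(self.store) > self.capacity:
            self.store.popitem(last=False)  # evict least recently used
```

LFU, FIFO, and MRU differ only in which entry the eviction step selects (least frequently used, oldest inserted, and most recently used, respectively).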
The advantages and disadvantages of these algorithms in different scenarios are investigated, with attention to optimizing cache size, reducing data-access latency, and increasing the efficiency of resource use. Mathematical models and methods for analyzing caching performance are proposed to evaluate the effectiveness of the various algorithms and to tune caching systems for maximum performance.
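One standard model of this kind (the textbook expected-latency formula, not necessarily the specific model proposed in the paper) relates the hit ratio to the average access time:

```python
def average_access_time(hit_ratio, hit_time, miss_penalty):
    """Expected access latency for a cache: every access pays hit_time,
    and the fraction of misses additionally pays miss_penalty
    (e.g. a disk read or network fetch). All times in the same unit."""
    return hit_time + (1.0 - hit_ratio) * miss_penalty
```

For example, with a 90% hit ratio, a 0.1 ms RAM hit, and a 10 ms disk miss penalty, the expected access time is 0.1 + 0.1 × 10 = 1.1 ms, which shows why even small hit-ratio gains from a better replacement algorithm matter.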
A hybrid caching method is proposed and implemented that combines the LRU and MRU algorithms, switching between them dynamically based on the variance of data-access frequency. The approach computes statistical characteristics of data accesses, allowing the system to adaptively select the most appropriate replacement algorithm in real time. Applying this method increased caching performance and efficiency, reduced the number of cache misses, and improved overall system throughput.
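The abstract does not state the exact switching rule, so the following is only an illustrative sketch of variance-based policy selection, assuming per-item access counters and a fixed threshold; the function name, threshold, and decision direction are hypothetical:

```python
import statistics

def choose_policy(access_counts, variance_threshold):
    """Pick a replacement policy from the variance of per-item access counts.

    Hypothetical decision rule: high variance indicates a skewed workload
    with a hot set that keeps being reused (favor LRU), while low variance
    suggests a uniform or scanning pattern in which recently touched items
    are unlikely to return soon (favor MRU).
    """
    if len(access_counts) < 2:
        return "LRU"  # too little data to estimate variance; default policy
    if statistics.pvariance(access_counts) > variance_threshold:
        return "LRU"
    return "MRU"
```

In a real system this check would run periodically over a sliding window of access statistics, with hysteresis to avoid oscillating between policies.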
Keywords: data caching, performance, information systems, caching algorithms, hybrid systems, distributed caching, local caching, optimization, performance analysis.