
Best way to implement LRU cache

Tags:

java

c++

caching

I was looking at the problem of implementing an LRU cache: once the cache is full, the least recently used item is evicted and replaced by the new item.

I have two implementations in mind:

1) Create two maps, which look something like this:

std::map<timestamp, K> time_to_key
std::map<K, std::pair<timestamp, V>> LRUCache

To insert a new element, we store the current timestamp and the value in LRUCache and record the timestamp in time_to_key. When the cache is full, we evict the least recently used element by finding the smallest timestamp in time_to_key and removing the corresponding key from LRUCache. Inserting a new item is O(1); updating the timestamp is O(n) (because we have to scan time_to_key for the entry that maps to the key being updated).
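
For concreteness, here is a minimal Java sketch of idea 1, with a TreeMap playing the role of std::map<timestamp, k>. The class and member names are mine, and a plain long counter stands in for the timestamp; since the timestamp is stored next to the value, this sketch removes the old time_to_key entry directly rather than searching for it.

import java.util.HashMap;
import java.util.Map;
import java.util.TreeMap;

// Rough sketch of idea 1: two maps, one keyed by timestamp, one by key.
// All names are illustrative; a long counter stands in for a real timestamp.
class TwoMapLruCache<K, V> {
    private static final class Stamped<V> {
        long timestamp;
        V value;
        Stamped(long timestamp, V value) { this.timestamp = timestamp; this.value = value; }
    }

    private final int capacity;
    private long clock = 0;                                      // monotonically increasing "timestamp"
    private final TreeMap<Long, K> timeToKey = new TreeMap<>();  // sorted, oldest timestamp first
    private final Map<K, Stamped<V>> cache = new HashMap<>();    // key -> (timestamp, value)

    TwoMapLruCache(int capacity) { this.capacity = capacity; }

    V get(K key) {
        Stamped<V> entry = cache.get(key);
        if (entry == null) return null;
        touch(key, entry);                                       // mark as most recently used
        return entry.value;
    }

    void put(K key, V value) {
        Stamped<V> entry = cache.get(key);
        if (entry != null) {                                     // key already cached: update and touch
            entry.value = value;
            touch(key, entry);
            return;
        }
        if (cache.size() >= capacity && !timeToKey.isEmpty()) {  // full: evict the smallest timestamp
            Map.Entry<Long, K> oldest = timeToKey.pollFirstEntry();
            cache.remove(oldest.getValue());
        }
        long now = clock++;
        timeToKey.put(now, key);
        cache.put(key, new Stamped<>(now, value));
    }

    private void touch(K key, Stamped<V> entry) {
        timeToKey.remove(entry.timestamp);                       // old timestamp is stored, so no scan is needed here
        entry.timestamp = clock++;
        timeToKey.put(entry.timestamp, key);
    }
}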

2) Keep a linked list in which the least recently used entry sits at the head and new items are added at the tail. When an item arrives that is already present in the cache, the node corresponding to that item's key is moved to the tail of the list. Inserting a new item is O(1), updating the recency is again O(n) (because we first have to find the node before moving it to the tail), and deleting an element is O(1).
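
And here is a rough Java sketch of idea 2. One thing below is my own addition, not part of the description above: a HashMap from key to list node, so that the "move to the tail" step can locate the node without walking the list.

import java.util.HashMap;
import java.util.Map;

// Sketch of idea 2: a doubly-linked list ordered by recency of use.
// The 'index' map is an extra assumption that avoids a linear scan of the list.
class ListLruCache<K, V> {
    private static final class Node<K, V> {
        K key; V value;
        Node<K, V> prev, next;
        Node(K key, V value) { this.key = key; this.value = value; }
    }

    private final int capacity;
    private final Map<K, Node<K, V>> index = new HashMap<>();
    private Node<K, V> head;   // least recently used
    private Node<K, V> tail;   // most recently used

    ListLruCache(int capacity) { this.capacity = capacity; }

    V get(K key) {
        Node<K, V> n = index.get(key);
        if (n == null) return null;
        moveToTail(n);                                      // mark as most recently used
        return n.value;
    }

    void put(K key, V value) {
        Node<K, V> n = index.get(key);
        if (n != null) { n.value = value; moveToTail(n); return; }
        if (index.size() >= capacity && head != null) {     // full: evict from the head
            index.remove(head.key);
            unlink(head);
        }
        n = new Node<>(key, value);
        index.put(key, n);
        append(n);
    }

    private void moveToTail(Node<K, V> n) { unlink(n); append(n); }

    private void unlink(Node<K, V> n) {
        if (n.prev != null) n.prev.next = n.next; else head = n.next;
        if (n.next != null) n.next.prev = n.prev; else tail = n.prev;
        n.prev = n.next = null;
    }

    private void append(Node<K, V> n) {
        if (tail == null) { head = tail = n; return; }
        tail.next = n; n.prev = tail; tail = n;
    }
}

With that index in place the update step is no longer a linear scan; without it, finding the node in the list is the O(n) part mentioned above.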

Now I have the following questions:

  1. Which one of these implementations is better for an LRU cache?

  2. Is there any other way to implement an LRU cache?

  3. In Java, should I use a HashMap to implement the LRU cache?

  4. I have seen questions asking for a generic LRU cache and questions asking for an LRU cache. Is a generic LRU cache different from an LRU cache?

Thanks in advance!!!

EDIT:

Another (and the easiest) way to implement an LRU cache in Java is to use LinkedHashMap and override its boolean removeEldestEntry(Map.Entry eldest) method.
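
A minimal sketch of that approach, assuming a fixed maximum size (the class name MaxSizeLruCache and the initial capacity of 16 are my own choices):

import java.util.LinkedHashMap;
import java.util.Map;

// Extends LinkedHashMap in access-order mode and evicts once maxSize is exceeded.
class MaxSizeLruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxSize;

    MaxSizeLruCache(int maxSize) {
        super(16, 0.75f, true);   // 'true' switches the map to access order
        this.maxSize = maxSize;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > maxSize;  // returning true tells the map to drop the eldest entry
    }
}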

asked Jun 18 '11 by Amm Sokun



2 Answers

If you want an LRU cache, the simplest option in Java is LinkedHashMap. Its default behaviour is insertion order (FIFO); however, you can change it to "access order", which makes it an LRU cache.

public static <K,V> Map<K,V> lruCache(final int maxSize) {
    // maxSize*4/3 with load factor 0.75 leaves headroom so the map never needs to rehash;
    // the 'true' flag switches the map to access order, which is what makes it an LRU cache.
    return new LinkedHashMap<K, V>(maxSize*4/3, 0.75f, true) {
        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            // Evict the eldest (least recently used) entry once the size exceeds maxSize.
            return size() > maxSize;
        }
    };
}

Note: I am using the constructor which changes the ordering of the collection from insertion order to access order (most recently used last).

From the Javadoc

public LinkedHashMap(int initialCapacity,
                     float loadFactor,
                     boolean accessOrder)
Constructs an empty LinkedHashMap instance with the specified initial capacity, load factor and ordering mode.
Parameters:
initialCapacity - the initial capacity
loadFactor - the load factor
accessOrder - the ordering mode - true for access-order, false for insertion-order

When accessOrder is true, the LinkedHashMap re-arranges its order whenever you get() an entry that is not already the last one.

This way the eldest entry is the least recently used one.
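
For illustration, a small self-contained demo of that behaviour (the class name and the sample keys are mine):

import java.util.LinkedHashMap;
import java.util.Map;

public class LruCacheDemo {
    // Same helper as above, repeated here so the demo compiles on its own.
    public static <K, V> Map<K, V> lruCache(final int maxSize) {
        return new LinkedHashMap<K, V>(maxSize * 4 / 3, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
                return size() > maxSize;
            }
        };
    }

    public static void main(String[] args) {
        Map<String, Integer> cache = lruCache(3);
        cache.put("a", 1);
        cache.put("b", 2);
        cache.put("c", 3);
        cache.get("a");                        // touch "a": it becomes the most recently used entry
        cache.put("d", 4);                     // exceeds maxSize, so the eldest entry ("b") is evicted
        System.out.println(cache.keySet());    // prints [c, a, d] (least to most recently used)
    }
}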

answered Sep 28 '22 by Peter Lawrey


Normally, LRU caches are represented as a single queue of elements, ordered by how recently each element was used. If the queue provided by your standard library doesn't allow you to remove objects from the middle, for example to put them back on the top, then you may have to roll your own.

answered Sep 28 '22 by Puppy