 

LRU with Caffeine

Tags: java, lru, caffeine

I'm trying to use Caffeine as an LRU cache, so that entries added first are evicted first. I ran this code:

final Cache&lt;Object, Object&gt; map = Caffeine.newBuilder()
        .maximumSize(10)
        .initialCapacity(10)
        .build();

for (long i = 0; i < 20; i++) {
    map.put(i, i);
}

map.cleanUp();
System.out.println(map.getAllPresent(map.asMap().keySet()));

Which prints:

{0=0, 1=1, 2=2, 3=3, 4=4, 5=5, 6=6, 7=7, 8=8, 19=19}

But I expected

{10=10, 11=11, 12=12, 13=13, 14=14, 15=15, 16=16, 17=17, 18=18, 19=19}

What am I doing wrong?

bgme_one asked Jan 09 '16 14:01




1 Answer

Caffeine does not implement LRU as its cache eviction policy. Instead, Caffeine uses a policy called TinyLFU. The Caffeine documentation includes a page on Efficiency, which discusses the rationale for this design choice. Quoting that page:

TinyLfu relies on a frequency sketch to probabilistically estimate the historic usage of an entry.

Since Caffeine does not in fact implement LRU, I don't think you can reliably expect it to exhibit strict LRU behavior when you inspect the entries in the cache.

If you absolutely must have LRU behavior, then the JDK standard LinkedHashMap is a good, straightforward choice. You would need to subclass it and override removeEldestEntry with logic to signal when the cache has grown larger than you want. If multi-threaded usage is necessary, then you'll need to wrap operations with appropriate synchronization.
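For example, a minimal single-threaded sketch along those lines (the class name `LruCache` and the capacity are illustrative, not part of the JDK):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Strict LRU cache built on LinkedHashMap's access-order mode.
// Passing accessOrder = true to the constructor makes iteration order
// run from least- to most-recently accessed, and removeEldestEntry
// evicts the least-recently-used entry once the size limit is exceeded.
class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int maxEntries;

    LruCache(int maxEntries) {
        super(16, 0.75f, true); // accessOrder = true enables LRU ordering
        this.maxEntries = maxEntries;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        // Returning true tells LinkedHashMap to remove the eldest entry
        return size() > maxEntries;
    }
}
```

Running the question's loop against this class (put keys 0..19 into a capacity-10 cache, with no intervening reads) leaves exactly keys 10..19 present, which is the strict-LRU result the question expected. For concurrent use, one simple option is wrapping it with Collections.synchronizedMap, at the cost of a coarse-grained lock.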

Caffeine was heavily inspired by the Guava Cache, which similarly provides concurrent access and has approximate LRU behavior. A quick test of your code against the Guava cache shows similar results. I am not aware of any standard library that would provide predictable, externally observable LRU results and true concurrent access without a coarse-grained lock.

You might reconsider whether it's really a requirement to have strict, externally observable LRU results. By its nature, a cache is fast temporary storage to provide optimized lookups. I would not expect program behavior to change drastically dependent on whether the cache implements strict LRU, an approximation of LRU, LFU, or some other eviction policy.

This prior question also has a great discussion of LRU cache options in Java.

How would you implement an LRU cache in Java?

Chris Nauroth answered Sep 25 '22 23:09