 

Caffeine: How to come up with an appropriate cache size

I have a computationally intensive, one-off offline processing task that takes a few hours to run, and I am using Caffeine as my in-memory cache. What is a good heuristic for setting the maximum cache size? I am running my Java program with 8GB of RAM and am willing to give the cache about 4GB of it, but I am unsure how that memory budget translates to the actual number of cache entries. I decided to go with .softValues() to let the JVM decide, but I ran into the following warning in the JavaDoc of Caffeine:

Warning: in most circumstances it is better to set a per-cache maximum size instead of using soft references. You should only use this method if you are well familiar with the practical consequences of soft references.

Asked Sep 15 '16 04:09 by pathikrit



1 Answer

Soft references are conceptually attractive, but they typically hurt performance in long-running JVMs. They create heap pressure by filling up the old generation and are only collected during a full GC. This can result in GC thrashing: each time enough memory is freed, it is quickly consumed and another full GC is required. For latency-sensitive applications the impact is worse because eviction is global, with no way to hint which caches are the most critical.

Soft references shouldn't be a default, go-to strategy. They might be a reasonable simplification in a throughput-oriented, non-user-facing task, but when GC time, latency, and predictable performance matter, they can be dangerous.
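As a sketch of the alternative being recommended here: bound the cache explicitly rather than using softValues(). The entry count, the 4GB figure from the question, and the weigher shown are all placeholders; in particular, using the payload's array length as its weight is only an illustrative assumption about entry cost.

```java
import com.github.benmanes.caffeine.cache.Cache;
import com.github.benmanes.caffeine.cache.Caffeine;

public class BoundedCaches {
    // Bound by entry count when entries are roughly uniform in size.
    static Cache<String, byte[]> byCount = Caffeine.newBuilder()
            .maximumSize(100_000)   // initial guess; measure and adjust
            .recordStats()          // expose hit/miss counters for tuning
            .build();

    // Bound by total weight when entry sizes vary. The weigher below
    // assumes the byte array's length approximates its memory cost.
    static Cache<String, byte[]> byWeight = Caffeine.newBuilder()
            .maximumWeight(4L * 1024 * 1024 * 1024)   // ~4 GB budget
            .weigher((String key, byte[] value) -> value.length)
            .recordStats()
            .build();
}
```

With recordStats() enabled, cache.stats().hitRate() gives the feedback loop needed for the guess-measure-repeat approach described below.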

Unfortunately the best answer for sizing is to guess, measure, and repeat. Export the statistics, try a setting, and adjust appropriately. The hit rate curve can be obtained by capturing an access trace (a log of key hashes) and simulating it at different sizes. It's interesting data, but usually a few simple tuning runs are good enough.
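To illustrate the trace-simulation idea with standard-library code only: the sketch below replays an access trace against a plain LRU cache at several sizes. The trace here is synthetic (a real run would replay the logged key hashes), and a plain LRU only approximates Caffeine's actual eviction policy, so treat the resulting hit rates as rough guidance.

```java
import java.util.*;

public class HitRateSim {
    // Minimal LRU cache built on LinkedHashMap with accessOrder = true.
    static final class Lru<K, V> extends LinkedHashMap<K, V> {
        private final int maxSize;
        Lru(int maxSize) {
            super(16, 0.75f, true);
            this.maxSize = maxSize;
        }
        @Override
        protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
            return size() > maxSize;
        }
    }

    // Replays a recorded access trace against an LRU of the given size
    // and returns the fraction of accesses that were hits.
    static double hitRate(List<Integer> trace, int cacheSize) {
        Lru<Integer, Boolean> cache = new Lru<>(cacheSize);
        long hits = 0;
        for (int key : trace) {
            if (cache.get(key) != null) {
                hits++;
            } else {
                cache.put(key, Boolean.TRUE);
            }
        }
        return trace.isEmpty() ? 0.0 : (double) hits / trace.size();
    }

    public static void main(String[] args) {
        // Synthetic skewed trace standing in for a real key-hash log.
        Random rnd = new Random(42);
        List<Integer> trace = new ArrayList<>();
        for (int i = 0; i < 100_000; i++) {
            trace.add((int) (Math.pow(rnd.nextDouble(), 2) * 10_000));
        }
        for (int size : new int[] {100, 1_000, 5_000}) {
            System.out.printf("size=%d hitRate=%.3f%n",
                    size, hitRate(trace, size));
        }
    }
}
```

Plotting hit rate against cache size this way shows where the curve flattens, which is usually the point of diminishing returns for a larger maximum size.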

Answered Sep 22 '22 14:09 by Ben Manes