 

Why is the Simple Least Recently Used Cache Mechanism used?

I am using JProfiler to inspect a Java microservice while I simulate concurrent users with JMeter. In JProfiler I can see the Overview, Thread History, Monitor Usage, and Monitor History views. Navigating to the method find(), I realized the method is declared with the synchronized keyword.

In my opinion this method causes the problem with blocked threads. But why is it used? Can I disable this cache mechanism in the microservice? The microservice is written in Java and it uses Spring and Spring Boot.

Thank you

Update: I added a screenshot of the Monitor History view from the same JProfiler snapshot to show the time spent in the ResolvedTypeCache class. Sometimes the time is small, but sometimes it is huge.

Adrian asked Mar 02 '18 08:03


People also ask

What is LRU and how you can implement it?

Least Recently Used (LRU) is a common caching strategy. It defines the policy to evict elements from the cache to make room for new elements when the cache is full, meaning it discards the least recently used items first. Let's take an example of a cache that has a capacity of 4 elements.

How does LRU cache algorithm work?

The Least Recently Used (LRU) cache is a cache eviction algorithm that organizes elements in order of use. In LRU, as the name suggests, the element that hasn't been used for the longest time will be evicted from the cache.

Which data structure is used in LRU?

An LRU cache can easily be implemented with a queue data structure. The Least Recently Used (LRU) cache is a caching technique that arranges data in order based on usage; the queue is typically implemented as a doubly linked list, paired with a hash map for O(1) lookup.
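In Java, the hash-map-plus-doubly-linked-list combination described above is already packaged as LinkedHashMap. A minimal sketch (the class name LruCache is illustrative, using the capacity of 4 from the earlier example):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Minimal LRU cache: accessOrder=true makes LinkedHashMap move each
// accessed entry to the tail of its internal doubly linked list, so the
// head is always the least recently used entry.
class LruCache<K, V> extends LinkedHashMap<K, V> {
    private final int capacity;

    LruCache(int capacity) {
        super(16, 0.75f, true); // true = order by access, not insertion
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<K, V> eldest) {
        return size() > capacity; // evict the LRU entry once over capacity
    }
}
```

With capacity 4, inserting a, b, c, d, then reading a, then inserting e evicts b: the read moved a to the most-recently-used position, leaving b as the eldest entry.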

What is meant by least recently used?

(operating systems) (LRU) A rule used in a paging system which selects a page to be paged out if it has been used (read or written) less recently than any other page. The same rule may also be used in a cache to select which cache entry to flush.


1 Answer

Why is LRU used? Presumably because there's something worth caching.

Why is it synchronized? Because the LinkedHashMap that's being used as a cache here is not thread-safe. It does provide the idiomatic LRU mechanism though.
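A sketch of the pattern being described, not the actual ResolvedTypeCache source (class and method names here are hypothetical): a non-thread-safe LinkedHashMap in access order, guarded by a synchronized method, so every concurrent caller blocks on the same monitor. Those blocked callers are exactly what shows up in JProfiler's Monitor History.

```java
import java.util.LinkedHashMap;
import java.util.Map;

class TypeCache {
    private static final int MAX_SIZE = 100; // assumed capacity

    // LinkedHashMap in access order gives idiomatic LRU eviction,
    // but it is not thread-safe, hence the synchronized method below.
    private final Map<String, Object> cache =
        new LinkedHashMap<String, Object>(16, 0.75f, true) {
            @Override
            protected boolean removeEldestEntry(Map.Entry<String, Object> e) {
                return size() > MAX_SIZE;
            }
        };

    // All threads contend on this single monitor; under heavy
    // concurrency the waiting shows up as blocked threads.
    synchronized Object find(String key) {
        Object value = cache.get(key);
        if (value == null) {
            value = resolve(key);
            cache.put(key, value);
        }
        return value;
    }

    private Object resolve(String key) {
        return new Object(); // placeholder for the expensive resolution
    }
}
```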

It could be replaced with a ConcurrentMap to mitigate the synchronization, but a ConcurrentMap has no eviction policy, so you'd have a constantly growing non-LRU cache, and that's not at all the same thing.

Now there's not much you can do about it. The best idea might be to contact the devs and let them know about this. All in all, the library may just not be suitable for the amount of traffic you're putting through it, or you may be simulating the kind of traffic that would exhibit pathological behaviour, or you may be overestimating the impact of this (no offense, I'm just very Mulderesque about SO posts, i.e. "trust no one").

Finally, uncontended synchronization is cheap, so if there's a possibility to divide traffic among multiple instances of the cache, it may affect performance in some way (not necessarily positively). I don't know the architecture of the library, though, so it may be completely impossible.
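The "divide traffic among multiple instances" idea is essentially lock striping. A hedged sketch under assumed names (StripedCache is illustrative, and per-stripe LRU eviction is omitted for brevity):

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Lock striping: spread keys over several independently synchronized
// maps so threads contend on different monitors instead of one.
class StripedCache<K, V> {
    private final Map<K, V>[] stripes;

    @SuppressWarnings("unchecked")
    StripedCache(int nStripes) {
        stripes = new Map[nStripes];
        for (int i = 0; i < nStripes; i++) {
            stripes[i] = new HashMap<>();
        }
    }

    private Map<K, V> stripeFor(K key) {
        // pick a stripe from the key's hash; floorMod keeps it non-negative
        return stripes[Math.floorMod(key.hashCode(), stripes.length)];
    }

    V find(K key, Function<K, V> resolver) {
        Map<K, V> stripe = stripeFor(key);
        synchronized (stripe) { // only this stripe's callers block here
            return stripe.computeIfAbsent(key, resolver);
        }
    }
}
```

Whether this helps depends on the library's architecture, as the answer notes; if every lookup must see one shared cache, striping changes the semantics.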

Kayaman answered Nov 08 '22 22:11