Using spring-boot and its caching mechanism, is it possible to automatically store all entities returned as a collection into the cache one by one?
For instance, picture the following Repository method:
@Query("...")
List<Foo> findFooByBar(Bar bar);
I'd like to insert these into a Spring Cache, one by one, meaning there would be N insertions (one for each element in the list) rather than just one (the whole list).
Example:
@Query("...")
@CachePut(value = "foos", key = "result.each.id")
List<Foo> findFooByBar(Bar bar);
If we want to enable caching in a Spring Boot application, we need to add the cache starter dependency (spring-boot-starter-cache) to the pom.xml file. This dependency brings in the caching infrastructure and allows Spring Boot to configure a CacheManager.
Spring Boot auto-configures the cache infrastructure as long as caching support is enabled via the @EnableCaching annotation.
The @EnableCaching annotation triggers a post-processor that inspects every Spring bean for the presence of caching annotations on public methods. If such an annotation is found, a proxy is automatically created to intercept the method call and handle the caching behavior accordingly.
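As a minimal sketch of that setup (the FooService class, its findById method and the Foo record below are hypothetical stand-ins, not part of the original question), the whole arrangement might look like this:

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cache.annotation.Cacheable;
import org.springframework.cache.annotation.EnableCaching;
import org.springframework.stereotype.Service;

// Turning on caching support; Spring Boot then auto-configures a CacheManager
// (a simple in-memory ConcurrentMapCacheManager unless a provider such as
// Caffeine or Redis is on the classpath).
@SpringBootApplication
@EnableCaching
public class Application {

    public static void main(String[] args) {
        SpringApplication.run(Application.class, args);
    }
}

// Hypothetical service: the proxy created by the caching post-processor
// intercepts calls to this public method and stores the single return value
// under the given key.
@Service
class FooService {

    @Cacheable(value = "foos", key = "#id")
    public Foo findById(Long id) {
        // ... expensive lookup here
        return new Foo(id);
    }
}

// Minimal stand-in for the Foo entity from the question.
record Foo(Long id) {}

Note that this caches one value per call, which is exactly the single-insertion behavior the question wants to go beyond.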
Some time ago, another person asked a similar/related question on SO and I provided an answer along with an example.
As you know, out of the box Spring does not handle multiple keys/values in the way that you suggested, though I like your thinking here and your example/use case is valid.
Often, however, you can achieve what you want with an intermediate solution and just a bit of extra work. Spring is an excellent example of the Open/Closed principle, and the two primary abstractions in Spring's Cache Abstraction are the Cache and CacheManager interfaces.
Typically, you can pick an existing implementation and "adapt" either the Cache or the CacheManager, or both, as I have done in my example.
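The linked example itself isn't reproduced here, but a minimal sketch of the "adapt the Cache" idea could look like the following; the CollectionSplittingCache name and the idExtractor function are assumptions for illustration, not Spring API:

import java.util.Collection;
import java.util.concurrent.Callable;
import java.util.function.Function;

import org.springframework.cache.Cache;

// Sketch: a decorator around any existing Cache implementation that stores
// collection values entry by entry instead of as a single list value.
public class CollectionSplittingCache implements Cache {

    private final Cache delegate;
    private final Function<Object, Object> idExtractor; // e.g. foo -> ((Foo) foo).getId()

    public CollectionSplittingCache(Cache delegate, Function<Object, Object> idExtractor) {
        this.delegate = delegate;
        this.idExtractor = idExtractor;
    }

    @Override
    public void put(Object key, Object value) {
        if (value instanceof Collection) {
            // N insertions, one per element, each keyed by the element's id
            for (Object element : (Collection<?>) value) {
                delegate.put(idExtractor.apply(element), element);
            }
        } else {
            delegate.put(key, value);
        }
    }

    // All other operations simply delegate to the wrapped cache.
    @Override public String getName() { return delegate.getName(); }
    @Override public Object getNativeCache() { return delegate.getNativeCache(); }
    @Override public ValueWrapper get(Object key) { return delegate.get(key); }
    @Override public <T> T get(Object key, Class<T> type) { return delegate.get(key, type); }
    @Override public <T> T get(Object key, Callable<T> valueLoader) { return delegate.get(key, valueLoader); }
    @Override public ValueWrapper putIfAbsent(Object key, Object value) { return delegate.putIfAbsent(key, value); }
    @Override public void evict(Object key) { delegate.evict(key); }
    @Override public void clear() { delegate.clear(); }
}

To make the annotated repository method actually use such a cache, you would also expose it through a custom CacheManager bean (or a small CacheManager wrapper), which is the "adapt the CacheManager" half of the same idea.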
Though not as ideal or convenient, hopefully this will give you some ideas until perhaps SPR-15213 is considered (though maybe not).
Cheers, John