I plan to load a cache concurrently from multiple threads. The simplest form of this would be:
IgniteCache<Integer, Integer> cache = ignite.getOrCreateCache("ints");
ExecutorService es = Executors.newFixedThreadPool(Runtime.getRuntime().availableProcessors());

for (int i = 0; i < 20000000; i++) {
    int t = i;
    es.submit(() -> {
        cache.put(t, t);
    });
}
Is it safe to do it like that? I read the documentation of the put method:
Associates the specified value with the specified key in the cache. If the Cache previously contained a mapping for the key, the old value is replaced by the specified value. (A cache c is said to contain a mapping for a key k if and only if c.containsKey(k) would return true.)
There is not a word about thread-safety. So is it safe to put into an IgniteCache concurrently?
The answer is yes: all Ignite APIs are thread-safe and can be used concurrently from multiple threads.
However, doing individual puts is not an effective way to load data; there are better techniques for this. Please refer to this page for details: https://apacheignite.readme.io/docs/data-loading
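For bulk loading, Ignite's IgniteDataStreamer batches updates and routes them directly to the nodes that own the keys, which is usually much faster than issuing individual puts from a thread pool. Below is a minimal sketch, not a drop-in replacement for your code: the class name StreamerLoad is made up for illustration, and it reuses the "ints" cache name and the 20,000,000-entry count from your question.

import org.apache.ignite.Ignite;
import org.apache.ignite.IgniteDataStreamer;
import org.apache.ignite.Ignition;

public class StreamerLoad {
    public static void main(String[] args) {
        Ignite ignite = Ignition.start();

        // The cache must exist before a streamer is created for it.
        ignite.getOrCreateCache("ints");

        // try-with-resources flushes any buffered entries and closes the streamer.
        try (IgniteDataStreamer<Integer, Integer> streamer = ignite.dataStreamer("ints")) {
            // allowOverwrite(false) (the default) skips keys that already exist, which is faster.
            streamer.allowOverwrite(false);

            for (int i = 0; i < 20_000_000; i++) {
                // addData() is asynchronous; entries are buffered and sent to nodes in batches.
                streamer.addData(i, i);
            }
        }
    }
}

The streamer itself is thread-safe, so if you still want to parallelize the load, you can share one IgniteDataStreamer instance across the threads of your executor instead of calling cache.put() from each task.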
Yes, all IgniteCache methods are thread-safe, as are most other Ignite APIs.