
ThreadPoolExecutor Block When Queue Is Full?

I am trying to execute lots of tasks using a ThreadPoolExecutor. Below is a hypothetical example:

```java
def workQueue = new ArrayBlockingQueue<Runnable>(3, false)
def threadPoolExecutor = new ThreadPoolExecutor(3, 3, 1L, TimeUnit.HOURS, workQueue)
for (int i = 0; i < 100000; i++)
    threadPoolExecutor.execute(runnable)
```

The problem is that I quickly get a java.util.concurrent.RejectedExecutionException, since the number of tasks exceeds the size of the work queue. However, the behavior I want is for the main thread to block until there is room in the queue. What is the best way to accomplish this?

asked Aug 10 '10 by ghempton


1 Answer

In some very narrow circumstances, you can implement a java.util.concurrent.RejectedExecutionHandler that does what you need.

```java
RejectedExecutionHandler block = new RejectedExecutionHandler() {
    public void rejectedExecution(Runnable r, ThreadPoolExecutor executor) {
        try {
            executor.getQueue().put(r);   // blocks until the queue has room
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new RejectedExecutionException("Interrupted while queuing task", e);
        }
    }
};

// pool configured as in the question
ThreadPoolExecutor pool = new ThreadPoolExecutor(3, 3, 1L, TimeUnit.HOURS,
        new ArrayBlockingQueue<Runnable>(3));
pool.setRejectedExecutionHandler(block);
```

Now, this is a very bad idea, for the following reasons:

  • It's prone to deadlock, because all the threads in the pool may die before the task you put in the queue becomes visible. Mitigate this by setting a reasonable keep-alive time.
  • The task is not wrapped the way your Executor may expect. Lots of executor implementations wrap their tasks in some sort of tracking object before execution. Look at the source of yours.
  • Adding via getQueue() is strongly discouraged by the API, and may be prohibited at some point.

An almost-always-better strategy is to install ThreadPoolExecutor.CallerRunsPolicy, which throttles your app by running the rejected task on the thread that called execute().
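As a concrete illustration (the pool and queue sizes are taken from the question; CallerRunsDemo and runAll are illustrative names of mine), wiring in CallerRunsPolicy looks like this:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;
import java.util.concurrent.atomic.AtomicInteger;

public class CallerRunsDemo {
    // Submits `tasks` trivial jobs; returns how many actually ran.
    static int runAll(int tasks) throws InterruptedException {
        AtomicInteger done = new AtomicInteger();
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                3, 3, 1L, TimeUnit.HOURS,
                new ArrayBlockingQueue<Runnable>(3),
                new ThreadPoolExecutor.CallerRunsPolicy()); // overflow runs on the submitting thread
        for (int i = 0; i < tasks; i++) {
            pool.execute(done::incrementAndGet);
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.MINUTES);
        return done.get();
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(runAll(1000));
    }
}
```

Because rejected tasks run on the submitting thread, submission naturally slows down under load instead of throwing RejectedExecutionException.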

However, sometimes a blocking strategy, with all its inherent risks, really is what you want. I'd say it's justified under these conditions:

  • You only have one thread calling execute()
  • You have to (or want to) have a very small queue length
  • You absolutely need to limit the number of threads running this work (usually for external reasons), and a caller-runs strategy would break that.
  • Your tasks are of unpredictable size, so caller-runs could introduce starvation if the pool was momentarily busy with 4 short tasks and your one thread calling execute got stuck with a big one.
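If those conditions do hold, one commonly seen way to get a blocking submit (a sketch under my own assumptions, not part of the answer above; BlockingPool is an invented name) is to gate execute() with a Semaphore sized to the queue capacity, releasing a permit once a worker picks the task up:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.RejectedExecutionException;
import java.util.concurrent.Semaphore;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

// Sketch of a pool whose execute() blocks the caller when the queue is full.
public class BlockingPool extends ThreadPoolExecutor {
    private final Semaphore queueSlots;

    public BlockingPool(int threads, int queueCapacity) {
        super(threads, threads, 1L, TimeUnit.HOURS,
              new ArrayBlockingQueue<Runnable>(queueCapacity));
        this.queueSlots = new Semaphore(queueCapacity);
    }

    @Override
    public void execute(Runnable task) {
        try {
            queueSlots.acquire();      // blocks the caller while the queue is full
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
            throw new RejectedExecutionException("Interrupted waiting for queue space", e);
        }
        super.execute(task);
    }

    @Override
    protected void beforeExecute(Thread t, Runnable r) {
        queueSlots.release();          // a worker picked the task up; a slot is free again
        super.beforeExecute(t, r);
    }
}
```

Unlike the getQueue().put() handler, this keeps the executor's own task-wrapping and rejection machinery intact, since tasks still enter the pool through the normal execute() path.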

So, as I say. It's rarely needed and can be dangerous, but there you go.

Good Luck.

answered Sep 27 '22 by Darren Gilroy