
Delegate processing during Tomcat startup with Spring

I have defined a bean which needs to do some heavy processing during the @PostConstruct lifecycle phase (i.e., during startup).

As it stands, I submit a new Callable to an executor service with each iteration of the processing loop. I keep a list of the Future objects returned from these submissions in a member variable.

@Component
@Scope("singleton")
public class StartupManager implements ApplicationListener<ContextRefreshedEvent> {

    @Autowired
    private ExecutorService executorService;

    // Assumed to be Spring's RequestMappingHandlerMapping, which exposes getHandlerMethods()
    @Autowired
    private RequestMappingHandlerMapping controllers;

    private final Map<Class<?>, Optional<Action>> actionMappings = new ConcurrentHashMap<>();
    private final List<Future<?>> processingTasks = Collections.synchronizedList(new ArrayList<>());

    @PostConstruct
    public void init() {

        this.controllers.getHandlerMethods().entrySet().stream().forEach(handlerItem -> {

            // one task per handler method; the Future is kept so completion can be awaited later
            processingTasks.add(executorService.submit(() -> {

                // processing

            }));

        });

    }

}

This same bean implements the ApplicationListener interface so it can listen for a ContextRefreshedEvent, which lets me detect when the application has finished starting up. In that handler I loop through the list of Futures and call the blocking get() method, which ensures that all of the processing has completed before the application continues.

@Override
public void onApplicationEvent(ContextRefreshedEvent applicationEvent) {
    for (Future<?> task : this.processingTasks) {
        try {
            task.get(); // block until this startup task has completed
        } catch (InterruptedException | ExecutionException e) {
            throw new IllegalStateException(e); // keep the original exception as the cause
        }
    }
}

My first question: is changing the actionMapping stream to a parallelStream going to achieve the same thing as submitting each task to the executor service? And is there a way to pass an existing executor service into a parallel stream so that it uses the thread pool size I've defined for the bean?

Secondly, as part of the processing, the actionMappings map is both read from and written to. Is it sufficient to make this map a ConcurrentHashMap to make it thread-safe in this scenario?

And finally, is implementing the ApplicationListener interface and listening for the ContextRefreshedEvent the best way to detect when the application has started up, and therefore to force any unfinished tasks to complete by blocking? Or can this be done another way?

Thanks.

Ricky Davis asked Nov 23 '25 17:11


1 Answer

  1. About using parallelStream(): no, and this is precisely its main drawback. A parallel stream runs on the shared common ForkJoinPool, so you cannot hand it your own thread pool; it should be used only when the pool size doesn't matter. I therefore think your ExecutorService-based approach is fine.

    Since you are working with Java 8, you could just as well use CompletableFuture.supplyAsync(), which has an overload that takes an Executor. Since ExecutorService extends Executor, you can pass it your ExecutorService and you're done! (A sketch of this follows after the list.)

  2. I think a ConcurrentHashMap is fine. It guarantees thread safety for all of its operations, in particular when entries are added or modified.

  3. When is a ContextRefreshedEvent fired? According to the Javadoc:

    Event raised when an ApplicationContext gets initialized or refreshed.

    which doesn't guarantee that your onApplicationEvent() method will be called once and only once, i.e. exactly when your bean has been properly initialized (which includes execution of the @PostConstruct-annotated method).

    I suggest you implement the BeanPostProcessor interface and put your Future-checkup logic in the postProcessAfterInitialization() method; a sketch follows after the list. The two BeanPostProcessor methods are called before and after the InitializingBean.afterPropertiesSet() method (if present), respectively.
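To make point 1 concrete, here is a minimal sketch of what the startup method could look like with CompletableFuture instead of raw Futures. It reuses the executorService and controllers fields from the question; process() is a hypothetical placeholder for the heavy per-handler work, and runAsync() is used rather than supplyAsync() because no return value is needed (both have an overload that accepts an Executor, which your ExecutorService satisfies).

private final List<CompletableFuture<Void>> processingTasks =
        Collections.synchronizedList(new ArrayList<>());

@PostConstruct
public void init() {
    this.controllers.getHandlerMethods().forEach((mapping, handlerMethod) ->
            processingTasks.add(CompletableFuture.runAsync(
                    () -> process(handlerMethod),  // hypothetical heavy-processing step
                    executorService)));            // any Executor works here, including your ExecutorService
}

// Waiting for everything to finish then becomes a single call,
// e.g. inside a hypothetical awaitProcessing() method:
public void awaitProcessing() {
    CompletableFuture.allOf(processingTasks.toArray(new CompletableFuture[0])).join();
}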
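And for point 3, a sketch of how the Future-checkup logic could move into a BeanPostProcessor. The awaitProcessing() method is the hypothetical blocking wait shown above; it replaces the loop currently living in onApplicationEvent().

@Component
public class StartupTaskPostProcessor implements BeanPostProcessor {

    @Override
    public Object postProcessBeforeInitialization(Object bean, String beanName) {
        return bean; // nothing to do before initialization
    }

    @Override
    public Object postProcessAfterInitialization(Object bean, String beanName) {
        if (bean instanceof StartupManager) {
            // Runs after the bean's @PostConstruct method has completed,
            // so all startup tasks have already been submitted.
            ((StartupManager) bean).awaitProcessing();
        }
        return bean;
    }
}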

I hope this will be helpful...

Cheers,

Jeff

Jeff Morin answered Nov 26 '25 06:11