
How do you update a value in a Quartz JobDataMap?

I'm using quartz-scheduler 1.8.5. I've created a Job implementing StatefulJob. I schedule the job using a SimpleTrigger and StdSchedulerFactory.

It seems that I have to update the Trigger's JobDataMap in addition to the JobDetail's JobDataMap in order to change the JobDataMap from inside the Job. I'm trying to understand why it's necessary to update both. I noticed that the JobDataMap gets marked as dirty; maybe I have to explicitly save it or something?

I'm thinking I'll have to dig into the source code of Quartz to really understand what is going on here, but I figured I'd be lazy and ask first. Thanks for any insight into the inner workings of JobDataMap!

Here's my job:

public class HelloJob implements StatefulJob {

    public HelloJob() {
    }

    public void execute(JobExecutionContext context)
            throws JobExecutionException {

        int count = context.getMergedJobDataMap().getInt("count");
        int count2 = context.getJobDetail().getJobDataMap().getInt("count");
        //int count3 = context.getTrigger().getJobDataMap().getInt("count");
        System.err.println("HelloJob is executing. Count: '"+count+"', "+count2+"'");

        //The count only gets updated if I updated both the Trigger and 
                // JobDetail DataMaps. If I only update the JobDetail, it doesn't persist. 
        context.getTrigger().getJobDataMap().put("count", count++);
        context.getJobDetail().getJobDataMap().put("count", count++);

        //This has no effect inside the job, but it works outside the job
        try {
            context.getScheduler().addJob(context.getJobDetail(), true);
        } catch (SchedulerException e) {
            // TODO Auto-generated catch block
            e.printStackTrace();
        }

        //These don't seem to persist between jobs
        //context.put("count", count++);
        //context.getMergedJobDataMap().put("count", count++);
    }
}

Here's how I'm scheduling the job:

try {
    // define the job and tie it to our HelloJob class
    JobDetail job = new JobDetail(JOB_NAME, JOB_GROUP_NAME,
            HelloJob.class);
    job.getJobDataMap().put("count", 1);
    // Trigger the job to run now, and every so often
    Trigger trigger = new SimpleTrigger("myTrigger", "group1",
            SimpleTrigger.REPEAT_INDEFINITELY, howOften);

    // Tell quartz to schedule the job using our trigger
    sched.scheduleJob(job, trigger);
    return job;
} catch (SchedulerException e) {
    throw e;
}

Update:

It seems that I have to put the value into the JobDetail's JobDataMap twice to get it to persist. This works:

public class HelloJob implements StatefulJob {

    public HelloJob() {
    }

    public void execute(JobExecutionContext context)
            throws JobExecutionException {

        int count = (Integer) context.getMergedJobDataMap().get("count");
        System.err.println("HelloJob is executing. Count: '"+count+"', and is the job stateful? "+context.getJobDetail().isStateful());
        context.getJobDetail().getJobDataMap().put("count", count++);
        context.getJobDetail().getJobDataMap().put("count", count++);
    }
}

This seems like a bug, maybe? Or maybe there's a step I'm missing to tell the JobDetail to flush the contents of its JobDataMap to the JobStore?

asked Apr 29 '11 by Upgradingdave

2 Answers

I think your problem is your use of the postfix ++ operator. When you do:

context.getJobDetail().getJobDataMap().put("count", count++);  

you're setting the value in the map to count and THEN incrementing count.
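For illustration, here's the difference in isolation (a plain java.util.HashMap, nothing Quartz-specific):

import java.util.HashMap;
import java.util.Map;

public class PostfixDemo {
    public static void main(String[] args) {
        Map<String, Object> map = new HashMap<String, Object>();
        int count = 1;
        map.put("count", count++); // stores 1, then count becomes 2
        System.out.println(map.get("count")); // prints 1
        map.put("count", ++count); // count becomes 3 first, then 3 is stored
        System.out.println(map.get("count")); // prints 3
    }
}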

To me it looks like you wanted:

context.getJobDetail().getJobDataMap().put("count", ++count);  

which would only need to be done once.
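Applied to the job from the question, a minimal sketch of the fix (against the Quartz 1.8.5 API; imports added for completeness):

import org.quartz.JobExecutionContext;
import org.quartz.JobExecutionException;
import org.quartz.StatefulJob;

public class HelloJob implements StatefulJob {

    public void execute(JobExecutionContext context)
            throws JobExecutionException {

        int count = context.getMergedJobDataMap().getInt("count");
        System.err.println("HelloJob is executing. Count: '" + count + "'");

        // Pre-increment so the stored value is count + 1. Because this is a
        // StatefulJob, Quartz writes the JobDetail's JobDataMap back to the
        // JobStore after execute() returns, so a single put is enough.
        context.getJobDetail().getJobDataMap().put("count", ++count);
    }
}

With the pre-increment, each execution sees the value left by the previous one, and neither the trigger's map nor a second put needs to be touched.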

answered Oct 09 '22 by Greg Mitchell


As you know, in Quartz the trigger and the job are separate entities, rather than combined as they are in some schedulers. Giving the trigger its own JobDataMap lets you attach values that are specific to a particular trigger rather than to the job itself.

I think it allows you to execute the same job with different sets of data, while still keeping some common data at the job level.
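A sketch of that idea against the Quartz 1.8.5 API from the question (the job class, names, and values are made up for illustration; sched is the Scheduler from the question):

// One job definition with data common to every run...
JobDetail job = new JobDetail("reportJob", "group1", ReportJob.class);
job.getJobDataMap().put("outputDir", "/tmp/reports");

// ...and two triggers that each contribute their own data.
SimpleTrigger daily = new SimpleTrigger("daily", "group1",
        SimpleTrigger.REPEAT_INDEFINITELY, 24L * 60 * 60 * 1000);
daily.getJobDataMap().put("reportType", "summary");

SimpleTrigger weekly = new SimpleTrigger("weekly", "group1",
        SimpleTrigger.REPEAT_INDEFINITELY, 7L * 24 * 60 * 60 * 1000);
weekly.getJobDataMap().put("reportType", "detailed");

sched.scheduleJob(job, daily);

// The second trigger points at the already-stored job instead of re-adding it.
weekly.setJobName("reportJob");
weekly.setJobGroup("group1");
sched.scheduleJob(weekly);

Inside execute(), context.getMergedJobDataMap() combines both maps, with trigger entries overriding same-named job entries, so each trigger can run the job with its own data.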

answered Oct 09 '22 by Chris Pritchard