I have to write data into multiple tables using Spring Batch. For example, I have two tables: a user table and an information table.
I don't want to use a stored procedure, but JdbcBatchItemWriter does not allow executing multiple SQL statements. I want to execute multiple SQL statements from a single writer.
I'm looking for pointers and general advice on the best way to do this.
I often need to do multiple inserts for each element, for example when reading from a structured file (like XML).
To handle that, I implement a custom ItemWriter class whose properties are the specific JdbcBatchItemWriter instances for each of my tables. Here's an example:
package my.package.writer;

import java.util.List;

import my.package.model.tbl.MyMainObject;
import my.package.model.tbl.MySubObject1;
import my.package.model.tbl.MySubObject2;

import org.springframework.batch.item.ItemWriter;
import org.springframework.transaction.annotation.Propagation;
import org.springframework.transaction.annotation.Transactional;

public class MainMultipleWriter implements ItemWriter<MyMainObject>
{
    private ItemWriter<MyMainObject> writerMyMainObject;
    private ItemWriter<MySubObject1> writerSubObject1;
    private ItemWriter<MySubObject2> writerSubObject2;

    @Transactional(readOnly = false, propagation = Propagation.REQUIRED)
    public void write(List<? extends MyMainObject> items) throws Exception
    {
        // Write the whole chunk to the main table first
        writerMyMainObject.write(items);

        // Then write each item's child collections to the secondary tables
        for (MyMainObject item : items)
        {
            writerSubObject1.write(item.getLstMySubObject1());
            writerSubObject2.write(item.getLstMySubObject2());
        }
    }

    // Setters used by Spring for property injection
    public void setWriterMyMainObject(ItemWriter<MyMainObject> writer) { this.writerMyMainObject = writer; }
    public void setWriterSubObject1(ItemWriter<MySubObject1> writer) { this.writerSubObject1 = writer; }
    public void setWriterSubObject2(ItemWriter<MySubObject2> writer) { this.writerSubObject2 = writer; }
}
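For reference, this assumes a domain object along these lines. This is only a minimal sketch: the two list accessors mirror what the writer above calls, while the id/name fields are hypothetical placeholders for your real columns.

package my.package.model.tbl;

import java.util.ArrayList;
import java.util.List;

// Minimal sketch of the item handled by the step: one main record plus the
// child records destined for the two secondary tables. The id/name fields
// are illustrative; the two list accessors are the ones the writer calls.
public class MyMainObject
{
    private long id;
    private String name;
    private List<MySubObject1> lstMySubObject1 = new ArrayList<MySubObject1>();
    private List<MySubObject2> lstMySubObject2 = new ArrayList<MySubObject2>();

    public long getId() { return id; }
    public void setId(long id) { this.id = id; }

    public String getName() { return name; }
    public void setName(String name) { this.name = name; }

    public List<MySubObject1> getLstMySubObject1() { return lstMySubObject1; }
    public List<MySubObject2> getLstMySubObject2() { return lstMySubObject2; }
}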
After this, configure your job to use the MainMultipleWriter class instead of a single JdbcBatchItemWriter. Here's an example of that configuration:
<!-- job -->
<batch:job id="myJob" job-repository="jobRepository" restartable="false">
    <batch:step id="myStep">
        <batch:tasklet transaction-manager="transactionManager">
            <batch:chunk commit-interval="2500" reader="myReader"
                         processor="myProcessor" writer="myMultiWriterBean" />
        </batch:tasklet>
    </batch:step>
</batch:job>

<!-- Writers -->
<bean id="myMainObjectWriter" class="org.springframework.batch.item.database.JdbcBatchItemWriter">
    <property name="itemPreparedStatementSetter">
        <bean class="my.package.setters.MyMainObjectStatementSetter" />
    </property>
    <property name="sql" value="--insert_statement--" />
    <property name="dataSource" ref="dataSource" />
</bean>

<bean id="mySubObject1Writer" class="org.springframework.batch.item.database.JdbcBatchItemWriter">
    <property name="itemPreparedStatementSetter">
        <bean class="my.package.setters.MySubObject1StatementSetter" />
    </property>
    <property name="sql" value="--insert_statement--" />
    <property name="dataSource" ref="dataSource" />
</bean>

<bean id="mySubObject2Writer" class="org.springframework.batch.item.database.JdbcBatchItemWriter">
    <property name="itemPreparedStatementSetter">
        <bean class="my.package.setters.MySubObject2StatementSetter" />
    </property>
    <property name="sql" value="--insert_statement--" />
    <property name="dataSource" ref="dataSource" />
</bean>
<bean id="myMultiWriterBean" class="my.package.writer.MainMultipleWriter">
<property name="writerMyMainObject" ref="mySubObject1Writer" />
<property name="writerSubObject1" ref="mySubObject2Writer" />
<property name="writerSubObject2" ref="mySubObject2Writer" />
</bean>
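The *StatementSetter beans referenced above implement Spring Batch's ItemPreparedStatementSetter interface: each one maps a single item onto the placeholders of its insert statement. Here is a minimal sketch for the main table; the column order and the getId()/getName() accessors are assumptions for illustration, not part of the original post.

package my.package.setters;

import java.sql.PreparedStatement;
import java.sql.SQLException;

import my.package.model.tbl.MyMainObject;

import org.springframework.batch.item.database.ItemPreparedStatementSetter;

// Maps one MyMainObject onto the '?' placeholders of the main-table insert.
// The getId()/getName() accessors are assumed for illustration.
public class MyMainObjectStatementSetter implements ItemPreparedStatementSetter<MyMainObject>
{
    public void setValues(MyMainObject item, PreparedStatement ps) throws SQLException
    {
        ps.setLong(1, item.getId());
        ps.setString(2, item.getName());
    }
}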
I think this is a cleaner implementation: you respect the chunk structure, and you don't have to worry about rollback, transactions, or data integrity yourself, because all the inserts for a chunk run in the step's transaction.