I am trying to parse a CSV file with FlatFileItemReader. The CSV contains fields with quoted newline characters, as shown below:
email, name
[email protected], "NEW NAME
ABC"
But parsing fails with an error saying the expected number of fields is 2 but the actual is 1.
What am I missing in my FlatFileItemReader configuration?
<property name="lineMapper">
<bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
<!-- The lineTokenizer divides individual lines up into units of work -->
<property name="lineTokenizer">
<bean
class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
<!-- Names of the CSV columns -->
<property name="names"
value="email,name" />
</bean>
</property>
<!-- The fieldSetMapper maps a line in the file to a Customer object -->
<property name="fieldSetMapper">
<bean
class="com.abc.testme.batchjobs.util.CustomerFieldSetMapper" />
</property>
</bean>
</property>
Out of the box, the FlatFileItemReader uses a SimpleRecordSeparatorPolicy, which treats every physical line as a complete record. For your use case
you need to set a DefaultRecordSeparatorPolicy instead.
Cited from its javadoc:
A RecordSeparatorPolicy that treats all lines as record endings, as long as they do not have unterminated quotes, and do not end in a continuation marker.
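To see why this matters, here is a simplified, hypothetical sketch of the unterminated-quote rule the javadoc describes (this is not the actual Spring Batch source): a line containing an odd number of double quotes is treated as an unfinished record, so the reader keeps reading and appends the next physical line before tokenizing.

```java
public class QuoteCheckDemo {
    // Simplified illustration of the unterminated-quote rule: an odd number of
    // double quotes means a quoted field is still open, so the line is NOT a
    // complete record and the reader continues with the next physical line.
    static boolean isEndOfRecord(String line) {
        long quotes = line.chars().filter(c -> c == '"').count();
        return quotes % 2 == 0; // even quote count => record is complete
    }

    public static void main(String[] args) {
        System.out.println(isEndOfRecord("[email protected], \"NEW NAME"));      // false: quote still open
        System.out.println(isEndOfRecord("[email protected], \"NEW NAME\nABC\"")); // true: quote closed
    }
}
```

With SimpleRecordSeparatorPolicy, each physical line is tokenized on its own, which is why the first line of your sample yields only one field.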
Example XML configuration:
<bean id="reader"
class="org.springframework.batch.item.file.FlatFileItemReader">
...
<property name="recordSeparatorPolicy">
<bean class="org.springframework.batch.item.file.separator.DefaultRecordSeparatorPolicy" />
</property>
...
</bean>
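If you configure the reader in Java instead of XML, the equivalent setup might look like the sketch below (assuming the Spring Batch 4+ builder API and your existing CustomerFieldSetMapper; file name and reader name are placeholders):

```java
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.builder.FlatFileItemReaderBuilder;
import org.springframework.batch.item.file.separator.DefaultRecordSeparatorPolicy;
import org.springframework.core.io.FileSystemResource;

// Sketch of the same reader configured via the builder API.
FlatFileItemReader<Customer> reader = new FlatFileItemReaderBuilder<Customer>()
        .name("customerReader")                       // hypothetical reader name
        .resource(new FileSystemResource("customers.csv")) // placeholder path
        .linesToSkip(1)                               // skip the "email, name" header
        .delimited()
        .names("email", "name")
        .fieldSetMapper(new CustomerFieldSetMapper())
        .recordSeparatorPolicy(new DefaultRecordSeparatorPolicy()) // handles quoted newlines
        .build();
```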