 

Spring Cloud Data Flow ignores datasources configured by a spring batch app

I'm setting up an instance of Spring Cloud Data Flow. I've run the following commands:

1. Run skipper server: java -jar spring-cloud-skipper-server-2.0.3.RELEASE.jar &
2. Run Dataflow server: java -jar spring-cloud-dataflow-server-2.1.2.RELEASE.jar \
    --spring.datasource.url=jdbc:postgresql://10.136.66.44:8080/springclouddataflow \
    --spring.datasource.username=springclouddataflow \
    --spring.datasource.password=123456 \
    --spring.datasource.driver-class-name=org.postgresql.Driver \
    --server.port=80 &

In the second step, I'm using a PostgreSQL database instead of the default H2.
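(A hedged aside, not from the original post: once both servers are up, they are typically driven from the matching SCDF shell, pointed at the Data Flow server started on port 80 above; the host below is a placeholder.)

java -jar spring-cloud-dataflow-shell-2.1.2.RELEASE.jar --dataflow.uri=http://<dataflow-host>:80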

I've developed a Spring Boot job using Spring Batch to be deployed on this platform. The job uses two datasources: springclouddataflow for the Spring Batch and Spring Cloud Task metadata, and billrun for my business logic. When I run the app locally, it persists metadata in springclouddataflow and my business data in billrun, as expected. The problem appears when I try to execute the job inside Spring Cloud Data Flow: the platform ignores my configured business-logic database and uses only the springclouddataflow database, which is supposed to store metadata only.

I've searched the official documentation. It explains how to use a different database for metadata storage and how to configure several databases in an application. I've followed the instructions, but without success.

application.properties

logging.level.org.springframework.cloud.task=debug
spring.datasource.initialization-mode=always
spring.batch.initialize-schema=always
spring.application.name=Bill Run
spring.datasource.jdbc-url=jdbc:postgresql://10.136.66.44:8080/springclouddataflow?useSSL=false
spring.datasource.username=springclouddataflow
spring.datasource.password=123456
spring.datasource.driver-class-name=org.postgresql.Driver
app.datasource.jdbc-url=jdbc:postgresql://10.136.66.44:8080/billrun?useSSL=false
app.datasource.username=springclouddataflow
app.datasource.password=123456
app.datasource.driver-class-name=org.postgresql.Driver

DatasourceConfiguration

@Configuration
public class DatasourceConfiguration {

    // Business-logic datasource (billrun), bound from the app.datasource.* properties.
    @Bean(name = "appDatasource")
    @ConfigurationProperties(prefix = "app.datasource")
    public DataSource sourceDataSource() {
        return DataSourceBuilder.create().build();
    }

    // Primary datasource (springclouddataflow), bound from spring.datasource.*,
    // used for the Spring Batch and Spring Cloud Task metadata.
    @Bean
    @Primary
    @ConfigurationProperties(prefix = "spring.datasource")
    public DataSource springDataSource() {
        return DataSourceBuilder.create().build();
    }

    // Pin Spring Cloud Task's metadata tables to the primary datasource.
    @Bean
    public TaskConfigurer taskConfigurer() {
        return new DefaultTaskConfigurer(springDataSource());
    }
}
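For completeness, a hedged sketch (not in my original code; the bean name is assumed from the method above) of pinning the Spring Batch metadata to the same primary datasource, mirroring what the TaskConfigurer does for task metadata, would be:

    @Bean
    public BatchConfigurer batchConfigurer(@Qualifier("springDataSource") DataSource metadataDataSource) {
        // DefaultBatchConfigurer builds the JobRepository against the given DataSource,
        // so the batch metadata stays in springclouddataflow next to the task metadata.
        return new DefaultBatchConfigurer(metadataDataSource);
    }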

BillingConfiguration

@Configuration
@EnableTask
@EnableBatchProcessing
public class BillingConfiguration {
    @Autowired
    public JobBuilderFactory jobBuilderFactory;

    @Autowired
    public StepBuilderFactory stepBuilderFactory;

    @Value("${usage.file.name:classpath:usageinfo.json}")
    private Resource usageResource;

    @Bean
    public Job job1(ItemReader<Usage> reader, ItemProcessor<Usage, Bill> itemProcessor, ItemWriter<Bill> writer) {
        Step step = stepBuilderFactory.get("BillProcessing").<Usage, Bill>chunk(1).reader(reader)
                .processor(itemProcessor).writer(writer).build();

        return jobBuilderFactory.get("BillJob").incrementer(new RunIdIncrementer()).start(step).build();
    }

    @Bean
    public JsonItemReader<Usage> jsonItemReader() {

        ObjectMapper objectMapper = new ObjectMapper();
        JacksonJsonObjectReader<Usage> jsonObjectReader = new JacksonJsonObjectReader<>(Usage.class);
        jsonObjectReader.setMapper(objectMapper);

        return new JsonItemReaderBuilder<Usage>().jsonObjectReader(jsonObjectReader).resource(usageResource)
                .name("UsageJsonItemReader").build();
    }

    @Bean
    public ItemWriter<Bill> jdbcBillWriter(@Qualifier("appDatasource") DataSource dataSource) {
        JdbcBatchItemWriter<Bill> writer = new JdbcBatchItemWriterBuilder<Bill>().beanMapped().dataSource(dataSource)
                .sql("INSERT INTO BILL_STATEMENTS (id, first_name, "
                        + "last_name, minutes, data_usage,bill_amount) VALUES "
                        + "(:id, :firstName, :lastName, :minutes, :dataUsage, " + ":billAmount)")
                .build();
        return writer;
    }

    @Bean
    ItemProcessor<Usage, Bill> billProcessor() {
        return new BillProcessor();
    }
}
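The BillProcessor referenced above is not shown; a minimal hypothetical sketch consistent with the reader and writer (the Usage and Bill accessors and the billing formula are assumptions, not from the original post) looks like this:

public class BillProcessor implements ItemProcessor<Usage, Bill> {

    @Override
    public Bill process(Usage usage) {
        // Hypothetical pricing: the per-minute and per-unit-of-data rates are assumptions.
        double billAmount = usage.getDataUsage() * 0.001 + usage.getMinutes() * 0.01;
        return new Bill(usage.getId(), usage.getFirstName(), usage.getLastName(),
                usage.getDataUsage(), usage.getMinutes(), billAmount);
    }
}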

I've tried passing the database properties as arguments when launching the task (screenshot omitted).

When I look at the databases, data is persisted only in springclouddataflow. How can I tell Spring Cloud Data Flow to use my application datasource (billrun)?

asked Nov 07 '22 by Giuliana Bezerra


1 Answer

It looks like you are customizing the Spring Cloud Data Flow server to use the application's data sources, which I don't think is needed.

You can start your SCDF server as you posted above:

1. Run skipper server: java -jar spring-cloud-skipper-server-2.0.3.RELEASE.jar &
2. Run Dataflow server: java -jar spring-cloud-dataflow-server-2.1.2.RELEASE.jar \
    --spring.datasource.url=jdbc:postgresql://10.136.66.44:8080/springclouddataflow \
    --spring.datasource.username=springclouddataflow \
    --spring.datasource.password=123456 \
    --spring.datasource.driver-class-name=org.postgresql.Driver \
    --server.port=80 &

and have your Spring Batch application pass its data source properties as Spring Boot properties instead of using a custom data source configuration as you did above.
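For example (an illustrative sketch only; the registered task name billrun is an assumption, and the connection values are taken from the question), the business database coordinates could be supplied as plain Boot properties when launching the task from the SCDF shell:

task launch billrun --arguments "--spring.datasource.url=jdbc:postgresql://10.136.66.44:8080/billrun --spring.datasource.username=springclouddataflow --spring.datasource.password=123456 --spring.datasource.driver-class-name=org.postgresql.Driver"

With no custom DataSource beans in the application, Spring Boot's auto-configuration then builds the datasource from those properties.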

You can find the Spring Batch application development guide, which uses a similar approach, here.

answered Nov 15 '22 by Ilayaperumal Gopinathan