
Spring Integration or Spring Cloud Data Flow

I'm in the process of moving some of my application to microservices. One example service is a RSS crawler which uses Spring Integration to send items to Kafka. I also see there is Spring Cloud Data Flow which uses Spring Integration and Spring Batch under the hood.

If I used Spring Cloud Data Flow, would I use this within each microservice or is it used as orchestration between microservices?

I'm trying to understand the benefits of introducing Spring Cloud Data Flow as opposed to keeping the microservices as light as possible using Spring Integration.

Any advice would be appreciated.

Swordfish asked Apr 19 '18 15:04


People also ask

What is the difference between spring and spring Cloud?

Spring Boot is a Java-based open-source framework for developing services. Its major goal is to cut down on development and testing time, and applications built with it need less Spring configuration than traditional Spring applications. Spring Cloud is a toolkit for, among other things, centralized configuration management.

What is spring Cloud data flow used for?

Spring Cloud Data Flow is a cloud-native toolkit for building real-time data pipelines and batch processes. Spring Cloud Data Flow is ready to be used for a range of data processing use cases like simple import/export, ETL processing, event streaming, and predictive analytics.

What are the advantages of using spring Cloud?

Spring Cloud provides tools for developers to quickly build some of the common patterns in distributed systems (e.g. configuration management, service discovery, circuit breakers, intelligent routing, micro-proxy, control bus, one-time tokens, global locks, leadership election, distributed sessions, cluster state).

Is Spring Cloud part of spring boot?

Spring Boot is a Java-based framework centered on auto-configuration for web applications. Spring Cloud builds on top of Spring Boot; Spring Boot itself is a stand-alone, application-centric framework.


1 Answer

The Spring Integration (SI) based "RSS crawler" service can be packaged as a Spring Cloud Stream application. Once you have done that, your "RSS crawler" becomes a standalone, event-driven microservice that can talk to Kafka, RabbitMQ, or other brokers, depending on the binder implementation on the classpath. The same app is a portable workload that can run on any public or private cloud platform.
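
To make that concrete, here is a minimal sketch of what such a Spring Cloud Stream source application could look like, assuming the functional binding model; the class and bean names are illustrative, and the broker is chosen purely by the binder dependency (for example, the Kafka or RabbitMQ binder) on the classpath:

import java.util.function.Supplier;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

// Illustrative sketch: a Spring Cloud Stream source application.
@SpringBootApplication
public class RssCrawlerApplication {

    public static void main(String[] args) {
        SpringApplication.run(RssCrawlerApplication.class, args);
    }

    // With the functional binding model, this Supplier is polled periodically
    // and each returned value is published to the bound output destination
    // (for example, a Kafka topic or a RabbitMQ exchange).
    @Bean
    public Supplier<String> rssItems() {
        // Placeholder: a real crawler would poll the RSS feed and emit new entries.
        return () -> "example RSS item";
    }
}
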

Once you have a number of such standalone applications, you need a higher-level orchestration layer to compose them into a coherent data pipeline. Spring Cloud Data Flow (SCDF) provides that layer. You can build a pipeline like the following with SCDF's DSL.

stream create foo --definition "rss-crawler | filter | transform | cassandra"

Here, four applications come together to form a data pipeline. Each one of them is a Spring Cloud Stream application that can be independently developed and tested in isolation. Finally, SCDF would deploy the applications onto target platforms like Cloud Foundry or Kubernetes as native applications.
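
As a follow-up, once the stream is defined it can be deployed from the SCDF shell; the deployment property shown below is illustrative and simply asks the deployer to run two instances of the source app:

stream deploy foo --properties "deployer.rss-crawler.count=2"
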

Hope this further clarifies.

Sabby Anandan answered Oct 23 '22 08:10