Spring Cloud Data Flow Primary Use Case

Mohammad Masudu Rahaman
Founder at NeedHelps?
Most of our use cases involve building data pipelines. Multiple microservices run on the Spring Cloud Data Flow infrastructure, and we build pipelines that process data step by step using Kafka. Most of the sources, processors, and sinks are developed based on the customers' business requirements or use cases. For example, for a bank we work with, we are building a document analysis pipeline. There are defined sources from which we get the documents. We then extract information from each document's summary and export the data to multiple destinations; we may export it to a PostgreSQL database and/or to a Kafka topic. For CoreLogic, we were importing data into Elasticsearch. We had a BigQuery data source, and from there we did some transformation of the data and then loaded it into the Elasticsearch clusters. That was the ETL solution.
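
A pipeline like the document one described above is typically composed in SCDF's stream DSL (for example `document-source | extract-summary | jdbc`) from individual Spring Cloud Stream applications. Below is a minimal sketch of what the middle processor step could look like using Spring Cloud Stream's functional model, assuming the Kafka binder is on the classpath; the class, record, and bean names are illustrative assumptions, not the reviewer's actual code.

```java
import java.util.function.Function;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.annotation.Bean;

// Hypothetical processor step for a document analysis pipeline.
// SCDF binds the function's input and output to Kafka topics at deploy time.
@SpringBootApplication
public class ExtractSummaryProcessor {

    // Illustrative payloads; real documents would carry business fields.
    record Document(String id, String text) {}
    record Summary(String id, String extract) {}

    // The bean name ("extractSummary") is what the stream DSL refers to.
    // The "extraction" below is a stand-in for the real analysis logic.
    @Bean
    public Function<Document, Summary> extractSummary() {
        return doc -> new Summary(
                doc.id(),
                doc.text().substring(0, Math.min(200, doc.text().length())));
    }

    public static void main(String[] args) {
        SpringApplication.run(ExtractSummaryProcessor.class, args);
    }
}
```

Fanning the results out to both a database and a Kafka topic, as described in the review, would normally be done with SCDF's named destinations or stream taps rather than inside the processor itself.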
Saket Puranik
Senior Platform Associate L2 at a tech services company with 10,001+ employees
In my last project, I worked on Spring Cloud Data Flow (SCDF). We created a stream using this product, with the Spring Kafka binder. The project involved creating a data lake for our clients: the platform we built maintained a data lake for internet banking users and provided an out-of-the-box solution for integrating with it. We used SCDF to gather the data and to run our ETL (extract, transform, and load) pipelines.
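
For orchestrating a stream like this, SCDF also offers a Java DSL (in the `spring-cloud-dataflow-rest-client` module) on top of its REST API, so a pipeline can be defined and deployed programmatically instead of through the shell or dashboard. Here is a minimal sketch under stated assumptions: an SCDF server running at localhost:9393, and the standard `http`, `transform`, and `log` starter apps already registered with it; the stream name and definition are placeholders, not the reviewer's setup.

```java
import java.net.URI;

import org.springframework.cloud.dataflow.rest.client.DataFlowTemplate;
import org.springframework.cloud.dataflow.rest.client.dsl.Stream;

public class DeployIngestStream {

    public static void main(String[] args) {
        // Points at a locally running SCDF server; adjust the URI as needed.
        DataFlowTemplate dataFlow =
                new DataFlowTemplate(URI.create("http://localhost:9393"));

        // Define and deploy a three-stage stream. The app names must already
        // be registered with the server; they are generic placeholders here.
        Stream ingest = Stream.builder(dataFlow)
                .name("datalake-ingest")
                .definition("http | transform | log")
                .create()
                .deploy();

        // The status moves from "deploying" to "deployed" once the apps start.
        System.out.println("Stream status: " + ingest.getStatus());
    }
}
```

In a real data-lake setup the `log` sink would be replaced by a sink app writing to the lake's storage, and the messages would flow over Kafka topics created by the binder between each pair of apps.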