Spring Cloud Data Flow is a microservices-based toolkit for building streaming and batch data processing pipelines in Cloud Foundry and Kubernetes.
Data processing pipelines consist of Spring Boot apps, built using the Spring Cloud Stream or Spring Cloud Task microservice frameworks.
This makes Spring Cloud Data Flow ideal for a range of data processing use cases, from import/export to event streaming and predictive analytics.
Architecture: The Spring Cloud Data Flow Server is a Spring Boot application that provides a RESTful API and REST clients (Shell, Dashboard, Java DSL). A single Spring Cloud Data Flow installation can orchestrate the deployment of streams and tasks to Local, Cloud Foundry, and Kubernetes.
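For example, once a Data Flow Server is running you can query its REST API directly; the command below is a minimal sketch that assumes a local server on the default port 9393 and uses the about endpoint to report version and feature information:
$ curl http://localhost:9393/about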
Familiarize yourself with the Spring Cloud Data Flow architecture and feature capabilities.
Deployer SPI: A Service Provider Interface (SPI) is defined in the Spring Cloud Deployer project. The Deployer SPI provides an abstraction layer for deploying the apps for a given streaming or batch data pipeline, and managing the application lifecycle.
Spring Cloud Deployer Implementations: Implementations of the Deployer SPI are available for the Local, Cloud Foundry, and Kubernetes platforms.
Domain Model: The Spring Cloud Data Flow domain module includes the concept of a stream that is a composition of Spring Cloud Stream applications in a linear data pipeline from a source to a sink, optionally including processor application(s) in between. The domain also includes the concept of a task, which may be any process that does not run indefinitely, including Spring Batch jobs.
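For example, a stream is expressed in a pipe-delimited DSL; the definition below is a sketch that assumes the standard http source, transform processor, and log sink applications have been registered:
http | transform --expression=payload.toUpperCase() | log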
Application Registry: The App Registry maintains the metadata of the catalog of reusable applications. For example, if relying on Maven coordinates, an application URI would be of the format maven://<groupId>:<artifactId>:<version>.
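As an illustration, an application can be registered from the Shell with the app register command; the URI below reuses the placeholder coordinates above rather than pointing at a specific released artifact:
dataflow:>app register --name http --type source --uri maven://<groupId>:<artifactId>:<version>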
Shell/CLI: The Shell connects to the Spring Cloud Data Flow Server's REST API and supports a DSL that simplifies the process of defining a stream or task and managing its lifecycle.
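For example, the following Shell command defines a stream named httptest and deploys it in one step, assuming the http and log applications are registered:
dataflow:>stream create --name httptest --definition "http | log" --deploy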
Community Implementations: There are also community-maintained Spring Cloud Data Flow implementations that are currently based on the 1.7.x series of Spring Cloud Data Flow.
The Apache YARN implementation has reached end-of-life status. Let us know on Gitter if you are interested in forking the project to continue developing and maintaining it.
Building: Clone the repo and type
$ ./mvnw clean install
Looking for more information? Follow this link.
When using Git on Windows to check out the project, it is important to handle line endings correctly during checkouts. By default, Git changes line endings to CRLF during checkout. This is, however, not desired for Spring Cloud Data Flow, as it may lead to test failures under Windows. Therefore, please ensure that you set the Git property core.autocrlf to false, e.g. using: $ git config core.autocrlf false.
For more information, please refer to the Git documentation, Formatting and Whitespace.
We welcome contributions! Follow this link for more information on how to contribute.
- The directory ./src/eclipse has two files for use with code formatting: eclipse-code-formatter.xml for the majority of the code formatting rules and eclipse.importorder to order the import statements.
- In Eclipse, you import these files by navigating to Window -> Preferences and then the menu items Preferences > Java > Code Style > Formatter and Preferences > Java > Code Style > Organize Imports, respectively.
- In IntelliJ, install the Eclipse Code Formatter plugin. You can find it by searching "Browse Repositories" under the plugin options within IntelliJ (once installed, you will need to restart IntelliJ for it to take effect). Then navigate to IntelliJ IDEA > Preferences and select the Eclipse Code Formatter. Select the eclipse-code-formatter.xml file for the field Eclipse Java Formatter config file and the eclipse.importorder file for the field Import order. Enable the Eclipse code formatter by selecting Use the Eclipse code formatter, then click the OK button. NOTE: If you configure the Eclipse Code Formatter from File > Other Settings > Default Settings, it will set this policy across all of your IntelliJ projects.
Spring Cloud Data Flow is Open Source software released under the Apache 2.0 license.