Android Library for Async Data Loading and Caching
Updated Dec 6, 2019 - Java
A flow-based, event-driven, extensible, reactive, lightweight rules engine.
The Frank!Framework is an easy-to-use, stateless integration framework which allows (transactional) messages to be modified and exchanged between different systems.
Asakusa Framework
Röda: A stream-oriented scripting language
The Draco Generic Enabler is an alternative data persistence mechanism for managing the history of context. It is based on Apache NiFi, a dataflow system built on the concepts of flow-based programming. It supports powerful and scalable directed graphs of data routing, transformation, and system mediation logic, and also offers an intuitive…
store2store helps you keep your data in sync between different sources.
Synchronize Realm with another Store. Realm implementation for the store2store library.
Dataflow programming for Java - High performance (parallel) execution of tasks according to their dependencies
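The dependency-driven execution described above can be sketched with the JDK's own `CompletableFuture`: independent tasks run in parallel, and a dependent task starts only once all of its inputs have completed. This is a minimal illustration of the general technique, not the API of the library listed here; the task names and values are made up.

```java
import java.util.concurrent.CompletableFuture;

public class DataflowSketch {
    public static void main(String[] args) {
        // Two independent tasks: both may run in parallel on the common pool.
        CompletableFuture<Integer> load  = CompletableFuture.supplyAsync(() -> 21);
        CompletableFuture<Integer> clean = CompletableFuture.supplyAsync(() -> 2);

        // 'combine' depends on both: it is scheduled only after load and clean finish.
        CompletableFuture<Integer> combine = load.thenCombine(clean, (a, b) -> a * b);

        System.out.println(combine.join()); // prints 42
    }
}
```

A dedicated dataflow library generalizes this idea to arbitrary task graphs and scheduling policies, but the core contract is the same: declare dependencies, let the runtime extract the parallelism.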
Invoke a Python-trained TensorFlow model from Java programs for real-time image recognition, and deploy the application to a Spring Cloud Data Flow local server.
Kafka consumer application which reads messages from Kafka and enriches the data based on the schema registered in a schema registry. This is a prototype of how to efficiently evolve a schema and read messages with BACKWARD compatibility without breaking the consumers. This application can be used as a base for a stateless stream p…
A series of NiFi processors that facilitate ingestion of data into NileDB Core platform.
Data Insight Engine System is a comprehensive data analytics tool that helps users collect, analyze, and display data in a straightforward and interactive manner, aiding them in making informed decisions.
Simple business flow or data flow for business orchestration.
Scalable directed graphs of data routing, transformation, and system mediation logic in a single library.
Kafka producer application which publishes messages to Kafka and registers the schema of the message in a schema registry
Application which reads JSON data from Kafka. This application is a prototype of how to consume JSON messages from Kafka and persist them.
Application which posts JSON data to Kafka. This application is a prototype of how to push JSON messages to Kafka through a REST endpoint.
The Data Pulse pipeline processes and transforms web-scraped pageviews using Apache Beam and Google Cloud Dataflow. It reads JSON lines, parses them into PageView objects, filters for "product" post types, enriches with country info, and writes to Google BigQuery. Robust logging and error handling ensure data integrity.
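The filter-and-enrich stage of such a pipeline can be sketched in plain Java, independent of the Beam runner. Everything here is hypothetical: the `PageView` record stands in for the pipeline's parsed object, and the `COUNTRIES` map stands in for whatever enrichment source the real pipeline uses.

```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class PageViewSketch {
    // Hypothetical record standing in for the pipeline's parsed PageView object.
    record PageView(String postType, String countryCode) {}

    // Illustrative lookup table; the real enrichment source is not shown in the description.
    static final Map<String, String> COUNTRIES =
            Map.of("NL", "Netherlands", "US", "United States");

    // Keep only "product" post types, then enrich each with its country name.
    static List<String> enrichProducts(List<PageView> views) {
        return views.stream()
                .filter(v -> "product".equals(v.postType()))
                .map(v -> COUNTRIES.getOrDefault(v.countryCode(), "Unknown"))
                .collect(Collectors.toList());
    }

    public static void main(String[] args) {
        List<PageView> input = List.of(
                new PageView("product", "NL"),
                new PageView("article", "US"));
        System.out.println(enrichProducts(input)); // [Netherlands]
    }
}
```

In the actual Beam pipeline each of these steps would be a `PTransform` over a `PCollection`, but the per-element logic is the same shape.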