☀️ A tool for validating data using JSON Schema and converting JSON Schema documents into different data-interchange formats
Avrotize is a command-line tool for converting data structure definitions between different schema formats, using Apache Avro Schema as the integration schema model.
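To illustrate the pivot-model idea, here is a minimal sketch that maps an Avro record schema to a JSON Schema document. The type mapping and function are illustrative assumptions, not Avrotize's actual implementation:

```python
import json

# A minimal Avro record schema (per the Avro specification).
avro_schema = {
    "type": "record",
    "name": "User",
    "fields": [
        {"name": "id", "type": "long"},
        {"name": "email", "type": "string"},
        {"name": "active", "type": "boolean"},
    ],
}

# Hypothetical mapping of Avro primitive types to JSON Schema types.
AVRO_TO_JSON = {"long": "integer", "int": "integer", "string": "string",
                "boolean": "boolean", "double": "number", "float": "number"}

def avro_record_to_json_schema(schema):
    """Convert a flat Avro record schema into a JSON Schema object."""
    return {
        "type": "object",
        "title": schema["name"],
        "properties": {f["name"]: {"type": AVRO_TO_JSON[f["type"]]}
                       for f in schema["fields"]},
        "required": [f["name"] for f in schema["fields"]],
    }

print(json.dumps(avro_record_to_json_schema(avro_schema), indent=2))
```

A real converter would also handle unions, logical types, enums, arrays, and maps; this only shows the shape of the transformation.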
This library can convert a Pydantic class to an Avro schema or generate Python code from an Avro schema.
A pure-Python Avro schema validator
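A toy recursive validator shows what such a tool does at its core: check a Python value against a subset of the Avro schema language (primitives, records, arrays). This is a sketch for illustration, not the repository's code:

```python
# Checks for Avro primitive types. Note bool is excluded from int/long,
# since bool is a subclass of int in Python.
PRIMITIVE_CHECKS = {
    "null": lambda v: v is None,
    "boolean": lambda v: isinstance(v, bool),
    "int": lambda v: isinstance(v, int) and not isinstance(v, bool),
    "long": lambda v: isinstance(v, int) and not isinstance(v, bool),
    "float": lambda v: isinstance(v, (int, float)),
    "double": lambda v: isinstance(v, (int, float)),
    "string": lambda v: isinstance(v, str),
}

def validate(datum, schema):
    """Recursively validate datum against a simplified Avro schema."""
    if isinstance(schema, str):  # primitive type name
        return PRIMITIVE_CHECKS[schema](datum)
    if schema["type"] == "record":
        return (isinstance(datum, dict) and
                all(validate(datum.get(f["name"]), f["type"])
                    for f in schema["fields"]))
    if schema["type"] == "array":
        return isinstance(datum, list) and all(
            validate(item, schema["items"]) for item in datum)
    raise ValueError(f"unsupported schema: {schema!r}")

schema = {"type": "record", "name": "Point",
          "fields": [{"name": "x", "type": "double"},
                     {"name": "y", "type": "double"}]}
print(validate({"x": 1.0, "y": 2.5}, schema))    # True
print(validate({"x": 1.0, "y": "oops"}, schema)) # False
```

A full validator would also cover unions, enums, maps, fixed, and logical types as defined in the Avro specification.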
☀️ Avro, Protobuf, Thrift on Swagger
A comprehensive healthcare management system that provides a REST API for accessing ICU patient data and medical appointments. The system leverages AWS Bedrock for AI-powered clinical assistance and real-time alert generation for critical patient conditions.
A Python ORM for Avro Schemas
An automated and extensible space-efficiency benchmark of JSON-compatible serialization specifications
Async Python REST client for interacting with a Confluent Schema Registry server
Visualize raw data from .avro files produced by the Empatica EmbracePlus device
Package defining schemas relevant to astronomy and astrophysics data, providing useful interfaces for interacting with those schemas.
Benchmarking Avro libraries in Python 3
Use your Avro schema to identify which records require unnesting, and convert them into dynamic Flink SQL statements.
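The core idea can be sketched as a walk over the schema that collects nested record fields and emits a flattened SELECT. The table name, field names, and SQL shape here are illustrative assumptions, not the repository's output format:

```python
# Hypothetical sketch: recursively collect dotted paths for every leaf
# field of an Avro record schema, descending into nested records.
def flatten_fields(schema, prefix=""):
    paths = []
    for f in schema["fields"]:
        t = f["type"]
        name = f"{prefix}{f['name']}"
        if isinstance(t, dict) and t.get("type") == "record":
            paths.extend(flatten_fields(t, prefix=name + "."))
        else:
            paths.append(name)
    return paths

schema = {"type": "record", "name": "Order", "fields": [
    {"name": "id", "type": "long"},
    {"name": "customer", "type": {"type": "record", "name": "Customer",
        "fields": [{"name": "name", "type": "string"},
                   {"name": "city", "type": "string"}]}},
]}

# Build a Flink SQL SELECT that unnests the record via dotted access,
# using backtick-quoted identifiers and underscore-joined aliases.
cols = ", ".join(f"`{p.replace('.', '`.`')}` AS {p.replace('.', '_')}"
                 for p in flatten_fields(schema))
print(f"SELECT {cols} FROM orders")
```

Arrays and maps need different treatment in Flink SQL (e.g. `UNNEST` or `CROSS JOIN`), which this sketch deliberately omits.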
Generate Apache Avro schemas for Pydantic data models.
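The essential mechanism is mapping Python type annotations to Avro types. The sketch below uses a stdlib dataclass in place of a Pydantic model so it runs without dependencies; the mapping table and function names are illustrative, not the library's API:

```python
from dataclasses import dataclass, fields

# Assumed mapping from Python annotation types to Avro primitive types.
PY_TO_AVRO = {int: "long", float: "double", str: "string", bool: "boolean"}

def dataclass_to_avro(cls):
    """Derive a flat Avro record schema from a dataclass's annotations."""
    return {"type": "record", "name": cls.__name__,
            "fields": [{"name": f.name, "type": PY_TO_AVRO[f.type]}
                       for f in fields(cls)]}

@dataclass
class Sensor:
    id: int
    label: str
    reading: float

print(dataclass_to_avro(Sensor))
```

A real generator must also handle `Optional` (as a union with `"null"`), lists, nested models, defaults, and logical types such as timestamps.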
CLI tool to generate HTML documentation for an Apache Avro schema
Avro schema v1.8.2 (https://avro.apache.org/docs/1.8.2/spec.html) formatting with marshmallow (work in progress)
The project showcases a data pipeline integrating Kafka messaging and MongoDB for efficient logistics data ingestion. It uses Python for Kafka producer/consumer operations and MongoDB for data storage.
This repo contains details about a real-time streaming implementation using Confluent Cloud Kafka as the source and MongoDB as the sink.
This project demonstrates a real-time Change Data Capture pipeline. It captures data changes (Inserts, Updates, Deletes) from a MySQL source and streams them into PostgreSQL using Apache Kafka, Debezium, and Avro serialization.