In this tutorial, we’ll explore how Tabsdata enables Pub/Sub for Tables, using a locally hosted PostgreSQL database as both the data source and the destination.
We will start by setting up Tabsdata and PostgreSQL. Then we will register and run a publisher that reads data from PostgreSQL and publishes it as a table in Tabsdata.
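The publisher step can be sketched as follows. This is a minimal simulation, not the Tabsdata API: `sqlite3` stands in for the locally hosted PostgreSQL database, the `TABLE_STORE` dict plays the role of the Tabsdata server, and the `persons` table with its columns is purely illustrative.

```python
import sqlite3

# Stand-in for the Tabsdata server's table store (illustrative only).
TABLE_STORE = {}

def publish_persons(conn: sqlite3.Connection) -> None:
    """Read rows from the source database and publish them as a named table."""
    rows = conn.execute("SELECT name, country FROM persons").fetchall()
    # Publishing makes the data available to any subscriber of "persons".
    TABLE_STORE["persons"] = rows

# Seed an in-memory demo database standing in for PostgreSQL.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE persons (name TEXT, country TEXT)")
conn.executemany("INSERT INTO persons VALUES (?, ?)",
                 [("Ada", "UK"), ("Grace", "US")])

publish_persons(conn)
print(TABLE_STORE["persons"])  # → [('Ada', 'UK'), ('Grace', 'US')]
```

In the real system, registering the publisher tells the Tabsdata server where the source lives and which table name the published data is exposed under.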
Following that, we will register a subscriber that subscribes to this published table, filters some of the data, and exports it to PostgreSQL. We will then demonstrate that when the publisher is re-run to load new data, the subscriber automatically writes it to PostgreSQL.
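The subscriber step and the automatic re-run behavior can be sketched together. Again, this is a self-contained simulation under stated assumptions rather than Tabsdata code: `sqlite3` stands in for both PostgreSQL databases, the `PUBLISHED` dict and `SUBSCRIBERS` list mimic the server's role of triggering subscribers when a table is republished, and the filter (keeping only UK rows) is illustrative.

```python
import sqlite3

PUBLISHED = {}      # stand-in for tables held by the Tabsdata server
SUBSCRIBERS = []    # callbacks the "server" triggers on each publish

def publish(conn: sqlite3.Connection, table: str) -> None:
    """Publish the source rows; downstream subscribers run automatically."""
    PUBLISHED[table] = conn.execute(f"SELECT name, country FROM {table}").fetchall()
    for callback in SUBSCRIBERS:
        callback(table)

def subscribe_persons(dest: sqlite3.Connection, table: str) -> None:
    """Filter the published table and export the result to the destination."""
    if table != "persons":
        return
    kept = [r for r in PUBLISHED["persons"] if r[1] == "UK"]  # illustrative filter
    dest.execute("DELETE FROM persons_uk")
    dest.executemany("INSERT INTO persons_uk VALUES (?, ?)", kept)

source = sqlite3.connect(":memory:")  # stand-in for the PostgreSQL source
dest = sqlite3.connect(":memory:")    # stand-in for the PostgreSQL destination
source.execute("CREATE TABLE persons (name TEXT, country TEXT)")
dest.execute("CREATE TABLE persons_uk (name TEXT, country TEXT)")
SUBSCRIBERS.append(lambda t: subscribe_persons(dest, t))

source.executemany("INSERT INTO persons VALUES (?, ?)",
                   [("Ada", "UK"), ("Grace", "US")])
publish(source, "persons")
print(dest.execute("SELECT * FROM persons_uk").fetchall())  # → [('Ada', 'UK')]

# Re-running the publisher with new data updates the destination automatically.
source.execute("INSERT INTO persons VALUES ('Alan', 'UK')")
publish(source, "persons")
print(dest.execute("SELECT * FROM persons_uk").fetchall())  # → [('Ada', 'UK'), ('Alan', 'UK')]
```

The key design point mirrored here is that the subscriber is not scheduled separately: republishing the table it depends on is what causes it to run and refresh the destination.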
In a real-world scenario, your data source could be any other database or storage location, while the subscriber could write data to various endpoints such as a database or file system. You can find the full list of source and destination connectors in the Tabsdata documentation.