[Plugin] airflow-postgres-csv — Airflow 3 operators for bulk PostgreSQL ↔ CSV transfers #62450
Redevil10 asked this question in Show and tell (unanswered)
Replies: 1 comment

> You can submit it to https://airflow.apache.org/ecosystem/
Hi everyone 👋
I built a small pip-installable package that provides two operators for bulk PostgreSQL ↔ CSV data transfers using PostgreSQL's native `COPY` command:

- `PostgresToCsvOperator` — run a SQL query and stream results to a CSV file
- `CsvToPostgresOperator` — load a CSV file into a PostgreSQL table

🔗 Repo: https://github.com/Redevil10/airflow-postgres-csv
📦 PyPI: https://pypi.org/project/airflow-postgres-csv/
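For context on the mechanics: `COPY` moves data between a table (or query) and a stream in a single round trip, rather than one `INSERT` per row. A sketch of the kind of statements operators like these typically wrap — the exact `WITH (...)` option list used by the package is my assumption, not taken from its source:

```python
# Sketch of the COPY statements a COPY-based CSV operator typically issues.
# The FORMAT/HEADER options shown here are assumptions, not the package's code.

def export_sql(query: str) -> str:
    """COPY (query) TO STDOUT streams the result set out as CSV in one pass."""
    return f"COPY ({query}) TO STDOUT WITH (FORMAT CSV, HEADER)"

def import_sql(table: str) -> str:
    """COPY table FROM STDIN bulk-loads CSV rows far faster than row-by-row INSERTs."""
    return f"COPY {table} FROM STDIN WITH (FORMAT CSV, HEADER)"
```

With psycopg2, statements like these would be passed to `cursor.copy_expert()` together with a file object to stream from or to.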
Why I built this
In my day-to-day work I frequently need to export query results to CSV for downstream consumers, or bulk-load CSVs into staging tables. The existing `PostgresOperator` doesn't natively support CSV I/O, and writing `COPY` boilerplate in every DAG gets repetitive. I wanted something that:

- uses `COPY` for maximum throughput (not row-by-row inserts)
- escapes values safely via `cursor.mogrify`
- can read queries from `.sql` files
- targets `apache-airflow-providers-postgres >= 6.0.0`

Quick example
Features

- `compression="gzip"` on either operator for `.csv.gz` files
- Point `sql` at a `.sql` file and it loads the file automatically
- `csv_file_path`, `table_name`, and `sql` all support Jinja templating

Quality
Install
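The install command itself was lost in extraction; given the PyPI package name above, it is presumably:

```shell
pip install airflow-postgres-csv
```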
This is my first package published to PyPI, so I'm sure there are things I could do better — whether it's packaging, API design, testing approach, or anything else. I'd really appreciate any feedback or suggestions. Happy to accept contributions too!