
Create "db export" and "db import" commands #302

Open
@kostyay

Description

We have an open source project that uses baur (https://github.com/stackpulse/steps).
Our entire build system is based on baur.
Currently our CI is configured not to run builds for forks, because we don't want to give everyone access to the postgres database that holds our baur data.
Ideally I'd like to build forked branches too, but without letting users connect to my postgres db.
My idea to solve this is as follows:

  1. When the master branch builds, one of the steps will export baur's database to a public S3 bucket.
  2. A forked build will set up postgres inside a docker container and import the database dump into that temporary database. This way the build still enjoys the benefits of baur (only changed applications are built).
  3. The same goes for local builds: developers would run a script that copies the baur db dump from the public S3 bucket and sets up postgres in a local docker container.
  4. All of the above I can already do with postgres tooling and my CI configuration (a rough sketch follows this list).
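
For reference, here is a rough sketch of how steps 1-3 look with plain pg_dump/pg_restore, docker, and the AWS CLI. The bucket name, dump file name, and the `BAUR_DB_URL` variable are placeholders, not anything baur defines:

```sh
# Step 1 (master branch build): dump the baur database and publish it.
# BAUR_DB_URL is a placeholder for the postgres connection string.
pg_dump --format=custom "$BAUR_DB_URL" > baur.dump
aws s3 cp baur.dump s3://example-public-baur-dumps/baur.dump

# Steps 2 and 3 (forked CI builds and local developers): start a
# throwaway postgres container and restore the dump into it.
docker run --detach --name baur-db \
  -e POSTGRES_PASSWORD=baur -e POSTGRES_DB=baur \
  -p 5432:5432 postgres
until pg_isready -h localhost -p 5432; do sleep 1; done
aws s3 cp s3://example-public-baur-dumps/baur.dump baur.dump
pg_restore --no-owner \
  --dbname="postgres://postgres:baur@localhost:5432/baur" baur.dump
```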

However, I'm interested in the smallest possible db footprint. I don't care about older build digests; I only want the file digests from the most recent build, without the historical data. This is where I would like a baur db dump feature.
You could run `baur db export all` for a full dump (same as pg_dump, for example) or `baur db export minimal` for only the latest build data + digests. That would be the minimal amount of data required to rebuild only the modified applications.
It could be CSV or JSON or whatever format, not necessarily a postgres dump.
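
Roughly, the proposed commands would be used like this (neither subcommand exists yet; the output formats and file names are only for illustration):

```sh
# Full dump, equivalent to running pg_dump:
baur db export all > baur-full.dump

# Only the file digests of the most recent build per application:
baur db export minimal > baur-minimal.json

# Restore into a fresh (e.g. containerized) database:
baur db import baur-minimal.json
```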

What do you think?
