This repo contains resources for deploying a development Airflow instance with a SQLite metadata backend using Docker Compose. The `./airflow/docker-compose.yaml` file is a modified version of the official Airflow `docker-compose.yaml` and is based on instructions from the official Airflow docs for Running Airflow in Docker.
Note: These resources are based on Airflow version 2.8.0. However, I imagine the instructions apply to all 2.x.x versions with minor modifications.
- The resources in this repo are purely for testing and development purposes and should not be used for production.
- You can only use `SequentialExecutor` with SQLite. If you require a different executor, you should refer to the official Airflow docs for Running Airflow in Docker.
- Clone the repo. For example: `git clone https://github.com/kingsabru/airflow-sqlite-docker-compose.git`
- Run the `init.sh` script to create the supporting folders and the `.env` file:
  - Make `init.sh` executable: `chmod +x init.sh`
  - Run `init.sh`: `./init.sh`
- Start the Docker containers: `docker compose up -d` (a quick health check is sketched after this list).
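Once the containers are up, you can sanity-check the webserver from the host. A minimal sketch using the third-party `requests` package against the webserver's `/health` endpoint (both are assumptions, not part of this repo):

```python
import requests  # third-party: pip install requests

# Query the webserver's unauthenticated health endpoint (default port 8080).
resp = requests.get("http://localhost:8080/health", timeout=5)

# Expect HTTP 200 and a JSON body reporting metadatabase/scheduler status.
print(resp.status_code, resp.json())
```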
- Update `docker-compose.yaml`: comment out the line `image: ${AIRFLOW_IMAGE_NAME:-apache/airflow:2.8.0}` and uncomment the line `build: .`
- (Optional) Update the `Dockerfile` as necessary. A common change is to modify the Airflow version.
- (Optional) Add custom Python packages to the `requirements.txt` file. Airflow plugins hosted on PyPI can also be added here (see the sketch after this list).
- Build a custom Docker image: `docker compose build`
- Start the Docker containers: `docker compose up -d`. Alternatively, you can build and start the containers in one step: `docker compose up --build -d`
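As a sketch, extra dependencies in `requirements.txt` might look like the following; the packages and versions are illustrative, not shipped with this repo:

```
# One distribution per line; pinning versions keeps image builds reproducible.
pandas==2.1.4
clickhouse-driver==0.2.6
```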
- Go to http://localhost:8080
- Log in with the username and password. The default username and password are `airflow` and `airflow`, respectively.
- Go to Admin > Connections in the menu bar.
- Click the `+` icon to add a new record.
- For `Connection Type`, choose the type of connection from the drop-down.
- Enter a `Connection Id` value. This will be the name of the connection. Example: `clickhouse_conn` (used in the sketch after this list).
- Fill in the rest of the fields as necessary.
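Once saved, a connection can be looked up from DAG or task code by its `Connection Id`. A minimal sketch using Airflow's stock `BaseHook` API, assuming the hypothetical `clickhouse_conn` connection from the example above (this must run where the Airflow metadata DB is reachable, e.g. inside a task):

```python
from airflow.hooks.base import BaseHook

# Fetch the connection stored under Admin > Connections by its Connection Id.
conn = BaseHook.get_connection("clickhouse_conn")

# The attributes mirror the fields filled in on the connection form.
print(conn.conn_type, conn.host, conn.port, conn.login)
```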
New DAGs should be created in the `./dags` folder. Simply create a `.py` file and write the logic for the DAG; the DAG will appear on the Airflow UI homepage shortly after the file is saved.
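For instance, a minimal sketch of a DAG file; the file name (`./dags/hello_dag.py`), `dag_id`, and schedule below are illustrative, not part of this repo:

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def say_hello():
    # Toy task body; replace with real work.
    print("Hello from Airflow!")


# Saved as ./dags/hello_dag.py, this DAG appears on the UI homepage
# once the scheduler re-scans the ./dags folder.
with DAG(
    dag_id="hello_dag",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="say_hello", python_callable=say_hello)
```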
- Destroy the Docker containers: `docker compose down --volumes --remove-orphans`
- Delete the `./db/airflow.db` file.
- Build the custom Airflow image: `docker compose build`
- Spin up the Docker containers: `docker compose up -d`
- Stop the Docker containers: `docker compose stop`
- Start stopped Docker containers: `docker compose start`
- Destroy the Docker containers: `docker compose down --volumes --remove-orphans`
If you have any suggestions or bug reports, or want to contribute to this project, feel free to open issues or pull requests in this GitHub repository.
This project is licensed under the MIT License.