154 changes: 154 additions & 0 deletions docs/how-tos/my_airflow/use-my-airflow-api.md
@@ -0,0 +1,154 @@
---
title: Access the My Airflow API
sidebar_position: 78
---
# How to use the My Airflow API

:::warning
The API allows you to view secret values in plain text. Always exercise the principle of least privilege.
:::

This guide walks you through configuring API access for your personal My Airflow instance.

### Step 1: Navigate to User Settings

Click on your avatar in the top right corner and select `Settings`.

### Step 2: Select the My Airflow API tab

In the left sidebar, click on `My Airflow API`. You will see a list of environments where you have My Airflow enabled.

![My Airflow API Keys](assets/my_airflow_api_keys.png)

### Step 3: Copy the API URL

For your target environment, copy the `My Airflow API URL`.

### Step 4: Generate an API Key

1. Click "Manage API Keys" to expand the key management section
2. Optionally enter a name for your key (e.g., "My Script", "CI/CD Token")
3. Click "Generate New API Key"
4. Copy your API key immediately; it will not be shown again

### Step 5: Add your credentials to a .env file

Create a `.env` file inside your `orchestrate/` directory and add your credentials to it. Be sure to add the file to your `.gitignore` (a sample entry is shown after the snippet below).

```env
MY_AIRFLOW_API_URL="https://your-slug-api-airflow-env.domain.com/api/v1/"
MY_AIRFLOW_API_KEY="your-api-key-here"
```
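
For example, a `.gitignore` entry like this keeps the credentials out of version control (assuming the `.env` file lives in `orchestrate/`):

```gitignore
# Keep API credentials out of version control
orchestrate/.env
```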

### Step 6: Use it in a Python script

Below is a sample script that makes use of the Airflow API.

**This script does the following:**
- Loads the API URL and key from environment variables using `python-dotenv`.
- Fetches the list of all DAGs from the Airflow API.
- Prints a sorted list of DAG IDs for better readability.
- Triggers a new DAG run for a specified DAG using the API.
- Posts a dataset event to update an Airflow dataset using the API.
- Raises an exception if an API request fails.

```python
# airflow_api_call.py
import json
import os

import requests
from dotenv import load_dotenv

load_dotenv()
# Strip any trailing slash so endpoint paths can be joined cleanly with "/"
API_URL = os.getenv("MY_AIRFLOW_API_URL", "").rstrip("/")
API_KEY = os.getenv("MY_AIRFLOW_API_KEY")

HEADERS = {"Authorization": f"Token {API_KEY}"}


def update_dataset(name):
    """Post a dataset event for the dataset with the given URI."""
    url = f"{API_URL}/datasets/events"

    response = requests.post(
        url=url,
        headers=HEADERS,
        json={"dataset_uri": name},
    )
    response.raise_for_status()
    return response.json()


def trigger_dag(dag_id):
    """Trigger a new DAG run for the given DAG ID."""
    url = f"{API_URL}/dags/{dag_id}/dagRuns"

    response = requests.post(
        url=url,
        headers=HEADERS,
        json={"note": "Trigger from API"},
    )
    response.raise_for_status()
    return response.json()


def list_dags():
    """Return an alphabetically sorted list of all DAG IDs."""
    url = f"{API_URL}/dags"

    response = requests.get(
        url=url,
        headers=HEADERS,
    )
    response.raise_for_status()

    # Extract just the DAG IDs from the response
    dags_data = response.json()
    dag_names = [dag["dag_id"] for dag in dags_data["dags"]]

    # Sort the names alphabetically for better readability
    dag_names.sort()
    return dag_names


def print_response(response):
    if response:
        msg = json.dumps(response, indent=2)
        print(f"Response:\n{'='*30}\n\n{msg}")


if __name__ == "__main__":

    # Update an Airflow Dataset
    # dataset_name = "upstream_data"
    # response = update_dataset(dataset_name)
    # print_response(response)

    # Trigger a DAG
    # dag_id = "bad_variable_usage"
    # response = trigger_dag(dag_id)
    # print_response(response)

    # List DAGs
    response = list_dags()
    print_response(response)
```
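
To try it out, install the script's two dependencies and run it from the directory that contains your `.env` file, so `load_dotenv()` can find it:

```bash
# Install the script's dependencies
pip install requests python-dotenv

# Run from the directory containing the .env file
python airflow_api_call.py
```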

:::note
Your API keys work for both My Airflow and Team Airflow (if you have access). The token is validated based on your user permissions.
:::
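
For a quick smoke test of your key without writing any Python, you can call the `dags` endpoint directly with `curl`. This is a sketch that assumes the two variables from your `.env` file are exported in your shell:

```bash
# List DAGs; ${MY_AIRFLOW_API_URL%/} strips any trailing slash before appending the path
curl -s -H "Authorization: Token $MY_AIRFLOW_API_KEY" \
  "${MY_AIRFLOW_API_URL%/}/dags"
```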

## Alternative: Generate keys via CLI

You can also manage API keys directly from the terminal using the `datacoves` CLI:

```bash
# List existing keys
datacoves my api-key list

# Generate a new key
datacoves my api-key generate --name "My Script"

# Delete a key (use the prefix shown in list)
datacoves my api-key delete abc12345
```

See [Datacoves CLI Commands](/reference/airflow/datacoves-commands.md#datacoves-my-api-key) for more details.
39 changes: 39 additions & 0 deletions docs/reference/airflow/datacoves-commands.md
@@ -14,6 +14,7 @@ The `my` subcommand executes commands for My Airflow.
Currently, the `datacoves my` subcommand has the following subcommands:
- `my import`
- `my pytest`
- `my api-key`

### datacoves my import

@@ -38,3 +39,41 @@ This command allows you to run pytest validations straight from the command line
```bash
datacoves my pytest -- orchestrate/test_dags/validate_dags.py
```

### datacoves my api-key

Manage My Airflow API keys from the command line. These keys allow you to access the My Airflow API programmatically.

#### List existing keys

```bash
datacoves my api-key list
```

This displays all environments where My Airflow is enabled, along with the API URL and any existing keys.

#### Generate a new key

```bash
datacoves my api-key generate
```

You can optionally provide a name for the key:

```bash
datacoves my api-key generate --name "My Script"
```

The command will output the API URL and key. Save the key immediately, as it won't be shown again.

#### Delete a key

```bash
datacoves my api-key delete <token-prefix>
```

Use the first 8 characters of the token (shown in the list command) to identify which key to delete.

```bash
datacoves my api-key delete abc12345
```