6 changes: 6 additions & 0 deletions docs/website/docs/dlt-ecosystem/destinations/athena.md
@@ -8,6 +8,12 @@ keywords: [aws, athena, glue catalog]

The Athena destination stores data as Parquet files in S3 buckets and creates [external tables in AWS Athena](https://docs.aws.amazon.com/athena/latest/ug/creating-tables.html). You can then query those tables with Athena SQL commands, which scan the whole folder of Parquet files and return the results. This destination works very similarly to other SQL-based destinations, except that the merge write disposition is not supported at this time. dlt metadata will be stored in the same bucket as the Parquet files, but as Iceberg tables. Athena additionally supports writing individual data tables as Iceberg tables, so they may be manipulated later; a common use case is to strip GDPR data from them.
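
As a quick orientation, here is a minimal sketch of loading into Athena with the standard `dlt.pipeline` API. It assumes the S3 bucket and Athena credentials are already configured as described in the setup guide below; the pipeline name, dataset name, and sample rows are illustrative:

```py
import dlt

# Sketch only: assumes S3 and Athena credentials in .dlt/secrets.toml.
# Athena queries the Parquet files that dlt stages in S3, hence the
# "filesystem" staging destination.
pipeline = dlt.pipeline(
    pipeline_name="athena_example",
    destination="athena",
    staging="filesystem",
    dataset_name="example_data",
)

# Load a couple of illustrative rows into an external table called "events".
load_info = pipeline.run(
    [{"id": 1, "region": "eu"}, {"id": 2, "region": "us"}],
    table_name="events",
)
print(load_info)
```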

## Install dlt with Athena
**To install the dlt library with Athena dependencies:**
```sh
pip install "dlt[athena]"
```

## Setup Guide
### 1. Initialize the dlt project

6 changes: 6 additions & 0 deletions docs/website/docs/dlt-ecosystem/destinations/bigquery.md
@@ -6,6 +6,12 @@ keywords: [bigquery, destination, data warehouse]

# Google BigQuery

## Install dlt with BigQuery
**To install the dlt library with BigQuery dependencies:**
```sh
pip install "dlt[bigquery]"
```

## Setup Guide

**1. Initialize a project with a pipeline that loads to BigQuery by running**
6 changes: 6 additions & 0 deletions docs/website/docs/dlt-ecosystem/destinations/duckdb.md
@@ -6,6 +6,12 @@ keywords: [duckdb, destination, data warehouse]

# DuckDB

## Install dlt with DuckDB
**To install the dlt library with DuckDB dependencies:**
```sh
pip install "dlt[duckdb]"
```
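
Since DuckDB needs no external credentials, a minimal pipeline runs as-is. This is a sketch; the pipeline name, dataset name, and sample rows are illustrative:

```py
import dlt

# Sketch only: loads two rows into a local DuckDB database file created
# next to the script (duckdb_example.duckdb); no credentials required.
pipeline = dlt.pipeline(
    pipeline_name="duckdb_example",
    destination="duckdb",
    dataset_name="example_data",
)

load_info = pipeline.run(
    [{"id": 1, "name": "alice"}, {"id": 2, "name": "bob"}],
    table_name="users",
)
print(load_info)
```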

## Setup Guide

**1. Initialize a project with a pipeline that loads to DuckDB by running**
31 changes: 20 additions & 11 deletions docs/website/docs/dlt-ecosystem/destinations/filesystem.md
@@ -3,7 +3,27 @@ Filesystem destination stores data in remote file systems and bucket storages li

> 💡 Please read the notes on the layout of the data files; we are currently gathering feedback on it. Please join our Slack (icon at the top of the page) and help us find the optimal layout.

## Install dlt with filesystem
**To install the dlt library with filesystem dependencies:**
```sh
pip install "dlt[filesystem]"
```

This installs the `s3fs` and `botocore` packages.

:::caution

You may also install the dependencies independently:
```sh
pip install dlt
pip install s3fs
```
Installing them separately keeps pip from failing on backtracking.
:::
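
With the dependencies in place, a minimal filesystem pipeline looks like the sketch below. It assumes a `bucket_url` (and, for S3, credentials) is configured in `.dlt/config.toml` and `.dlt/secrets.toml` as shown in the setup guide; the names and rows are illustrative:

```py
import dlt

# Sketch only: assumes [destination.filesystem] bucket_url is set in
# .dlt/config.toml and S3 credentials are set in .dlt/secrets.toml.
pipeline = dlt.pipeline(
    pipeline_name="filesystem_example",
    destination="filesystem",
    dataset_name="example_data",
)

# Each run writes the rows as files under the configured bucket and layout.
load_info = pipeline.run(
    [{"id": 1, "name": "alice"}, {"id": 2, "name": "bob"}],
    table_name="users",
)
print(load_info)
```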

## Setup Guide

### 1. Initialize the dlt project

Let's start by initializing a new dlt project as follows:
@@ -19,17 +39,6 @@ The command above creates sample `secrets.toml` and requirements file for AWS S3 bucket.
```sh
pip install -r requirements.txt
```
or with `pip install dlt[filesystem]` which will install `s3fs` and `botocore` packages.
:::caution

You may also install the dependencies independently
try
```sh
pip install dlt
pip install s3fs
```
so pip does not fail on backtracking
:::

To edit the `dlt` credentials file with your secret info, open `.dlt/secrets.toml`, which looks like this:
```toml
7 changes: 6 additions & 1 deletion docs/website/docs/dlt-ecosystem/destinations/motherduck.md
@@ -5,9 +5,14 @@ keywords: [MotherDuck, duckdb, destination, data warehouse]
---

# MotherDuck

> 🧪 MotherDuck is still invitation-only and being intensively tested. Please see the limitations and problems at the end.

## Install dlt with MotherDuck
**To install the dlt library with MotherDuck dependencies:**
```sh
pip install "dlt[motherduck]"
```

:::tip
If you see a lot of retries in your logs with various timeout errors, decrease the number of load workers to 3-5, depending on the quality of your internet connection. Add the following to your `config.toml`:
```toml
6 changes: 6 additions & 0 deletions docs/website/docs/dlt-ecosystem/destinations/mssql.md
@@ -6,6 +6,12 @@ keywords: [mssql, sqlserver, destination, data warehouse]

# Microsoft SQL Server

## Install dlt with MS SQL
**To install the dlt library with MS SQL dependencies:**
```sh
pip install "dlt[mssql]"
```

## Setup guide

### Prerequisites
6 changes: 6 additions & 0 deletions docs/website/docs/dlt-ecosystem/destinations/postgres.md
@@ -6,6 +6,12 @@ keywords: [postgres, destination, data warehouse]

# Postgres

## Install dlt with PostgreSQL
**To install the dlt library with PostgreSQL dependencies:**
```sh
pip install "dlt[postgres]"
```

## Setup Guide

**1. Initialize a project with a pipeline that loads to Postgres by running**
6 changes: 6 additions & 0 deletions docs/website/docs/dlt-ecosystem/destinations/redshift.md
@@ -6,6 +6,12 @@ keywords: [redshift, destination, data warehouse]

# Amazon Redshift

## Install dlt with Redshift
**To install the dlt library with Redshift dependencies:**
```sh
pip install "dlt[redshift]"
```

## Setup Guide
### 1. Initialize the dlt project

6 changes: 6 additions & 0 deletions docs/website/docs/dlt-ecosystem/destinations/snowflake.md
@@ -6,6 +6,12 @@ keywords: [Snowflake, destination, data warehouse]

# Snowflake

## Install dlt with Snowflake
**To install the dlt library with Snowflake dependencies:**
```sh
pip install "dlt[snowflake]"
```

## Setup Guide

**1. Initialize a project with a pipeline that loads to Snowflake by running**