1 change: 1 addition & 0 deletions content/_redirects
@@ -214,6 +214,7 @@
/constellation/ /workers-ai/ 301

# D1
/d1/changelog/ /d1/platform/changelog/ 301
/d1/client-api/ /d1/build-databases/query-databases/ 301
/d1/learning/using-d1-from-pages/ /pages/functions/bindings/#d1-databases 301
/d1/learning/debug-d1/ /d1/observability/debug-d1/ 301
29 changes: 20 additions & 9 deletions content/d1/build-databases/import-data.md
@@ -40,7 +40,7 @@ insert into users (id, full_name, created_on) values ('01GREFXCNF67KV7FPPSEJVJME
With your `users_export.sql` file in the current working directory, you can pass the `--file=users_export.sql` flag to `d1 execute` to execute (import) your table schema and values:

```sh
$ wrangler d1 execute example-db --file=users_export.sql
$ wrangler d1 execute example-db --remote --file=users_export.sql

🌀 Mapping SQL input into an array of statements
🌀 Parsing 1 statements
@@ -51,7 +51,7 @@ $ wrangler d1 execute example-db --file=users_export.sql
To confirm your table was imported correctly and is queryable, execute a `SELECT` statement against your `users` table directly:

```sh
$ wrangler d1 execute example-db --command "SELECT * FROM users LIMIT 100;"
$ wrangler d1 execute example-db --remote --command "SELECT * FROM users LIMIT 100;"

🌀 Mapping SQL input into an array of statements
🌀 Parsing 1 statements
@@ -92,19 +92,30 @@ For example, if you have a raw SQLite dump called `db_dump.sqlite3`, run the fol
$ sqlite3 db_dump.sqlite3 .dump > db.sql
```

Once you have run the above command, you will need to edit the output SQL file to be compatible with D1:

1. Remove `BEGIN TRANSACTION` and `COMMIT;` from the file (see the sketch after this list)
2. Remove the following table creation statement (if present):
```sql
CREATE TABLE _cf_KV (
key TEXT PRIMARY KEY,
value BLOB
) WITHOUT ROWID;
```
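
For step 1, a minimal sketch (assuming the dump is named `db.sql` and that the statements sit on their own lines, as `sqlite3 .dump` emits them):

```sh
# Write a cleaned copy with the transaction wrapper removed
$ grep -v -e '^BEGIN TRANSACTION;$' -e '^COMMIT;$' db.sql > db_clean.sql
```

The `_cf_KV` statement in step 2 spans multiple lines, so it is easiest to delete by hand in a text editor.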

You can then follow the steps to [import an existing database](#import-an-existing-database) into D1 by using the `.sql` file you generated from the database dump as the input to `wrangler d1 execute`.
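
For example, reusing the `example-db` database from earlier and the cleaned dump:

```sh
$ wrangler d1 execute example-db --remote --file=db_clean.sql
```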

## Troubleshooting

If you receive an error when trying to import an existing schema and/or dataset into D1:

* Ensure you are importing data in SQL format (typically with a `.sql` file extension). See [how to convert SQLite files](#convert-sqlite-database-files) if you have a `.sqlite3` database dump.
* Make sure the schema is [SQLite3](https://www.sqlite.org/docs.html) compatible. You cannot import data from a MySQL or PostgreSQL database into D1, as the types and SQL syntax are not directly compatible.
* If you have foreign key relationships between tables, ensure you are importing the tables in the right order. You can't refer to a table that doesn't yet exist.
* If you get `"cannot start a transaction within a transaction"`, make sure you have removed `BEGIN TRANSACTION` and `COMMIT` from your dumped SQL statements.
- Ensure you are importing data in SQL format (typically with a `.sql` file extension). Refer to [how to convert SQLite files](#convert-sqlite-database-files) if you have a `.sqlite3` database dump.
- Make sure the schema is [SQLite3](https://www.sqlite.org/docs.html) compatible. You cannot import data from a MySQL or PostgreSQL database into D1, as the types and SQL syntax are not directly compatible.
- If you have foreign key relationships between tables, ensure you are importing the tables in the right order. You cannot refer to a table that does not yet exist (see the sketch after this list).
- If you receive a `"cannot start a transaction within a transaction"` error, make sure you have removed `BEGIN TRANSACTION` and `COMMIT` from your dumped SQL statements.
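
For example, in a hypothetical dump where `orders` references `users`, the `users` table must be created (and populated) first:

```sql
-- Parent table first; child tables may then reference it.
CREATE TABLE users (
  id INTEGER PRIMARY KEY,
  email TEXT NOT NULL
);

CREATE TABLE orders (
  id INTEGER PRIMARY KEY,
  user_id INTEGER NOT NULL REFERENCES users(id)
);
```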

## Next Steps

* Read the SQLite [`CREATE TABLE`](https://www.sqlite.org/lang_createtable.html) documentation
* Learn how to [use the D1 client API](/d1/build-databases/query-databases/) from within a Worker
* Understand how [database migrations work](/d1/reference/migrations/) with D1
- Read the SQLite [`CREATE TABLE`](https://www.sqlite.org/lang_createtable.html) documentation.
- Learn how to [use the D1 client API](/d1/build-databases/query-databases/) from within a Worker.
- Understand how [database migrations work](/d1/reference/migrations/) with D1.
10 changes: 5 additions & 5 deletions content/d1/get-started.md
Expand Up @@ -85,8 +85,7 @@ For reference, a good database name is:
- Typically a combination of ASCII characters, shorter than 32 characters, that uses dashes (-) instead of spaces.
- Descriptive of the use case and environment. For example, "staging-db-web" or "production-db-backend".
- Only used for describing the database, and is not directly referenced in code.
{{</Aside>}}

{{</Aside>}}

```sh
$ npx wrangler d1 create prod-d1-tutorial
@@ -159,7 +158,8 @@ Then validate your data is in your database by running:
$ npx wrangler d1 execute prod-d1-tutorial --local --command="SELECT * FROM Customers"
```

You should see the following output:
You should see the following output:

```sh
🌀 Mapping SQL input into an array of statements
🌀 Executing on local database production-db-backend (5f092302-3fbd-4247-a873-bf1afc5150b) from .wrangler/state/v3/d1:
@@ -244,13 +244,13 @@ To deploy your Worker to production, you must first repeat the [database bootstr
First, bootstrap your database with the `schema.sql` file you created in step 4:

```sh
$ npx wrangler d1 execute prod-d1-tutorial --file=./schema.sql
$ npx wrangler d1 execute prod-d1-tutorial --remote --file=./schema.sql
```

Then validate the data is in production by running:

```sh
$ npx wrangler d1 execute prod-d1-tutorial --command="SELECT * FROM Customers"
$ npx wrangler d1 execute prod-d1-tutorial --remote --command="SELECT * FROM Customers"
```

Finally, deploy your Worker to make your project accessible on the Internet. To deploy your Worker, run:
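
With current versions of Wrangler (v3 and later), this is typically:

```sh
$ npx wrangler deploy
```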
7 changes: 4 additions & 3 deletions content/d1/tutorials/build-a-comments-api/index.md
@@ -33,6 +33,7 @@ $ npm install hono
## 2. Initialize your Hono application

In `src/worker.js`, initialize a new Hono application, and define the following endpoints (sketched below):

- `GET /api/posts/:slug/comments`.
- `POST /api/posts/:slug/comments`.
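
A minimal sketch of the two endpoints, assuming a D1 binding named `DB` and hypothetical `author` and `body` columns (the tutorial defines the real schema and handlers in the steps below):

```typescript
import { Hono } from 'hono'

// Assumes a D1 binding named `DB` configured in wrangler.toml ([[d1_databases]])
type Bindings = { DB: D1Database }

const app = new Hono<{ Bindings: Bindings }>()

// List comments for a post
app.get('/api/posts/:slug/comments', async (c) => {
  const { slug } = c.req.param()
  const { results } = await c.env.DB.prepare(
    'SELECT * FROM comments WHERE post_slug = ?'
  )
    .bind(slug)
    .all()
  return c.json(results)
})

// Create a comment on a post (hypothetical `author`/`body` columns)
app.post('/api/posts/:slug/comments', async (c) => {
  const { slug } = c.req.param()
  const { author, body } = await c.req.json()
  await c.env.DB.prepare(
    'INSERT INTO comments (author, body, post_slug) VALUES (?, ?, ?)'
  )
    .bind(author, body, slug)
    .run()
  return c.json({ ok: true }, 201)
})

export default app
```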

@@ -84,7 +85,7 @@ With your binding configured in your `wrangler.toml` file, you can interact with
Interact with D1 by issuing direct SQL commands using `wrangler d1 execute`:

```sh
$ wrangler d1 execute d1-example --command "SELECT name FROM sqlite_schema WHERE type ='table'"
$ wrangler d1 execute d1-example --remote --command "SELECT name FROM sqlite_schema WHERE type ='table'"

Executing on d1-example:

@@ -118,7 +119,7 @@ CREATE INDEX idx_comments_post_slug ON comments (post_slug);
With the file created, execute the schema file against the D1 database by passing it with the flag `--file`:

```sh
$ wrangler d1 execute d1-example --file schemas/schema.sql
$ wrangler d1 execute d1-example --remote --file schemas/schema.sql
```

## 5. Execute SQL
@@ -212,7 +213,7 @@ $ curl https://d1-example.signalnerve.workers.dev/api/posts/hello-world/comments

This application is an API back-end, best suited to use with a front-end UI for creating and viewing comments. To test this back-end with a prebuilt front-end UI, refer to the example UI in the [example-frontend directory](https://github.com/cloudflare/workers-sdk/tree/main/templates/worker-d1-api/example-frontend). Notably, the [`loadComments` and `submitComment` functions](https://github.com/cloudflare/workers-sdk/tree/main/templates/worker-d1-api/example-frontend/src/views/PostView.vue#L57-L82) make requests to a deployed version of this site, meaning you can take the front-end and point its URL at your own deployed version of this tutorial's codebase to use your own data.

Interacting with this API from a front-end will require enabling specific Cross-Origin Resource Sharing (or *CORS*) headers in your back-end API. Hono allows you to enable Cross-Origin Resource Sharing for your application. Import the `cors` module and add it as middleware to your API in `src/worker.js`:
Interacting with this API from a front-end will require enabling specific Cross-Origin Resource Sharing (or _CORS_) headers in your back-end API. Hono allows you to enable Cross-Origin Resource Sharing for your application. Import the `cors` module and add it as middleware to your API in `src/worker.js`:

```typescript
---
@@ -43,8 +43,8 @@ For this guide, set up a basic Worker:

You will be asked if you would like to deploy the project to Cloudflare.

* If you choose to deploy, you will be asked to authenticate (if not logged in already), and your project will be deployed to the Cloudflare global network.
* If you choose not to deploy, go to the newly created project directory to begin writing code. Deploy your project by following the instructions in [step 4](/workers/get-started/guide/#4-deploy-your-project).
- If you choose to deploy, you will be asked to authenticate (if not logged in already), and your project will be deployed to the Cloudflare global network.
- If you choose not to deploy, go to the newly created project directory to begin writing code. Deploy your project by following the instructions in [step 4](/workers/get-started/guide/#4-deploy-your-project).

In your project directory, C3 has generated the following:

@@ -180,13 +180,13 @@ database_id = "abc-def-geh"
In this application, we'll create a `notes` table in D1, which will allow us to store notes and later retrieve them in Vectorize. To create this table, run a SQL command using `wrangler d1 execute`:

```sh
$ npx wrangler d1 execute database --command "CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, text TEXT NOT NULL)"
$ npx wrangler d1 execute database --remote --command "CREATE TABLE IF NOT EXISTS notes (id INTEGER PRIMARY KEY, text TEXT NOT NULL)"
```

Now, we can add a new note to our database using `wrangler d1 execute`:

```sh
$ npx wrangler d1 execute database --command "INSERT INTO notes (text) VALUES ('The best pizza topping is pepperoni')"
$ npx wrangler d1 execute database --remote --command "INSERT INTO notes (text) VALUES ('The best pizza topping is pepperoni')"
```

## 5. Creating notes and adding them to Vectorize
@@ -372,9 +372,9 @@ When pushing to your `*.workers.dev` subdomain for the first time, you may see [

To do more:

* Review Cloudflare's [AI documentation](/workers-ai).
* Review [Tutorials](/workers/tutorials/) to build projects on Workers.
* Explore [Examples](/workers/examples/) to experiment with copy and paste Worker code.
* Understand how Workers works in [Reference](/workers/reference/).
* Learn about Workers features and functionality in [Platform](/workers/platform/).
* Set up [Wrangler](/workers/wrangler/install-and-update/) to programmatically create, test, and deploy your Worker projects.
- Review Cloudflare's [AI documentation](/workers-ai).
- Review [Tutorials](/workers/tutorials/) to build projects on Workers.
- Explore [Examples](/workers/examples/) to experiment with copy and paste Worker code.
- Understand how Workers works in [Reference](/workers/reference/).
- Learn about Workers features and functionality in [Platform](/workers/platform/).
- Set up [Wrangler](/workers/wrangler/install-and-update/) to programmatically create, test, and deploy your Worker projects.
26 changes: 17 additions & 9 deletions content/workers/wrangler/commands.md
@@ -109,7 +109,6 @@ You can add Wrangler commands that you use often as scripts in your project's `p

You can then run them using your package manager of choice:


{{<tabs labels="npm | yarn | pnpm">}}
{{<tab label="npm" default="true">}}

@@ -279,7 +278,7 @@ Execute a query on a D1 database.
wrangler d1 execute <DATABASE_NAME> [OPTIONS]
```

{{<Aside type="note">}}
{{<Aside type="note">}}

You must provide either `--command` or `--file` for this command to run successfully.

@@ -295,8 +294,10 @@ You must provide either `--command` or `--file` for this command to run successf
- Path to the SQL file you wish to execute.
- `-y, --yes` {{<type>}}boolean{{</type>}} {{<prop-meta>}}optional{{</prop-meta>}}
- Answer `yes` to any prompts.
- `--local` {{<type>}}boolean{{</type>}} {{<prop-meta>}}optional{{</prop-meta>}}
- `--local` {{<type>}}boolean{{</type>}} {{<prop-meta>}}(default: true){{</prop-meta>}} {{<prop-meta>}}optional{{</prop-meta>}}
- Execute commands/files against a local database for use with [wrangler dev](#dev).
- `--remote` {{<type>}}boolean{{</type>}} {{<prop-meta>}}(default: false){{</prop-meta>}} {{<prop-meta>}}optional{{</prop-meta>}}
- Execute commands/files against a remote D1 database for use with [wrangler dev --remote](#dev).
- `--persist-to` {{<type>}}string{{</type>}} {{<prop-meta>}}optional{{</prop-meta>}}
- Specify directory to use for local persistence (for use in combination with `--local`).
- `--json` {{<type>}}boolean{{</type>}} {{<prop-meta>}}optional{{</prop-meta>}}
@@ -484,8 +485,10 @@ wrangler d1 migrations apply <DATABASE_NAME> [OPTIONS]

- `DATABASE_NAME` {{<type>}}string{{</type>}} {{<prop-meta>}}required{{</prop-meta>}}
- The name of the D1 database you wish to apply your migrations on.
- `--local` {{<type>}}boolean{{</type>}} {{<prop-meta>}}optional{{</prop-meta>}}
- `--local` {{<type>}}boolean{{</type>}} {{<prop-meta>}}(default: true){{</prop-meta>}} {{<prop-meta>}}optional{{</prop-meta>}}
- Execute any unapplied migrations on your locally persisted D1 database.
- `--remote` {{<type>}}boolean{{</type>}} {{<prop-meta>}}(default: false){{</prop-meta>}} {{<prop-meta>}}optional{{</prop-meta>}}
- Execute any unapplied migrations on your remote D1 database.
- `--persist-to` {{<type>}}string{{</type>}} {{<prop-meta>}}optional{{</prop-meta>}}
- Specify directory to use for local persistence (for use in combination with `--local`).
- `--preview` {{<type>}}boolean{{</type>}} {{<prop-meta>}}optional{{</prop-meta>}}
@@ -628,6 +631,7 @@ wrangler vectorize get <INDEX_NAME>
- The name of the index to fetch details for.

{{</definitions>}}

### `list`

List all Vectorize indexes in your account, including the configured dimensions and distance metric.
@@ -1452,7 +1456,7 @@ wrangler r2 bucket sippy get <NAME>

- `NAME` {{<type>}}string{{</type>}} {{<prop-meta>}}required{{</prop-meta>}}
- The name of the R2 bucket to get the status of Sippy.

{{</definitions>}}

---
@@ -1516,7 +1520,7 @@ wrangler r2 object put <OBJECT_PATH> [OPTIONS]
- Interact with locally persisted data.
- `--persist-to` {{<type>}}string{{</type>}} {{<prop-meta>}}optional{{</prop-meta>}}
- Specify directory for locally persisted data.
{{</definitions>}}
{{</definitions>}}

### `delete`

@@ -1592,9 +1596,11 @@ wrangler secret delete <KEY> [OPTIONS]
{{<definitions>}}

- `KEY` {{<type>}}string{{</type>}} {{<prop-meta>}}required{{</prop-meta>}}

- The variable name for this secret to be accessed in the Worker.

- `--name` {{<type>}}string{{</type>}} {{<prop-meta>}}optional{{</prop-meta>}}

- Perform on a specific Worker rather than inheriting from `wrangler.toml`.

- `--env` {{<type>}}string{{</type>}} {{<prop-meta>}}optional{{</prop-meta>}}
@@ -1647,15 +1653,17 @@ wrangler secret:bulk [<FILENAME>] [OPTIONS]
{{<definitions>}}

- `FILENAME` {{<type>}}string{{</type>}} {{<prop-meta>}}optional{{</prop-meta>}}

- The JSON file containing key-value pairs to upload as secrets, in the form `{"SECRET_NAME": "secret value", ...}`.
- If omitted, Wrangler expects to receive input from `stdin` rather than a file.

- `--name` {{<type>}}string{{</type>}} {{<prop-meta>}}optional{{</prop-meta>}}

- Perform on a specific Worker rather than inheriting from `wrangler.toml`.

- `--env` {{<type>}}string{{</type>}} {{<prop-meta>}}optional{{</prop-meta>}}
- Perform on a specific environment.

{{</definitions>}}

The following is an example of uploading secrets from a JSON file redirected to `stdin`. When complete, the output summary will show the number of secrets uploaded and the number of secrets that failed to upload.
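
A minimal invocation of that pattern (assuming a `secrets.json` file in the working directory) might be:

```sh
$ npx wrangler secret:bulk < secrets.json
```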
@@ -1810,7 +1818,7 @@ wrangler pages project delete <PROJECT_NAME> [OPTIONS]
- Answer `"yes"` to confirmation prompt.

{{</definitions>}}

### `deployment list`

List deployments in your Cloudflare Pages project.
@@ -2400,6 +2408,7 @@ wrangler types [<PATH>] [OPTIONS]
{{<definitions>}}

- `PATH` {{<type>}}string{{</type>}} {{<prop-meta>}}(default: `worker-configuration.d.ts`){{</prop-meta>}}

- The path to where the declaration file for your Worker will be written.
- The path to the declaration file must have a `d.ts` extension.

@@ -2409,5 +2418,4 @@

{{</definitions>}}


<!--TODO Add examples of DTS generated output -->
21 changes: 14 additions & 7 deletions data/changelogs/d1.yaml
@@ -2,13 +2,20 @@
link: "/d1/platform/changelog/"
productName: D1
entries:
- publish_date: "2024-03-12"
title: Change in `wrangler d1 execute` default
description: |-
As of `wrangler@3.33.0`, `wrangler d1 execute` and `wrangler d1 migrations apply` now default to using a local database, to match the default behavior of `wrangler dev`.

It is also now possible to specify one of `--local` or `--remote` to explicitly tell Wrangler which environment you wish to run your commands against.

- publish_date: "2024-03-05"
title: Billing for D1 usage
description: |-
As of 2024-03-05, D1 usage will start to be counted and may incur charges for an account's future billing cycle.

Developers on the Workers Paid plan with D1 usage beyond [included limits](/d1/platform/pricing/#billing-metrics) will incur charges according to [D1's pricing](/d1/platform/pricing).

Developers on the Workers Free plan can use up to the included limits. Usage beyond the limits below requires signing up for the $5/month Workers Paid plan.

Account billable metrics are available in the [Cloudflare Dashboard](https://dash.cloudflare.com) and [GraphQL API](/d1/observability/metrics-analytics/#metrics).
@@ -17,7 +24,7 @@ entries:
title: API changes to `.run()`
description: |-
A previous change (made on 2024-02-13) to the `run()` [query statement method](/d1/build-databases/query-databases/#await-stmtrun) has been reverted.

`run()` now returns a `D1Result`, including the result rows, matching its original behaviour prior to the change on 2024-02-13.

A future change to `run()` to return a [`D1ExecResult`](/d1/build-databases/query-databases/#return-object), as originally intended and documented, will be gated behind a [compatibility date](/workers/configuration/compatibility-dates/) so as to avoid breaking existing Workers that rely on the way `run()` currently works.
@@ -26,19 +33,19 @@
title: API changes to `.raw()`, `.all()` and `.run()`
description: |-
D1's `raw()`, `all()` and `run()` [query statement methods](/d1/build-databases/query-databases/#query-statement-methods) have been updated to reflect their intended behaviour and improve compatibility with ORM libraries.

`raw()` now correctly returns results as an array of arrays, allowing the correct handling of duplicate column names (such as when joining tables), as compared to `all()`, which is unchanged and returns an array of objects. To include an array of column names in the results when using `raw()`, use `raw({columnNames: true})`.

`run()` no longer incorrectly returns a `D1Result` and instead returns a [`D1ExecResult`](/d1/build-databases/query-databases/#return-object) as originally intended and documented.

This may be a breaking change for some applications that expected `raw()` to return an array of objects.

Visit the [query databases documentation](/d1/build-databases/query-databases/) to review D1's query methods, return types and TypeScript support in detail.

- publish_date: "2024-01-18"
title: Support for LIMIT on UPDATE and DELETE statements
description: |-
D1 now supports adding a `LIMIT` clause to `UPDATE` and `DELETE` statements, which allows you to limit the impact of a potentially dangerous operation.
D1 now supports adding a `LIMIT` clause to `UPDATE` and `DELETE` statements, which allows you to limit the impact of a potentially dangerous operation.

- publish_date: "2023-12-18"
title: Legacy alpha automated backups disabled