Update docs - start #3426

Merged: 5 commits, Apr 23, 2025

20 changes: 20 additions & 0 deletions docs/README.md
@@ -0,0 +1,20 @@
# node-postgres docs website

This is the documentation for node-postgres, which is currently hosted at [https://node-postgres.com](https://node-postgres.com).

## Development

To run the documentation locally, you need to have [Node.js](https://nodejs.org) installed. Then, you can clone the repository and install the dependencies:

```bash
cd docs
yarn
```

Once you've installed the deps, you can run the development server:

```bash
yarn dev
```

This will start a local server at [http://localhost:3000](http://localhost:3000) where you can view the documentation and see your changes.
11 changes: 11 additions & 0 deletions docs/components/logo.tsx
@@ -0,0 +1,11 @@
import React from 'react'

type Props = {
src: string
alt?: string
}

export function Logo(props: Props) {
const alt = props.alt || 'Logo'
return <img src={props.src} alt={alt} width={100} height={100} style={{ width: 400, height: 'auto' }} />
}
15 changes: 5 additions & 10 deletions docs/pages/apis/client.mdx
@@ -33,8 +33,7 @@ type Config = {
example to create a client with specific connection information:
```js
-import pg from 'pg'
-const { Client } = pg
+import { Client } from 'pg'

const client = new Client({
user: 'database-user',
@@ -48,8 +47,7 @@ const client = new Client({
## client.connect

```js
-import pg from 'pg'
-const { Client } = pg
+import { Client } from 'pg'
const client = new Client()

await client.connect()
@@ -91,8 +89,7 @@ client.query(text: string, values?: any[]) => Promise<Result>
**Plain text query**

```js
-import pg from 'pg'
-const { Client } = pg
+import { Client } from 'pg'
const client = new Client()

await client.connect()
@@ -106,8 +103,7 @@ await client.end()
**Parameterized query**

```js
-import pg from 'pg'
-const { Client } = pg
+import { Client } from 'pg'
const client = new Client()

await client.connect()
@@ -145,8 +141,7 @@ await client.end()
If you pass an object to `client.query` and the object has a `.submit` function on it, the client will pass its PostgreSQL server connection to the object and delegate query dispatching to the supplied object. This is an advanced feature mostly intended for library authors. It is incidentally also how the callback- and promise-based queries above are currently handled internally, but this is subject to change. It is also how [pg-cursor](https://github.com/brianc/node-pg-cursor) and [pg-query-stream](https://github.com/brianc/node-pg-query-stream) work.

```js
-import pg from 'pg'
-const { Query } = pg
+import { Query } from 'pg'
const query = new Query('select $1::text as name', ['brianc'])

const result = client.query(query)
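// The rest of this example is collapsed in this view. A sketch of how the returned
// query object is typically consumed, assuming the standard Query events
// ('row', 'end', 'error'):
query.on('row', (row) => {
  console.log('row:', row) // fired once per row as results stream in
})
query.on('end', () => {
  console.log('query complete')
})
query.on('error', (err) => {
  console.error(err.stack)
})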
14 changes: 5 additions & 9 deletions docs/pages/apis/cursor.mdx
@@ -18,8 +18,7 @@ $ npm install pg pg-cursor
Instantiates a new Cursor. A cursor is an instance of `Submittable` and should be passed directly to the `client.query` method.

```js
-import pg from 'pg'
-const { Pool } = pg
+import { Pool } from 'pg'
import Cursor from 'pg-cursor'

const pool = new Pool()
@@ -29,11 +28,9 @@ const values = [10]

const cursor = client.query(new Cursor(text, values))

-cursor.read(100, (err, rows) => {
-  cursor.close(() => {
-    client.release()
-  })
-})
+const { rows } = await cursor.read(100)
+console.log(rows.length) // 100 (unless the table has fewer than 100 rows)
+client.release()
```

```ts
@@ -58,8 +55,7 @@ If the cursor has read to the end of the result set, all subsequent calls to cursor#read will return a 0 length array of rows.
Here is an example of reading to the end of a cursor:

```js
-import pg from 'pg'
-const { Pool } = pg
+import { Pool } from 'pg'
import Cursor from 'pg-cursor'

const pool = new Pool()
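
// The rest of this example is collapsed in this view. A sketch of reading until the
// cursor is exhausted; cursor.read resolves with an empty array once all rows have
// been returned (the query text below is illustrative):
const client = await pool.connect()
const cursor = client.query(new Cursor('SELECT * FROM generate_series(0, 1000) num'))

let rows = await cursor.read(100)
while (rows.length > 0) {
  // process the current batch of up to 100 rows here
  rows = await cursor.read(100)
}

await cursor.close()
client.release()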
18 changes: 6 additions & 12 deletions docs/pages/apis/pool.mdx
@@ -48,8 +48,7 @@ type Config = {
example to create a new pool with configuration:
```js
-import pg from 'pg'
-const { Pool } = pg
+import { Pool } from 'pg'

const pool = new Pool({
host: 'localhost',
@@ -69,8 +68,7 @@ pool.query(text: string, values?: any[]) => Promise<pg.Result>
```

```js
-import pg from 'pg'
-const { Pool } = pg
+import { Pool } from 'pg'

const pool = new Pool()

@@ -102,8 +100,7 @@ Acquires a client from the pool.
- If the pool is 'full' and all clients are currently checked out, the request will wait in a FIFO queue until a client becomes available by being released back to the pool.

```js
-import pg from 'pg'
-const { Pool } = pg
+import { Pool } from 'pg'

const pool = new Pool()
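
// The rest of this example is collapsed in this view. A sketch of checking out a
// client, using it, and releasing it back to the pool:
const client = await pool.connect()
const res = await client.query('SELECT NOW()')
console.log(res.rows[0])
client.release()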

@@ -121,8 +118,7 @@ Client instances returned from `pool.connect` will have a `release` method which will release them from the pool.
The `release` method on an acquired client returns it to the pool. If you pass a truthy value as the `destroy` parameter, instead of releasing the client back to the pool, the pool will be instructed to disconnect and destroy this client, leaving a space within itself for a new client.

```js
-import pg from 'pg'
-const { Pool } = pg
+import { Pool } from 'pg'

const pool = new Pool()

@@ -134,8 +130,7 @@ client.release()
```
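
For the destroy case described above, a minimal sketch, assuming a failed query is your signal to discard the client rather than reuse it:

```js
import { Pool } from 'pg'

const pool = new Pool()

const client = await pool.connect()
try {
  await client.query('SELECT NOW()')
  client.release() // healthy client: return it to the pool for reuse
} catch (err) {
  client.release(true) // something went wrong: destroy the client instead of returning it
  throw err
}
```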

```js
-import pg from 'pg'
-const { Pool } = pg
+import { Pool } from 'pg'

const pool = new Pool()
assert(pool.totalCount === 0)
@@ -168,8 +163,7 @@ Calling `pool.end` will drain the pool of all active clients, disconnect them, and shut down any internal timers in the pool.

```js
// again both promises and callbacks are supported:
-import pg from 'pg'
-const { Pool } = pg
+import { Pool } from 'pg'

const pool = new Pool()
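
// The collapsed remainder of this example typically runs a query and then shuts the
// pool down; a sketch using the promise form:
await pool.query('SELECT NOW()')
await pool.end()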

4 changes: 3 additions & 1 deletion docs/pages/features/_meta.json
@@ -5,5 +5,7 @@
"transactions": "Transactions",
"types": "Data Types",
"ssl": "SSL",
"native": "Native"
"native": "Native",
"esm": "ESM",
"callbacks": "Callbacks"
}
39 changes: 39 additions & 0 deletions docs/pages/features/callbacks.mdx
@@ -0,0 +1,39 @@
---
title: Callbacks
---

## Callback Support

`async` / `await` is the preferred way to write asynchronous code in Node.js these days, but callbacks are also supported by the `pg` and `pg-pool` modules. To use them, pass a callback function as the last argument to the following methods; the callback will be invoked when the operation completes, and no promise will be returned:


```js
const { Pool, Client } = require('pg')

// pool
const pool = new Pool()
// run a query on an available client
pool.query('SELECT NOW()', (err, res) => {
console.log(err, res)
})

// check out a client to do something more complex like a transaction
pool.connect((err, client, release) => {
client.query('SELECT NOW()', (err, res) => {
release()
console.log(err, res)
pool.end()
})

})

// single client
const client = new Client()
client.connect((err) => {
if (err) throw err
client.query('SELECT NOW()', (err, res) => {
console.log(err, res)
client.end()
})
})
```
37 changes: 37 additions & 0 deletions docs/pages/features/esm.mdx
@@ -0,0 +1,37 @@
---
title: ESM
---

## ESM Support

As of v8.15.x, node-postgres supports the __ECMAScript Module__ (ESM) format. This means you can use named `import` statements such as `import { Client } from 'pg'` instead of `require` or `import pg from 'pg'`.

CommonJS modules are still supported. The ESM format is an opt-in feature and will not affect existing codebases that use CommonJS.

The docs have been updated to show ESM usage, but in a CommonJS context you can still use the same code; you just need to change the import format.

If you're using CommonJS, you can use the following code to import the `pg` module:

```js
const pg = require('pg')
const { Client } = pg
// etc...
```

### ESM Usage

If you're using ESM, you can use the following code to import the `pg` module:

```js
import { Client } from 'pg'
// etc...
```


Previously, if you were using ESM, you had to use the following code:

```js
import pg from 'pg'
const { Client } = pg
// etc...
```
3 changes: 1 addition & 2 deletions docs/pages/features/ssl.mdx
@@ -22,8 +22,7 @@ const config = {
},
}

-import pg from 'pg'
-const { Client, Pool } = pg
+import { Client, Pool } from 'pg'

const client = new Client(config)
await client.connect()
3 changes: 1 addition & 2 deletions docs/pages/features/transactions.mdx
@@ -16,8 +16,7 @@ To execute a transaction with node-postgres you simply execute `BEGIN / COMMIT / ROLLBACK` queries yourself through a client.
## Examples

```js
-import pg from 'pg'
-const { Pool } = pg
+import { Pool } from 'pg'
const pool = new Pool()

const client = await pool.connect()
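
// The collapsed remainder runs the transaction itself. A sketch of the usual
// BEGIN / COMMIT / ROLLBACK flow (table and column names here are illustrative):
try {
  await client.query('BEGIN')
  const insertUserText = 'INSERT INTO users(name) VALUES ($1) RETURNING id'
  const { rows } = await client.query(insertUserText, ['brianc'])
  await client.query('INSERT INTO photos (user_id, photo_url) VALUES ($1, $2)', [rows[0].id, 's3.bucket.foo'])
  await client.query('COMMIT')
} catch (err) {
  await client.query('ROLLBACK')
  throw err
} finally {
  client.release()
}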
3 changes: 1 addition & 2 deletions docs/pages/guides/async-express.md
@@ -22,8 +22,7 @@ That's the same structure I used in the [project structure](/guides/project-structure) guide.
My `db/index.js` file usually starts out like this:

```js
-import pg from 'pg'
-const { Pool } = pg
+import { Pool } from 'pg'

const pool = new Pool()
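
// The rest of this file is collapsed in this view. It typically re-exports a query
// helper like the one in the project-structure guide; a sketch:
export const query = (text, params) => pool.query(text, params)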

13 changes: 5 additions & 8 deletions docs/pages/guides/project-structure.md
@@ -27,13 +27,12 @@ The location doesn't really matter - I've found it usually ends up being somewha
Typically I'll start out my `db/index.js` file like so:

```js
-import pg from 'pg'
-const { Pool } = pg
+import { Pool } from 'pg'

const pool = new Pool()

-export const query = (text, params, callback) => {
-  return pool.query(text, params, callback)
+export const query = (text, params) => {
+  return pool.query(text, params)
}
```

@@ -55,8 +54,7 @@ app.get('/:id', async (req, res, next) => {
Imagine we have lots of routes scattered throughout many files under our `routes/` directory. We now want to go back and log every single query that's executed, how long it took, and the number of rows it returned. If we had required node-postgres directly in every route file we'd have to go edit every single route - that would take forever & be really error prone! But thankfully we put our data access into `db/index.js`. Let's go add some logging:

```js
-import pg from 'pg'
-const { Pool } = pg
+import { Pool } from 'pg'

const pool = new Pool()
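
// The collapsed remainder typically wraps pool.query so every call is timed and
// logged; a sketch (the log format here is illustrative):
export const query = async (text, params) => {
  const start = Date.now()
  const res = await pool.query(text, params)
  const duration = Date.now() - start
  console.log('executed query', { text, duration, rows: res.rowCount })
  return res
}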

@@ -76,8 +74,7 @@ _note: I didn't log the query parameters. Depending on your application you migh
Now what if we need to check out a client from the pool to run several queries in a row in a transaction? We can add another method to our `db/index.js` file when we need to do this:

```js
-import pg from 'pg'
-const { Pool } = pg
+import { Pool } from 'pg'

const pool = new Pool()
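
// The collapsed remainder adds a helper for checking out a dedicated client for
// transactions; a minimal sketch (the full guide layers extra safeguards on top):
export const getClient = () => pool.connect()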
