
Unexpected behavior when reading/writing from the same connection #227

Closed

Description

@ayhernandez-godaddy

Version: 1.3.0-alpha.21

How to reproduce

const { DuckDBInstance } = require('@duckdb/node-api');

(async () => {
  const instance = await DuckDBInstance.create(':memory:');
  const db = await instance.connect();
  await db.run('CREATE TABLE IF NOT EXISTS progress (last_processed_item INTEGER)');

  // Start a streaming read so rows are fetched in chunks rather than all at once.
  const stmt = await db.prepare('FROM range(4000)');
  const pending = await stmt.startStream();
  const result = await pending.getResult();

  let count = 0;
  while (true) {
    const chunk = await result.fetchChunk();
    if (!chunk) {
      break;
    }

    const rows = chunk.getRows();
    if (!rows.length) {
      break;
    }

    // Interleave a write on the same connection while the stream is still being read.
    await db.run('INSERT INTO progress VALUES (?)', [rows.at(-1)[0]]);
    count += rows.length;
  }
  console.log('Count:', count);
})();

Expected Behavior

The script prints Count: 4000

Actual Behavior

The script prints Count: 2048

I assume this is related to a comment here that states each connection can only have a single active query at a time. If the INSERT statement is removed, the script runs to completion as expected. Likewise, if a second connection is created solely for the INSERTs (see the sketch below), the script also runs to completion as expected.
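
For reference, here is a minimal sketch of the second-connection workaround I mean. It assumes a second connection opened from the same instance can write while the first connection is still streaming; the reader/writer variable names are mine.

const { DuckDBInstance } = require('@duckdb/node-api');

(async () => {
  const instance = await DuckDBInstance.create(':memory:');
  const reader = await instance.connect();
  const writer = await instance.connect(); // second connection, used only for writes
  await writer.run('CREATE TABLE IF NOT EXISTS progress (last_processed_item INTEGER)');

  const stmt = await reader.prepare('FROM range(4000)');
  const pending = await stmt.startStream();
  const result = await pending.getResult();

  let count = 0;
  while (true) {
    const chunk = await result.fetchChunk();
    if (!chunk) break;

    const rows = chunk.getRows();
    if (!rows.length) break;

    // Writing through the second connection leaves the streaming query on `reader` untouched.
    await writer.run('INSERT INTO progress VALUES (?)', [rows.at(-1)[0]]);
    count += rows.length;
  }
  console.log('Count:', count); // runs to completion and prints 4000 in my testing
})();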

I don't think I saw this covered in the docs, and I found the behavior rather surprising, so I'm looking for clarification on whether this is indeed the expected behavior of the library.
