
stream.pipeline swallowing errors when read stream is empty #24517

Closed
@coderaiser

Description

  • Version: 10.12.0, 11.2.0
  • Platform: Linux cloudcmd.io 4.4.0-122-generic #146-Ubuntu SMP Mon Apr 23 15:34:04 UTC 2018 x86_64 x86_64 x86_64 GNU/Linux
  • Subsystem: Stream

pipeline behaves inconsistently with pipe: it swallows writable-stream errors when the readable stream is empty. For example, the following code logs undefined:

const {Readable} = require('stream');
const {createWriteStream} = require('fs');
const {pipeline} = require('stream');

const readStream = new Readable({
  read() {}
});

readStream.push(null);

pipeline(readStream, createWriteStream('/'), (e) => {
  console.log(e);
  // outputs: undefined
});
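
As a possible workaround (a sketch only; it assumes the fs write stream still emits 'error' on the failed open, which the pipe example below suggests), the error can be observed by listening on the destination directly:

const {Readable, pipeline} = require('stream');
const {createWriteStream} = require('fs');

const readStream = new Readable({
  read() {}
});

readStream.push(null);

const writeStream = createWriteStream('/');

// Listen on the destination directly; the failed open should still
// emit 'error' here even though pipeline's callback receives undefined.
writeStream.on('error', (e) => {
  console.error('writable error:', e.message);
});

pipeline(readStream, writeStream, (e) => {
  console.log('pipeline callback:', e);
});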

But if we use pipe with this code:

const {Readable} = require('stream');
const {createWriteStream} = require('fs');

const readStream = new Readable({
  read() {}
});

readStream.push(null);
readStream.pipe(createWriteStream('/'));

We get the following error:

events.js:167
      throw er; // Unhandled 'error' event
      ^

Error: EISDIR: illegal operation on a directory, open '/'
Emitted 'error' event at:
    at lazyFs.open (internal/fs/streams.js:273:12)
    at FSReqWrap.oncomplete (fs.js:141:20)

It would be great if pipeline had the same behavior as pipe :).
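
For comparison, here is what the desired behavior would look like (just a sketch of the expected output, not what currently happens):

pipeline(readStream, createWriteStream('/'), (e) => {
  console.log(e);
  // desired: [Error: EISDIR: illegal operation on a directory, open '/']
});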

Labels: help wanted, stream