
Namespace internal


Type aliases

PipelineCallback<S>: S extends PipelineDestinationPromiseFunction<any, infer P> ? (err: ErrnoException | null, value: P) => void : (err: ErrnoException | null) => void

Type parameters

  • S

PipelineDestination<S, P>: S extends PipelineTransformSource<infer ST> ? WritableStream | PipelineDestinationIterableFunction<ST> | PipelineDestinationPromiseFunction<ST, P> : never

Type parameters

  • S

  • P

PipelineDestinationIterableFunction<T>: (source: AsyncIterable<T>) => AsyncIterable<any>

Type parameters

  • T


PipelineDestinationPromiseFunction<T, P>: (source: AsyncIterable<T>) => Promise<P>

Type parameters

  • T

  • P


PipelinePromise<S>: S extends PipelineDestinationPromiseFunction<any, infer P> ? Promise<P> : Promise<void>

Type parameters

  • S

PipelineSource<T>: Iterable<T> | AsyncIterable<T> | ReadableStream | PipelineSourceFunction<T>

Type parameters

  • T

PipelineSourceFunction<T>: () => Iterable<T> | AsyncIterable<T>

Type parameters

  • T


PipelineTransform<S, U>: ReadWriteStream | ((source: S extends (...args: any[]) => Iterable<infer ST> | AsyncIterable<infer ST> ? AsyncIterable<ST> : S) => AsyncIterable<U>)

Type parameters

  • S

  • U

PipelineTransformSource<T>: PipelineSource<T> | PipelineTransform<any, T>

Type parameters

  • T

TransformCallback: (error?: <internal>.Error | null, data?: any) => void


Variables

consumers: typeof "node:stream/consumers"
promises: typeof "node:stream/promises"

Functions

  • Attaches an AbortSignal to a readable or writable stream. This lets code control stream destruction using an AbortController.

    Calling abort on the AbortController corresponding to the passed AbortSignal will behave the same way as calling .destroy(new AbortError()) on the stream.

    const fs = require('fs');

    const controller = new AbortController();
    const read = addAbortSignal(
      controller.signal,
      fs.createReadStream('object.json')
    );
    // Later, abort the operation closing the stream
    controller.abort();

    Or using an AbortSignal with a readable stream as an async iterable:

    const controller = new AbortController();
    setTimeout(() => controller.abort(), 10_000); // set a timeout
    const stream = addAbortSignal(
      controller.signal,
      fs.createReadStream('object.json')
    );
    (async () => {
      try {
        for await (const chunk of stream) {
          await process(chunk);
        }
      } catch (e) {
        if (e.name === 'AbortError') {
          // The operation was cancelled
        } else {
          throw e;
        }
      }
    })();
    since

    v15.4.0

    Type parameters

      • T extends Stream

    Parameters

    • signal: <internal>.AbortSignal

      A signal representing possible cancellation

    • stream: T

      a stream to attach a signal to

    Returns T

  • A function to get notified when a stream is no longer readable, writable or has experienced an error or a premature close event.

    const { finished } = require('stream');
    const fs = require('fs');

    const rs = fs.createReadStream('archive.tar');

    finished(rs, (err) => {
      if (err) {
        console.error('Stream failed.', err);
      } else {
        console.log('Stream is done reading.');
      }
    });

    rs.resume(); // Drain the stream.

    Especially useful in error handling scenarios where a stream is destroyed prematurely (like an aborted HTTP request), and will not emit 'end' or 'finish'.

    The finished API also provides a promise version:

    const { finished } = require('stream/promises');
    const fs = require('fs');

    const rs = fs.createReadStream('archive.tar');

    async function run() {
      await finished(rs);
      console.log('Stream is done reading.');
    }

    run().catch(console.error);
    rs.resume(); // Drain the stream.

    stream.finished() leaves dangling event listeners (in particular 'error', 'end', 'finish' and 'close') after the callback has been invoked. The reason for this is so that unexpected 'error' events (due to incorrect stream implementations) do not cause unexpected crashes. If this is unwanted behavior then the returned cleanup function needs to be invoked in the callback:

    const cleanup = finished(rs, (err) => {
      cleanup();
      // ...
    });
    since

    v10.0.0

    Parameters

    Returns () => void

    A cleanup function which removes all registered listeners.

      • (): void

        Returns void

        A cleanup function which removes all registered listeners.


Generated using TypeDoc