github.com/nodejs/node.git
author    Antoine du Hamel <duhamelantoine1995@gmail.com>  2022-04-20 11:23:41 +0300
committer GitHub <noreply@github.com>  2022-04-20 11:23:41 +0300
commit    6afd3fcf653ed92063bafefa83661a076d241585
tree      9e1a22ab2f3d3cf1f14ec2348ca0a318fef16898 /doc/api/stream.md
parent    1fe5d56403a3725eac5e67c7a08830ce5ee0e2f5
doc: add `node:` prefix for all core modules
Some core modules can be loaded with or without the `node:` prefix. Using the
prefix disambiguates which specifiers refer to core modules. This commit
updates the docs to use the prefix everywhere a core module is referenced.

PR-URL: https://github.com/nodejs/node/pull/42752
Fixes: https://github.com/nodejs/node/issues/38343
Reviewed-By: Matteo Collina <matteo.collina@gmail.com>
Reviewed-By: Mestery <mestery@protonmail.com>
Reviewed-By: Darshan Sen <raisinten@gmail.com>
Reviewed-By: Akhil Marsonya <akhil.marsonya27@gmail.com>
Reviewed-By: Rafael Gonzaga <rafael.nunu@hotmail.com>
Reviewed-By: Mohammed Keyvanzadeh <mohammadkeyvanzade94@gmail.com>
Reviewed-By: Geoffrey Booth <webadmin@geoffreybooth.com>
Reviewed-By: Stephen Belanger <admin@stephenbelanger.com>
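As a quick illustration of what the change means in practice — a minimal sketch, assuming a Node.js version that supports `node:` specifiers in `require()` (v14.18.0/v16.0.0 or later): both specifier forms load the same core module; the prefix simply makes the intent explicit.

```js
'use strict';

// Both specifiers resolve to the same built-in module object.
const stream = require('stream');
const prefixed = require('node:stream');
console.log(stream === prefixed); // true

// The prefixed form is self-documenting: 'node:stream' can only name
// the core module, so the reader never has to wonder whether a
// userland package is intended.
const { Readable } = require('node:stream');
Readable.from(['hello', 'world']).on('data', (chunk) => {
  console.log(chunk); // 'hello', then 'world'
});
```

The `mjs` examples in the diff below use the equivalent ESM form, e.g. `import { Readable } from 'node:stream'`.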
Diffstat (limited to 'doc/api/stream.md')
-rw-r--r--  doc/api/stream.md  207
1 file changed, 104 insertions, 103 deletions
diff --git a/doc/api/stream.md b/doc/api/stream.md
index 9ba920f7795..cd8e7e6e7f5 100644
--- a/doc/api/stream.md
+++ b/doc/api/stream.md
@@ -7,7 +7,7 @@
<!-- source_link=lib/stream.js -->
A stream is an abstract interface for working with streaming data in Node.js.
-The `stream` module provides an API for implementing the stream interface.
+The `node:stream` module provides an API for implementing the stream interface.
There are many stream objects provided by Node.js. For instance, a
[request to an HTTP server][http-incoming-message] and [`process.stdout`][]
@@ -16,14 +16,14 @@ are both stream instances.
Streams can be readable, writable, or both. All streams are instances of
[`EventEmitter`][].
-To access the `stream` module:
+To access the `node:stream` module:
```js
-const stream = require('stream');
+const stream = require('node:stream');
```
-The `stream` module is useful for creating new types of stream instances. It is
-usually not necessary to use the `stream` module to consume streams.
+The `node:stream` module is useful for creating new types of stream instances.
+It is usually not necessary to use the `node:stream` module to consume streams.
## Organization of this document
@@ -56,8 +56,8 @@ added: v15.0.0
The `stream/promises` API provides an alternative set of asynchronous utility
functions for streams that return `Promise` objects rather than using
-callbacks. The API is accessible via `require('stream/promises')`
-or `require('stream').promises`.
+callbacks. The API is accessible via `require('node:stream/promises')`
+or `require('node:stream').promises`.
### Object mode
@@ -134,7 +134,7 @@ manner. The following is an example of using streams in a Node.js application
that implements an HTTP server:
```js
-const http = require('http');
+const http = require('node:http');
const server = http.createServer((req, res) => {
// `req` is an http.IncomingMessage, which is a readable stream.
@@ -190,7 +190,7 @@ various ways to communicate the current state of the stream.
Applications that are either writing data to or consuming data from a stream
are not required to implement the stream interfaces directly and will generally
-have no reason to call `require('stream')`.
+have no reason to call `require('node:stream')`.
Developers wishing to implement new types of streams should refer to the
section [API for stream implementers][].
@@ -422,7 +422,7 @@ Use `end()` instead of destroy if data should flush before close, or wait for
the `'drain'` event before destroying the stream.
```cjs
-const { Writable } = require('stream');
+const { Writable } = require('node:stream');
const myStream = new Writable();
@@ -432,7 +432,7 @@ myStream.on('error', (fooErr) => console.error(fooErr.message)); // foo error
```
```cjs
-const { Writable } = require('stream');
+const { Writable } = require('node:stream');
const myStream = new Writable();
@@ -441,7 +441,7 @@ myStream.on('error', function wontHappen() {});
```
```cjs
-const { Writable } = require('stream');
+const { Writable } = require('node:stream');
const myStream = new Writable();
myStream.destroy();
@@ -477,7 +477,7 @@ added: v8.0.0
Is `true` after [`writable.destroy()`][writable-destroy] has been called.
```cjs
-const { Writable } = require('stream');
+const { Writable } = require('node:stream');
const myStream = new Writable();
@@ -523,7 +523,7 @@ Calling the [`stream.write()`][stream-write] method after calling
```js
// Write 'hello, ' and then end with 'world!'.
-const fs = require('fs');
+const fs = require('node:fs');
const file = fs.createWriteStream('example.txt');
file.write('hello, ');
file.end('world!');
@@ -871,7 +871,7 @@ data. While in this state, attaching a listener for the `'data'` event
will not switch `readable.readableFlowing` to `true`.
```js
-const { PassThrough, Writable } = require('stream');
+const { PassThrough, Writable } = require('node:stream');
const pass = new PassThrough();
const writable = new Writable();
@@ -1041,7 +1041,7 @@ event. This is also true if there never was any data to be read. For instance,
in the following example, `foo.txt` is an empty file:
```js
-const fs = require('fs');
+const fs = require('node:fs');
const rr = fs.createReadStream('foo.txt');
rr.on('readable', () => {
console.log(`readable: ${rr.read()}`);
@@ -1200,7 +1200,7 @@ The following example pipes all of the data from the `readable` into a file
named `file.txt`:
```js
-const fs = require('fs');
+const fs = require('node:fs');
const readable = getReadableStreamSomehow();
const writable = fs.createWriteStream('file.txt');
// All the data from readable goes into 'file.txt'.
@@ -1214,7 +1214,7 @@ The `readable.pipe()` method returns a reference to the _destination_ stream
making it possible to set up chains of piped streams:
```js
-const fs = require('fs');
+const fs = require('node:fs');
const r = fs.createReadStream('file.txt');
const z = zlib.createGzip();
const w = fs.createWriteStream('file.txt.gz');
@@ -1518,7 +1518,7 @@ If the `destination` is specified, but no pipe is set up for it, then
the method does nothing.
```js
-const fs = require('fs');
+const fs = require('node:fs');
const readable = getReadableStreamSomehow();
const writable = fs.createWriteStream('file.txt');
// All the data from readable goes into 'file.txt',
@@ -1570,7 +1570,7 @@ section for more information.
// Pull off a header delimited by \n\n.
// Use unshift() if we get too much.
// Call the callback with (error, header, stream).
-const { StringDecoder } = require('string_decoder');
+const { StringDecoder } = require('node:string_decoder');
function parseHeader(stream, callback) {
stream.on('error', callback);
stream.on('readable', onReadable);
@@ -1620,8 +1620,9 @@ added: v0.9.4
* `stream` {Stream} An "old style" readable stream
* Returns: {this}
-Prior to Node.js 0.10, streams did not implement the entire `stream` module API
-as it is currently defined. (See [Compatibility][] for more information.)
+Prior to Node.js 0.10, streams did not implement the entire `node:stream`
+module API as it is currently defined. (See [Compatibility][] for more
+information.)
When using an older Node.js library that emits [`'data'`][] events and has a
[`stream.pause()`][stream-pause] method that is advisory only, the
@@ -1634,7 +1635,7 @@ libraries.
```js
const { OldReader } = require('./old-api-module.js');
-const { Readable } = require('stream');
+const { Readable } = require('node:stream');
const oreader = new OldReader();
const myReader = new Readable().wrap(oreader);
@@ -1656,7 +1657,7 @@ changes:
* Returns: {AsyncIterator} to fully consume the stream.
```js
-const fs = require('fs');
+const fs = require('node:fs');
async function print(readable) {
readable.setEncoding('utf8');
@@ -1697,7 +1698,7 @@ destruction of the stream if the `for await...of` loop is exited by `return`,
emitted an error during iteration.
```js
-const { Readable } = require('stream');
+const { Readable } = require('node:stream');
async function printIterator(readable) {
for await (const chunk of readable.iterator({ destroyOnReturn: false })) {
@@ -1759,8 +1760,8 @@ for every chunk in the stream. If the `fn` function returns a promise - that
promise will be `await`ed before being passed to the result stream.
```mjs
-import { Readable } from 'stream';
-import { Resolver } from 'dns/promises';
+import { Readable } from 'node:stream';
+import { Resolver } from 'node:dns/promises';
// With a synchronous mapper.
for await (const chunk of Readable.from([1, 2, 3, 4]).map((x) => x * 2)) {
@@ -1806,8 +1807,8 @@ passed to the result stream. If the `fn` function returns a promise - that
promise will be `await`ed.
```mjs
-import { Readable } from 'stream';
-import { Resolver } from 'dns/promises';
+import { Readable } from 'node:stream';
+import { Resolver } from 'node:dns/promises';
// With a synchronous predicate.
for await (const chunk of Readable.from([1, 2, 3, 4]).filter((x) => x > 2)) {
@@ -1864,8 +1865,8 @@ uses the [`readable`][] event in the underlying machinery and can limit the
number of concurrent `fn` calls.
```mjs
-import { Readable } from 'stream';
-import { Resolver } from 'dns/promises';
+import { Readable } from 'node:stream';
+import { Resolver } from 'node:dns/promises';
// With a synchronous predicate.
for await (const chunk of Readable.from([1, 2, 3, 4]).filter((x) => x > 2)) {
@@ -1909,8 +1910,8 @@ streams. It's intended for interoperability and convenience, not as the primary
way to consume streams.
```mjs
-import { Readable } from 'stream';
-import { Resolver } from 'dns/promises';
+import { Readable } from 'node:stream';
+import { Resolver } from 'node:dns/promises';
await Readable.from([1, 2, 3, 4]).toArray(); // [1, 2, 3, 4]
@@ -1955,8 +1956,8 @@ calls on the chunks return a truthy value, the promise is fulfilled with
`false`.
```mjs
-import { Readable } from 'stream';
-import { stat } from 'fs/promises';
+import { Readable } from 'node:stream';
+import { stat } from 'node:fs/promises';
// With a synchronous predicate.
await Readable.from([1, 2, 3, 4]).some((x) => x > 2); // true
@@ -2004,8 +2005,8 @@ fulfilled with value for which `fn` returned a truthy value. If all of the
`undefined`.
```mjs
-import { Readable } from 'stream';
-import { stat } from 'fs/promises';
+import { Readable } from 'node:stream';
+import { stat } from 'node:fs/promises';
// With a synchronous predicate.
await Readable.from([1, 2, 3, 4]).find((x) => x > 2); // 3
@@ -2053,8 +2054,8 @@ destroyed and the promise is fulfilled with `false`. If all of the `fn` calls
on the chunks return a truthy value, the promise is fulfilled with `true`.
```mjs
-import { Readable } from 'stream';
-import { stat } from 'fs/promises';
+import { Readable } from 'node:stream';
+import { stat } from 'node:fs/promises';
// With a synchronous predicate.
await Readable.from([1, 2, 3, 4]).every((x) => x > 2); // false
@@ -2103,8 +2104,8 @@ It is possible to return a stream or another iterable or async iterable from
stream.
```mjs
-import { Readable } from 'stream';
-import { createReadStream } from 'fs';
+import { Readable } from 'node:stream';
+import { createReadStream } from 'node:fs';
// With a synchronous mapper.
for await (const chunk of Readable.from([1, 2, 3, 4]).flatMap((x) => [x, x])) {
@@ -2140,7 +2141,7 @@ added: v17.5.0
This method returns a new stream with the first `limit` chunks dropped.
```mjs
-import { Readable } from 'stream';
+import { Readable } from 'node:stream';
await Readable.from([1, 2, 3, 4]).drop(2).toArray(); // [3, 4]
```
@@ -2162,7 +2163,7 @@ added: v17.5.0
This method returns a new stream with the first `limit` chunks.
```mjs
-import { Readable } from 'stream';
+import { Readable } from 'node:stream';
await Readable.from([1, 2, 3, 4]).take(2).toArray(); // [1, 2]
```
@@ -2185,7 +2186,7 @@ with a counter in the form `[index, chunk]`. The first index value is 0 and it
increases by 1 for each chunk produced.
```mjs
-import { Readable } from 'stream';
+import { Readable } from 'node:stream';
const pairs = await Readable.from(['a', 'b', 'c']).asIndexedPairs().toArray();
console.log(pairs); // [[0, 'a'], [1, 'b'], [2, 'c']]
@@ -2226,7 +2227,7 @@ initial value. If the stream is empty, the promise is rejected with a
`TypeError` with the `ERR_INVALID_ARGS` code property.
```mjs
-import { Readable } from 'stream';
+import { Readable } from 'node:stream';
const ten = await Readable.from([1, 2, 3, 4]).reduce((previous, data) => {
return previous + data;
@@ -2361,7 +2362,7 @@ A function to get notified when a stream is no longer readable, writable
or has experienced an error or a premature close event.
```js
-const { finished } = require('stream');
+const { finished } = require('node:stream');
const rs = fs.createReadStream('archive.tar');
@@ -2383,7 +2384,7 @@ or `'finish'`.
The `finished` API provides a promise version:
```js
-const { finished } = require('stream/promises');
+const { finished } = require('node:stream/promises');
const rs = fs.createReadStream('archive.tar');
@@ -2451,9 +2452,9 @@ A module method to pipe between streams and generators forwarding errors and
properly cleaning up, and providing a callback when the pipeline is complete.
```js
-const { pipeline } = require('stream');
-const fs = require('fs');
-const zlib = require('zlib');
+const { pipeline } = require('node:stream');
+const fs = require('node:fs');
+const zlib = require('node:zlib');
// Use the pipeline API to easily pipe a series of streams
// together and get notified when the pipeline is fully done.
@@ -2481,7 +2482,7 @@ receive an options argument as the last parameter with a
`AbortError`.
```js
-const { pipeline } = require('stream/promises');
+const { pipeline } = require('node:stream/promises');
async function run() {
await pipeline(
@@ -2499,7 +2500,7 @@ To use an `AbortSignal`, pass it inside an options object,
as the last argument:
```js
-const { pipeline } = require('stream/promises');
+const { pipeline } = require('node:stream/promises');
async function run() {
const ac = new AbortController();
@@ -2520,8 +2521,8 @@ run().catch(console.error); // AbortError
The `pipeline` API also supports async generators:
```js
-const { pipeline } = require('stream/promises');
-const fs = require('fs');
+const { pipeline } = require('node:stream/promises');
+const fs = require('node:fs');
async function run() {
await pipeline(
@@ -2545,8 +2546,8 @@ Especially in the case where the async generator is the source for the
pipeline (i.e. first argument) or the pipeline will never complete.
```js
-const { pipeline } = require('stream/promises');
-const fs = require('fs');
+const { pipeline } = require('node:stream/promises');
+const fs = require('node:fs');
async function run() {
await pipeline(
@@ -2579,9 +2580,9 @@ once it would destroy the socket without sending the expected response.
See the example below:
```js
-const fs = require('fs');
-const http = require('http');
-const { pipeline } = require('stream');
+const fs = require('node:fs');
+const http = require('node:http');
+const { pipeline } = require('node:stream');
const server = http.createServer((req, res) => {
const fileStream = fs.createReadStream('./fileNotExist.txt');
@@ -2621,7 +2622,7 @@ If passed a `Function` it must be a factory method taking a `source`
`Iterable`.
```mjs
-import { compose, Transform } from 'stream';
+import { compose, Transform } from 'node:stream';
const removeSpaces = new Transform({
transform(chunk, encoding, callback) {
@@ -2655,8 +2656,8 @@ functions into streams.
either `null` or `undefined`.
```mjs
-import { compose } from 'stream';
-import { finished } from 'stream/promises';
+import { compose } from 'node:stream';
+import { finished } from 'node:stream/promises';
// Convert AsyncIterable into readable Duplex.
const s1 = compose(async function*() {
@@ -2704,7 +2705,7 @@ added:
A utility method for creating readable streams out of iterators.
```js
-const { Readable } = require('stream');
+const { Readable } = require('node:stream');
async function * generate() {
yield 'hello';
@@ -2898,7 +2899,7 @@ Calling `abort` on the `AbortController` corresponding to the passed
on the stream.
```js
-const fs = require('fs');
+const fs = require('node:fs');
const controller = new AbortController();
const read = addAbortSignal(
@@ -2937,7 +2938,7 @@ const stream = addAbortSignal(
<!--type=misc-->
-The `stream` module API has been designed to make it possible to easily
+The `node:stream` module API has been designed to make it possible to easily
implement streams using JavaScript's prototypal inheritance model.
First, a stream developer would declare a new JavaScript class that extends one
@@ -2948,7 +2949,7 @@ parent class constructor:
<!-- eslint-disable no-useless-constructor -->
```js
-const { Writable } = require('stream');
+const { Writable } = require('node:stream');
class MyWritable extends Writable {
constructor({ highWaterMark, ...options }) {
@@ -2999,7 +3000,7 @@ inheritance. This can be accomplished by directly creating instances of the
objects and passing appropriate methods as constructor options.
```js
-const { Writable } = require('stream');
+const { Writable } = require('node:stream');
const myWritable = new Writable({
construct(callback) {
@@ -3081,7 +3082,7 @@ changes:
<!-- eslint-disable no-useless-constructor -->
```js
-const { Writable } = require('stream');
+const { Writable } = require('node:stream');
class MyWritable extends Writable {
constructor(options) {
@@ -3095,8 +3096,8 @@ class MyWritable extends Writable {
Or, when using pre-ES6 style constructors:
```js
-const { Writable } = require('stream');
-const util = require('util');
+const { Writable } = require('node:stream');
+const util = require('node:util');
function MyWritable(options) {
if (!(this instanceof MyWritable))
@@ -3109,7 +3110,7 @@ util.inherits(MyWritable, Writable);
Or, using the simplified constructor approach:
```js
-const { Writable } = require('stream');
+const { Writable } = require('node:stream');
const myWritable = new Writable({
write(chunk, encoding, callback) {
@@ -3126,7 +3127,7 @@ Calling `abort` on the `AbortController` corresponding to the passed
on the writable stream.
```js
-const { Writable } = require('stream');
+const { Writable } = require('node:stream');
const controller = new AbortController();
const myWritable = new Writable({
@@ -3161,8 +3162,8 @@ has returned, delaying any `_write()`, `_final()` and `_destroy()` calls until
initialize resources before the stream can be used.
```js
-const { Writable } = require('stream');
-const fs = require('fs');
+const { Writable } = require('node:stream');
+const fs = require('node:fs');
class WriteStream extends Writable {
constructor(filename) {
@@ -3318,7 +3319,7 @@ If a `Readable` stream pipes into a `Writable` stream when `Writable` emits an
error, the `Readable` stream will be unpiped.
```js
-const { Writable } = require('stream');
+const { Writable } = require('node:stream');
const myWritable = new Writable({
write(chunk, encoding, callback) {
@@ -3339,7 +3340,7 @@ is not of any real particular usefulness, the example illustrates each of the
required elements of a custom [`Writable`][] stream instance:
```js
-const { Writable } = require('stream');
+const { Writable } = require('node:stream');
class MyWritable extends Writable {
_write(chunk, encoding, callback) {
@@ -3360,8 +3361,8 @@ characters encoding, such as UTF-8. The following example shows how to decode
multi-byte strings using `StringDecoder` and [`Writable`][].
```js
-const { Writable } = require('stream');
-const { StringDecoder } = require('string_decoder');
+const { Writable } = require('node:stream');
+const { StringDecoder } = require('node:string_decoder');
class StringWritable extends Writable {
constructor(options) {
@@ -3441,7 +3442,7 @@ changes:
<!-- eslint-disable no-useless-constructor -->
```js
-const { Readable } = require('stream');
+const { Readable } = require('node:stream');
class MyReadable extends Readable {
constructor(options) {
@@ -3455,8 +3456,8 @@ class MyReadable extends Readable {
Or, when using pre-ES6 style constructors:
```js
-const { Readable } = require('stream');
-const util = require('util');
+const { Readable } = require('node:stream');
+const util = require('node:util');
function MyReadable(options) {
if (!(this instanceof MyReadable))
@@ -3469,7 +3470,7 @@ util.inherits(MyReadable, Readable);
Or, using the simplified constructor approach:
```js
-const { Readable } = require('stream');
+const { Readable } = require('node:stream');
const myReadable = new Readable({
read(size) {
@@ -3483,7 +3484,7 @@ Calling `abort` on the `AbortController` corresponding to the passed
on the readable created.
```js
-const { Readable } = require('stream');
+const { Readable } = require('node:stream');
const controller = new AbortController();
const read = new Readable({
read(size) {
@@ -3514,8 +3515,8 @@ called. This is useful to initialize state or asynchronously initialize
resources before the stream can be used.
```js
-const { Readable } = require('stream');
-const fs = require('fs');
+const { Readable } = require('node:stream');
+const fs = require('node:fs');
class ReadStream extends Readable {
constructor(filename) {
@@ -3687,7 +3688,7 @@ Throwing an `Error` from within [`readable._read()`][] or manually emitting an
`'error'` event results in undefined behavior.
```js
-const { Readable } = require('stream');
+const { Readable } = require('node:stream');
const myReadable = new Readable({
read(size) {
@@ -3709,7 +3710,7 @@ The following is a basic example of a `Readable` stream that emits the numerals
from 1 to 1,000,000 in ascending order, and then ends.
```js
-const { Readable } = require('stream');
+const { Readable } = require('node:stream');
class Counter extends Readable {
constructor(opt) {
@@ -3780,7 +3781,7 @@ changes:
<!-- eslint-disable no-useless-constructor -->
```js
-const { Duplex } = require('stream');
+const { Duplex } = require('node:stream');
class MyDuplex extends Duplex {
constructor(options) {
@@ -3793,8 +3794,8 @@ class MyDuplex extends Duplex {
Or, when using pre-ES6 style constructors:
```js
-const { Duplex } = require('stream');
-const util = require('util');
+const { Duplex } = require('node:stream');
+const util = require('node:util');
function MyDuplex(options) {
if (!(this instanceof MyDuplex))
@@ -3807,7 +3808,7 @@ util.inherits(MyDuplex, Duplex);
Or, using the simplified constructor approach:
```js
-const { Duplex } = require('stream');
+const { Duplex } = require('node:stream');
const myDuplex = new Duplex({
read(size) {
@@ -3822,8 +3823,8 @@ const myDuplex = new Duplex({
When using pipeline:
```js
-const { Transform, pipeline } = require('stream');
-const fs = require('fs');
+const { Transform, pipeline } = require('node:stream');
+const fs = require('node:fs');
pipeline(
fs.createReadStream('object.json')
@@ -3871,7 +3872,7 @@ incoming written data via the [`Writable`][] interface that is read back out
via the [`Readable`][] interface.
```js
-const { Duplex } = require('stream');
+const { Duplex } = require('node:stream');
const kSource = Symbol('source');
class MyDuplex extends Duplex {
@@ -3912,7 +3913,7 @@ that accepts JavaScript numbers that are converted to hexadecimal strings on
the `Readable` side.
```js
-const { Transform } = require('stream');
+const { Transform } = require('node:stream');
// All Transform streams are also Duplex Streams.
const myTransform = new Transform({
@@ -3977,7 +3978,7 @@ output on the `Readable` side is not consumed.
<!-- eslint-disable no-useless-constructor -->
```js
-const { Transform } = require('stream');
+const { Transform } = require('node:stream');
class MyTransform extends Transform {
constructor(options) {
@@ -3990,8 +3991,8 @@ class MyTransform extends Transform {
Or, when using pre-ES6 style constructors:
```js
-const { Transform } = require('stream');
-const util = require('util');
+const { Transform } = require('node:stream');
+const util = require('node:util');
function MyTransform(options) {
if (!(this instanceof MyTransform))
@@ -4004,7 +4005,7 @@ util.inherits(MyTransform, Transform);
Or, using the simplified constructor approach:
```js
-const { Transform } = require('stream');
+const { Transform } = require('node:stream');
const myTransform = new Transform({
transform(chunk, encoding, callback) {
@@ -4148,7 +4149,7 @@ A Node.js readable stream can be created from an asynchronous generator using
the `Readable.from()` utility method:
```js
-const { Readable } = require('stream');
+const { Readable } = require('node:stream');
const ac = new AbortController();
const signal = ac.signal;
@@ -4177,9 +4178,9 @@ handling of backpressure and errors. [`stream.pipeline()`][] abstracts away
the handling of backpressure and backpressure-related errors:
```js
-const fs = require('fs');
-const { pipeline } = require('stream');
-const { pipeline: pipelinePromise } = require('stream/promises');
+const fs = require('node:fs');
+const { pipeline } = require('node:stream');
+const { pipeline: pipelinePromise } = require('node:stream/promises');
const writable = fs.createWriteStream('./file');