github.com/npm/cli.git
author     isaacs <i@izs.me>  2020-07-23 20:58:04 +0300
committer  isaacs <i@izs.me>  2020-07-29 21:53:42 +0300
commit     ad5e07d8bd86d1dbe2b03dc142f8c8d6f4828ffe (patch)
tree       97b66f97d77f35774f10a5e3e9957b1897d150bb /node_modules/minipass
parent     a16994cfdd2f255016f3d8ee60d03473d80eabd8 (diff)
Full dependency reboot

Reinstall everything from a clean node_modules and package-lock.json state. Re-generate the list of bundleDependencies and node_modules/.gitignore with a script that does the right thing based on the actual dependency state.
Diffstat (limited to 'node_modules/minipass')
 -rw-r--r--  node_modules/minipass/README.md     85
 -rw-r--r--  node_modules/minipass/index.js      80
 -rw-r--r--  node_modules/minipass/package.json  49
 3 files changed, 91 insertions, 123 deletions
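The regeneration script referred to in the commit message lives elsewhere in the npm/cli repository and is not part of this diff. Purely as a rough, hypothetical sketch of the idea it describes — derive `bundleDependencies` and `node_modules/.gitignore` from what is actually installed — it might look something like the following; every path and helper choice here is an illustrative assumption, not the real script:

```js
// Illustrative sketch only -- not the actual npm/cli script.
// Derive bundleDependencies and node_modules/.gitignore from the
// packages that are really installed under node_modules.
'use strict'
const fs = require('fs')
const path = require('path')

const root = process.cwd()
const pkgPath = path.join(root, 'package.json')
const pkg = JSON.parse(fs.readFileSync(pkgPath, 'utf8'))

// Assumption: every regular (non-dev) dependency that is physically
// present at the top level of node_modules should be bundled.
// (Scoped packages would also need their @scope/ directory re-included.)
const bundled = Object.keys(pkg.dependencies || {})
  .filter(name => fs.existsSync(path.join(root, 'node_modules', name)))
  .sort()

pkg.bundleDependencies = bundled
fs.writeFileSync(pkgPath, JSON.stringify(pkg, null, 2) + '\n')

// Ignore every top-level entry in node_modules, then re-include the
// bundled packages (and this .gitignore itself) so they stay in git.
const ignore = [
  '# generated file: ignore everything except bundled dependencies',
  '/*',
  '!/.gitignore',
  ...bundled.map(name => `!/${name}`),
  ''
].join('\n')
fs.writeFileSync(path.join(root, 'node_modules', '.gitignore'), ignore)

console.log(`bundled ${bundled.length} packages`)
```

The anchored `/*` pattern is what makes the re-includes work: it only matches top-level entries of `node_modules`, so a re-included package directory keeps all of its contents tracked.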
diff --git a/node_modules/minipass/README.md b/node_modules/minipass/README.md
index 1a6ff7f5d..c989beea0 100644
--- a/node_modules/minipass/README.md
+++ b/node_modules/minipass/README.md
@@ -7,32 +7,32 @@ stream](https://nodejs.org/api/stream.html#stream_class_stream_passthrough)
fast](https://docs.google.com/spreadsheets/d/1oObKSrVwLX_7Ut4Z6g3fZW-AX1j1-k6w-cDsrkaSbHM/edit#gid=0)
for objects, strings, and buffers.
-Supports pipe()ing (including multi-pipe() and backpressure transmission),
-buffering data until either a `data` event handler or `pipe()` is added (so
-you don't lose the first chunk), and most other cases where PassThrough is
-a good idea.
+Supports pipe()ing (including multi-pipe() and backpressure
+transmission), buffering data until either a `data` event handler or
+`pipe()` is added (so you don't lose the first chunk), and most other
+cases where PassThrough is a good idea.
-There is a `read()` method, but it's much more efficient to consume data
-from this stream via `'data'` events or by calling `pipe()` into some other
-stream. Calling `read()` requires the buffer to be flattened in some
-cases, which requires copying memory.
+There is a `read()` method, but it's much more efficient to consume
+data from this stream via `'data'` events or by calling `pipe()` into
+some other stream. Calling `read()` requires the buffer to be
+flattened in some cases, which requires copying memory.
-There is also no `unpipe()` method. Once you start piping, there is no
-stopping it!
+There is also no `unpipe()` method. Once you start piping, there is
+no stopping it!
-If you set `objectMode: true` in the options, then whatever is written will
-be emitted. Otherwise, it'll do a minimal amount of Buffer copying to
-ensure proper Streams semantics when `read(n)` is called.
+If you set `objectMode: true` in the options, then whatever is written
+will be emitted. Otherwise, it'll do a minimal amount of Buffer
+copying to ensure proper Streams semantics when `read(n)` is called.
`objectMode` can also be set by doing `stream.objectMode = true`, or by
writing any non-string/non-buffer data. `objectMode` cannot be set to
false once it is set.
-This is not a `through` or `through2` stream. It doesn't transform the
-data, it just passes it right through. If you want to transform the data,
-extend the class, and override the `write()` method. Once you're done
-transforming the data however you want, call `super.write()` with the
-transform output.
+This is not a `through` or `through2` stream. It doesn't transform
+the data, it just passes it right through. If you want to transform
+the data, extend the class, and override the `write()` method. Once
+you're done transforming the data however you want, call
+`super.write()` with the transform output.
For some examples of streams that extend Minipass in various ways, check
out:
@@ -46,14 +46,6 @@ out:
- [tap](http://npm.im/tap)
- [tap-parser](http://npm.im/tap)
- [treport](http://npm.im/tap)
-- [minipass-fetch](http://npm.im/minipass-fetch)
-- [pacote](http://npm.im/pacote)
-- [make-fetch-happen](http://npm.im/make-fetch-happen)
-- [cacache](http://npm.im/cacache)
-- [ssri](http://npm.im/ssri)
-- [npm-registry-fetch](http://npm.im/npm-registry-fetch)
-- [minipass-json-stream](http://npm.im/minipass-json-stream)
-- [minipass-sized](http://npm.im/minipass-sized)
## Differences from Node.js Streams
@@ -231,7 +223,7 @@ src.write('foo')
const tee = new Minipass()
tee.pipe(dest1)
tee.pipe(dest2)
-src.pipe(tee) // tee gets 'foo', pipes to both locations
+stream.pipe(tee) // tee gets 'foo', pipes to both locations
```
The same caveat applies to `on('data')` event listeners. The first one
@@ -260,8 +252,7 @@ src.pipe(tee)
## USAGE
-It's a stream! Use it like a stream and it'll most likely do what you
-want.
+It's a stream! Use it like a stream and it'll most likely do what you want.
```js
const Minipass = require('minipass')
@@ -289,30 +280,31 @@ streams.
* `write(chunk, [encoding], [callback])` - Put data in. (Note that, in the
base Minipass class, the same data will come out.) Returns `false` if
- the stream will buffer the next write, or true if it's still in "flowing"
- mode.
+ the stream will buffer the next write, or true if it's still in
+ "flowing" mode.
* `end([chunk, [encoding]], [callback])` - Signal that you have no more
data to write. This will queue an `end` event to be fired when all the
data has been consumed.
-* `setEncoding(encoding)` - Set the encoding for data coming of the stream.
- This can only be done once.
+* `setEncoding(encoding)` - Set the encoding for data coming of the
+ stream. This can only be done once.
* `pause()` - No more data for a while, please. This also prevents `end`
from being emitted for empty streams until the stream is resumed.
-* `resume()` - Resume the stream. If there's data in the buffer, it is all
- discarded. Any buffered events are immediately emitted.
+* `resume()` - Resume the stream. If there's data in the buffer, it is
+ all discarded. Any buffered events are immediately emitted.
* `pipe(dest)` - Send all output to the stream provided. There is no way
to unpipe. When data is emitted, it is immediately written to any and
all pipe destinations.
-* `on(ev, fn)`, `emit(ev, fn)` - Minipass streams are EventEmitters. Some
- events are given special treatment, however. (See below under "events".)
+* `on(ev, fn)`, `emit(ev, fn)` - Minipass streams are EventEmitters.
+ Some events are given special treatment, however. (See below under
+ "events".)
* `promise()` - Returns a Promise that resolves when the stream emits
`end`, or rejects if the stream emits `error`.
* `collect()` - Return a Promise that resolves on `end` with an array
- containing each chunk of data that was emitted, or rejects if the stream
- emits `error`. Note that this consumes the stream data.
-* `concat()` - Same as `collect()`, but concatenates the data into a single
- Buffer object. Will reject the returned promise if the stream is in
- objectMode, or if it goes into objectMode by the end of the data.
+ containing each chunk of data that was emitted, or rejects if the
+ stream emits `error`. Note that this consumes the stream data.
+* `concat()` - Same as `collect()`, but concatenates the data into a
+ single Buffer object. Will reject the returned promise if the stream is
+ in objectMode, or if it goes into objectMode by the end of the data.
* `read(n)` - Consume `n` bytes of data out of the buffer. If `n` is not
provided, then consume all of it. If `n` bytes are not available, then
it returns null. **Note** consuming streams in this way is less
@@ -429,8 +421,8 @@ mp.concat().then(onebigchunk => {
### iteration
-You can iterate over streams synchronously or asynchronously in platforms
-that support it.
+You can iterate over streams synchronously or asynchronously in
+platforms that support it.
Synchronous iteration will end when the currently available data is
consumed, even if the `end` event has not been reached. In string and
@@ -438,8 +430,9 @@ buffer mode, the data is concatenated, so unless multiple writes are
occurring in the same tick as the `read()`, sync iteration loops will
generally only have a single iteration.
-To consume chunks in this way exactly as they have been written, with no
-flattening, create the stream with the `{ objectMode: true }` option.
+To consume chunks in this way exactly as they have been written, with
+no flattening, create the stream with the `{ objectMode: true }`
+option.
```js
const mp = new Minipass({ objectMode: true })
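As a standalone sketch of the behavior the README text above documents — writes are buffered until a consumer appears, `concat()` resolves once `end` fires, and `objectMode` streams can be consumed with `for await` — the following example relies only on the API as described above:

```js
'use strict'
const Minipass = require('minipass')

async function demo () {
  // Writes before any consumer are buffered, so the first chunk is not lost.
  const src = new Minipass({ encoding: 'utf8' })
  src.write('foo')
  src.write('bar')
  src.end()
  // concat() collects the chunks and joins them (a string here, because an
  // encoding was set; a single Buffer otherwise).
  console.log(await src.concat())        // 'foobar'

  // In objectMode, whatever is written comes out as-is, one value per
  // iteration, with no flattening or Buffer copying.
  const objects = new Minipass({ objectMode: true })
  objects.write({ a: 1 })
  objects.write({ b: 2 })
  objects.end()
  for await (const obj of objects)
    console.log(obj)                     // { a: 1 } then { b: 2 }
}

demo().catch(err => {
  console.error(err)
  process.exit(1)
})
```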
diff --git a/node_modules/minipass/index.js b/node_modules/minipass/index.js
index 56cbd665d..c072352d4 100644
--- a/node_modules/minipass/index.js
+++ b/node_modules/minipass/index.js
@@ -1,6 +1,5 @@
'use strict'
const EE = require('events')
-const Stream = require('stream')
const Yallist = require('yallist')
const SD = require('string_decoder').StringDecoder
@@ -30,6 +29,12 @@ const ASYNCITERATOR = doIter && Symbol.asyncIterator
const ITERATOR = doIter && Symbol.iterator
|| Symbol('iterator not implemented')
+// Buffer in node 4.x < 4.5.0 doesn't have working Buffer.from
+// or Buffer.alloc, and Buffer in node 10 deprecated the ctor.
+// .M, this is fine .\^/M..
+const B = Buffer.alloc ? Buffer
+ : /* istanbul ignore next */ require('safe-buffer').Buffer
+
// events that mean 'the stream is over'
// these are treated specially, and re-emitted
// if they are listened for after emitting.
@@ -44,9 +49,9 @@ const isArrayBuffer = b => b instanceof ArrayBuffer ||
b.constructor.name === 'ArrayBuffer' &&
b.byteLength >= 0
-const isArrayBufferView = b => !Buffer.isBuffer(b) && ArrayBuffer.isView(b)
+const isArrayBufferView = b => !B.isBuffer(b) && ArrayBuffer.isView(b)
-module.exports = class Minipass extends Stream {
+module.exports = class Minipass extends EE {
constructor (options) {
super()
this[FLOWING] = false
@@ -97,7 +102,7 @@ module.exports = class Minipass extends Stream {
}
get objectMode () { return this[OBJECTMODE] }
- set objectMode (om) { this[OBJECTMODE] = this[OBJECTMODE] || !!om }
+ set objectMode (ॐ ) { this[OBJECTMODE] = this[OBJECTMODE] || !!ॐ }
write (chunk, encoding, cb) {
if (this[EOF])
@@ -121,11 +126,11 @@ module.exports = class Minipass extends Stream {
// at some point in the future, we may want to do the opposite!
// leave strings and buffers as-is
// anything else switches us into object mode
- if (!this[OBJECTMODE] && !Buffer.isBuffer(chunk)) {
+ if (!this[OBJECTMODE] && !B.isBuffer(chunk)) {
if (isArrayBufferView(chunk))
- chunk = Buffer.from(chunk.buffer, chunk.byteOffset, chunk.byteLength)
+ chunk = B.from(chunk.buffer, chunk.byteOffset, chunk.byteLength)
else if (isArrayBuffer(chunk))
- chunk = Buffer.from(chunk)
+ chunk = B.from(chunk)
else if (typeof chunk !== 'string')
// use the setter so we throw if we have encoding set
this.objectMode = true
@@ -134,11 +139,12 @@ module.exports = class Minipass extends Stream {
// this ensures at this point that the chunk is a buffer or string
// don't buffer it up or send it to the decoder
if (!this.objectMode && !chunk.length) {
+ const ret = this.flowing
if (this[BUFFERLENGTH] !== 0)
this.emit('readable')
if (cb)
cb()
- return this.flowing
+ return ret
}
// fast-path writing strings of same encoding to a stream with
@@ -146,30 +152,22 @@ module.exports = class Minipass extends Stream {
if (typeof chunk === 'string' && !this[OBJECTMODE] &&
// unless it is a string already ready for us to use
!(encoding === this[ENCODING] && !this[DECODER].lastNeed)) {
- chunk = Buffer.from(chunk, encoding)
+ chunk = B.from(chunk, encoding)
}
- if (Buffer.isBuffer(chunk) && this[ENCODING])
+ if (B.isBuffer(chunk) && this[ENCODING])
chunk = this[DECODER].write(chunk)
- if (this.flowing) {
- // if we somehow have something in the buffer, but we think we're
- // flowing, then we need to flush all that out first, or we get
- // chunks coming in out of order. Can't emit 'drain' here though,
- // because we're mid-write, so that'd be bad.
+ try {
+ return this.flowing
+ ? (this.emit('data', chunk), this.flowing)
+ : (this[BUFFERPUSH](chunk), false)
+ } finally {
if (this[BUFFERLENGTH] !== 0)
- this[FLUSH](true)
- this.emit('data', chunk)
- } else
- this[BUFFERPUSH](chunk)
-
- if (this[BUFFERLENGTH] !== 0)
- this.emit('readable')
-
- if (cb)
- cb()
-
- return this.flowing
+ this.emit('readable')
+ if (cb)
+ cb()
+ }
}
read (n) {
@@ -190,7 +188,7 @@ module.exports = class Minipass extends Stream {
])
else
this.buffer = new Yallist([
- Buffer.concat(Array.from(this.buffer), this[BUFFERLENGTH])
+ B.concat(Array.from(this.buffer), this[BUFFERLENGTH])
])
}
@@ -293,10 +291,10 @@ module.exports = class Minipass extends Stream {
return this.buffer.shift()
}
- [FLUSH] (noDrain) {
+ [FLUSH] () {
do {} while (this[FLUSHCHUNK](this[BUFFERSHIFT]()))
- if (!noDrain && !this.buffer.length && !this[EOF])
+ if (!this.buffer.length && !this[EOF])
this.emit('drain')
}
@@ -425,17 +423,12 @@ module.exports = class Minipass extends Stream {
// const all = await stream.collect()
collect () {
const buf = []
- if (!this[OBJECTMODE])
- buf.dataLength = 0
- // set the promise first, in case an error is raised
- // by triggering the flow here.
- const p = this.promise()
+ buf.dataLength = 0
this.on('data', c => {
buf.push(c)
- if (!this[OBJECTMODE])
- buf.dataLength += c.length
+ buf.dataLength += c.length
})
- return p.then(() => buf)
+ return this.promise().then(() => buf)
}
// const data = await stream.concat()
@@ -445,7 +438,7 @@ module.exports = class Minipass extends Stream {
: this.collect().then(buf =>
this[OBJECTMODE]
? Promise.reject(new Error('cannot concat in objectMode'))
- : this[ENCODING] ? buf.join('') : Buffer.concat(buf, buf.dataLength))
+ : this[ENCODING] ? buf.join('') : B.concat(buf, buf.dataLength))
}
// stream.promise().then(() => done, er => emitted error)
@@ -536,10 +529,9 @@ module.exports = class Minipass extends Stream {
}
static isStream (s) {
- return !!s && (s instanceof Minipass || s instanceof Stream ||
- s instanceof EE && (
- typeof s.pipe === 'function' || // readable
- (typeof s.write === 'function' && typeof s.end === 'function') // writable
- ))
+ return !!s && (s instanceof Minipass || s instanceof EE && (
+ typeof s.pipe === 'function' || // readable
+ (typeof s.write === 'function' && typeof s.end === 'function') // writable
+ ))
}
}
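The `const B = Buffer.alloc ? Buffer : require('safe-buffer').Buffer` line restored in this version is a feature-detection shim: Node 4.x before 4.5.0 lacks a working `Buffer.from`/`Buffer.alloc`, and Node 10 deprecated the `Buffer` constructor, so every Buffer call in the module goes through `B` instead. The same pattern in isolation, as a minimal sketch:

```js
'use strict'
// Use the native Buffer when the safe factory methods exist; otherwise
// fall back to the safe-buffer polyfill (which itself re-exports the
// native Buffer on platforms where it is already safe). On modern Node
// the require() branch is never evaluated.
const B = Buffer.alloc
  ? Buffer
  : require('safe-buffer').Buffer

// All allocations go through B, so the deprecated new Buffer(size)
// constructor is never touched.
const greeting = B.from('hello', 'utf8')   // bytes copied from the string
const zeroed = B.alloc(16)                 // 16 zero-filled bytes
console.log(B.isBuffer(greeting), zeroed.length)   // true 16
```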
diff --git a/node_modules/minipass/package.json b/node_modules/minipass/package.json
index 315022964..6765afe7d 100644
--- a/node_modules/minipass/package.json
+++ b/node_modules/minipass/package.json
@@ -1,44 +1,29 @@
{
- "_from": "minipass@latest",
- "_id": "minipass@3.1.3",
+ "_from": "minipass@^2.8.6",
+ "_id": "minipass@2.9.0",
"_inBundle": false,
- "_integrity": "sha512-Mgd2GdMVzY+x3IJ+oHnVM+KG3lA5c8tnabyJKmHSaG2kAGpudxuOf8ToDkhumF7UzME7DecbQE9uOZhNm7PuJg==",
+ "_integrity": "sha512-wxfUjg9WebH+CUDX/CdbRlh5SmfZiy/hpkxaRI16Y9W56Pa75sWgd/rvFilSgrauD9NyFymP/+JFV3KwzIsJeg==",
"_location": "/minipass",
"_phantomChildren": {},
"_requested": {
- "type": "tag",
+ "type": "range",
"registry": true,
- "raw": "minipass@latest",
+ "raw": "minipass@^2.8.6",
"name": "minipass",
"escapedName": "minipass",
- "rawSpec": "latest",
+ "rawSpec": "^2.8.6",
"saveSpec": null,
- "fetchSpec": "latest"
+ "fetchSpec": "^2.8.6"
},
"_requiredBy": [
- "#USER",
- "/",
- "/cacache",
- "/cacache/tar",
"/fs-minipass",
- "/libnpmaccess",
- "/make-fetch-happen",
- "/minipass-collect",
- "/minipass-fetch",
- "/minipass-flush",
- "/minipass-json-stream",
- "/minipass-pipeline",
- "/minipass-sized",
"/minizlib",
- "/npm-registry-fetch",
- "/ssri",
- "/tap-parser",
- "/tar"
+ "/node-gyp/tar"
],
- "_resolved": "https://registry.npmjs.org/minipass/-/minipass-3.1.3.tgz",
- "_shasum": "7d42ff1f39635482e15f9cdb53184deebd5815fd",
- "_spec": "minipass@latest",
- "_where": "/Users/isaacs/dev/npm/cli",
+ "_resolved": "https://registry.npmjs.org/minipass/-/minipass-2.9.0.tgz",
+ "_shasum": "e713762e7d3e32fed803115cf93e04bca9fcc9a6",
+ "_spec": "minipass@^2.8.6",
+ "_where": "/Users/isaacs/dev/npm/cli/node_modules/node-gyp/node_modules/tar",
"author": {
"name": "Isaac Z. Schlueter",
"email": "i@izs.me",
@@ -49,7 +34,8 @@
},
"bundleDependencies": false,
"dependencies": {
- "yallist": "^4.0.0"
+ "safe-buffer": "^5.1.2",
+ "yallist": "^3.0.0"
},
"deprecated": false,
"description": "minimal implementation of a PassThrough stream",
@@ -58,9 +44,6 @@
"tap": "^14.6.5",
"through2": "^2.0.3"
},
- "engines": {
- "node": ">=8"
- },
"files": [
"index.js"
],
@@ -78,12 +61,12 @@
},
"scripts": {
"postpublish": "git push origin --follow-tags",
- "postversion": "npm publish --tag=next",
+ "postversion": "npm publish",
"preversion": "npm test",
"test": "tap"
},
"tap": {
"check-coverage": true
},
- "version": "3.1.3"
+ "version": "2.9.0"
}