Diffstat (limited to 'deps/npm/node_modules/mississippi/readme.md')
 deps/npm/node_modules/mississippi/readme.md | 70
 1 file changed, 56 insertions(+), 14 deletions(-)
diff --git a/deps/npm/node_modules/mississippi/readme.md b/deps/npm/node_modules/mississippi/readme.md
index 9013bb0dc5..569803865c 100644
--- a/deps/npm/node_modules/mississippi/readme.md
+++ b/deps/npm/node_modules/mississippi/readme.md
@@ -21,6 +21,7 @@ var miss = require('mississippi')
- [to](#to)
- [concat](#concat)
- [finished](#finished)
+- [parallel](#parallel)
### pipe
@@ -28,13 +29,13 @@ var miss = require('mississippi')
Pipes streams together and destroys all of them if one of them closes. Calls `cb` with `(error)` if there was an error in any of the streams.
-When using standard `source.pipe(destination)` the source will _not_ be destroyed if the destination emits close or error. You are also not able to provide a callback to tell when then pipe has finished.
+When using standard `source.pipe(destination)` the source will _not_ be destroyed if the destination emits close or error. You are also not able to provide a callback to tell when the pipe has finished.
`miss.pipe` does these two things for you, ensuring you handle stream errors 100% of the time (unhandled errors are probably the most common bug in most node streams code)
#### original module
-`miss.pipe` is provided by [`require('pump')`](https://npmjs.org/pump)
+`miss.pipe` is provided by [`require('pump')`](https://www.npmjs.com/package/pump)
#### example
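A minimal sketch of the usage described above, assuming two local files to copy between (the filenames are illustrative):

```js
var fs = require('fs')
var miss = require('mississippi')

var read = fs.createReadStream('./original.zip')
var write = fs.createWriteStream('./copy.zip')

// if either stream errors or closes early, both are destroyed
// and the callback fires exactly once
miss.pipe(read, write, function (err) {
  if (err) return console.error('Copy error!', err)
  console.log('Copied successfully')
})
```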
@@ -56,13 +57,13 @@ miss.pipe(read, write, function (err) {
##### `miss.each(stream, each, [done])`
-Iterate the data in `stream` one chunk at a time. Your `each` function will be called with with `(data, next)` where data is a data chunk and next is a callback. Call `next` when you are ready to consume the next chunk.
+Iterate the data in `stream` one chunk at a time. Your `each` function will be called with `(data, next)` where data is a data chunk and next is a callback. Call `next` when you are ready to consume the next chunk.
Optionally you can call `next` with an error to destroy the stream. You can also pass the optional third argument, `done`, which is a function that will be called with `(err)` when the stream ends. The `err` argument will be populated with an error if the stream emitted an error.
#### original module
-`miss.each` is provided by [`require('stream-each')`](https://npmjs.org/stream-each)
+`miss.each` is provided by [`require('stream-each')`](https://www.npmjs.com/package/stream-each)
#### example
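A minimal sketch of the `each`/`next` contract described above, reading chunks from an assumed local file:

```js
var fs = require('fs')
var miss = require('mississippi')

var source = fs.createReadStream('./data.txt') // illustrative filename

miss.each(source, function (chunk, next) {
  // handle one chunk, then call next() to pull the following one
  console.log('got', chunk.length, 'bytes')
  next()
}, function (err) {
  if (err) return console.error('stream error', err)
  console.log('stream ended')
})
```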
@@ -97,7 +98,7 @@ If any of the streams in the pipeline emits an error or gets destroyed, or you d
#### original module
-`miss.pipeline` is provided by [`require('pumpify')`](https://npmjs.org/pumpify)
+`miss.pipeline` is provided by [`require('pumpify')`](https://www.npmjs.com/package/pumpify)
#### example
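A minimal sketch of collapsing several streams into one, using core `zlib` transforms purely for illustration:

```js
var zlib = require('zlib')
var miss = require('mississippi')

// writes go into the first stream, reads come out of the last one;
// a gzip/gunzip round trip keeps the sketch self-contained
var roundTrip = miss.pipeline(zlib.createGzip(), zlib.createGunzip())

miss.pipe(process.stdin, roundTrip, process.stdout, function (err) {
  if (err) return console.error('pipeline error', err)
})
```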
@@ -136,7 +137,7 @@ You can either choose to supply the writable and the readable at the time you cr
#### original module
-`miss.duplex` is provided by [`require('duplexify')`](https://npmjs.org/duplexify)
+`miss.duplex` is provided by [`require('duplexify')`](https://www.npmjs.com/package/duplexify)
#### example
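A minimal sketch of pairing a separate writable and readable into one duplex stream, here wrapping a child process (the command is illustrative):

```js
var spawn = require('child_process').spawn
var miss = require('mississippi')

var child = spawn('sort')

// writes to `wrapped` go to the child's stdin,
// reads from `wrapped` come from the child's stdout
var wrapped = miss.duplex(child.stdin, child.stdout)

wrapped.write('banana\n')
wrapped.write('apple\n')
wrapped.end()
wrapped.pipe(process.stdout)
```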
@@ -165,7 +166,7 @@ The `flushFunction`, with signature `(cb)`, is called just before the stream is
#### original module
-`miss.through` is provided by [`require('through2')`](https://npmjs.org/through2)
+`miss.through` is provided by [`require('through2')`](https://www.npmjs.com/package/through2)
#### example
@@ -178,10 +179,10 @@ var write = fs.createWriteStream('./AWESOMECASE.TXT')
// Leaving out the options object
var uppercaser = miss.through(
function (chunk, enc, cb) {
- cb(chunk.toString().toUpperCase())
+ cb(null, chunk.toString().toUpperCase())
},
function (cb) {
- cb('ONE LAST BIT OF UPPERCASE')
+ cb(null, 'ONE LAST BIT OF UPPERCASE')
}
)
@@ -206,7 +207,7 @@ Returns a readable stream that calls `read(size, next)` when data is requested f
#### original module
-`miss.from` is provided by [`require('from2')`](https://npmjs.org/from2)
+`miss.from` is provided by [`require('from2')`](https://www.npmjs.com/package/from2)
#### example
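A minimal sketch of the `read(size, next)` contract described above, emitting a fixed list of chunks and then ending:

```js
var miss = require('mississippi')

var chunks = ['hello', ' ', 'world', '\n']

var source = miss.from(function (size, next) {
  // passing null as the chunk signals the end of the stream
  if (chunks.length === 0) return next(null, null)
  next(null, chunks.shift())
})

source.pipe(process.stdout)
```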
@@ -252,7 +253,7 @@ Returns a writable stream that calls `write(data, enc, cb)` when data is written
#### original module
-`miss.to` is provided by [`require('flush-write-stream')`](https://npmjs.org/flush-write-stream)
+`miss.to` is provided by [`require('flush-write-stream')`](https://www.npmjs.com/package/flush-write-stream)
#### example
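A minimal sketch of the `write(data, enc, cb)` contract described above:

```js
var miss = require('mississippi')

var sink = miss.to(function (data, enc, cb) {
  // consume one chunk, then call cb() to accept the next write
  console.log('wrote:', data.toString())
  cb()
})

sink.write('hello')
sink.end('world')
```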
@@ -294,13 +295,13 @@ finished
Returns a writable stream that concatenates all data written to the stream and calls a callback with the single result.
-Calling `miss.concat(cb)` returns a writable stream. `cb` is called when the writable stream is finished, e.g. when all data is done being written to it. `cb` is called with a single argument, `(data)`, which will containe the result of concatenating all the data written to the stream.
+Calling `miss.concat(cb)` returns a writable stream. `cb` is called when the writable stream is finished, e.g. when all data is done being written to it. `cb` is called with a single argument, `(data)`, which will contain the result of concatenating all the data written to the stream.
Note that `miss.concat` will not handle stream errors for you. To handle errors, use `miss.pipe` or handle the `error` event manually.
#### original module
-`miss.concat` is provided by [`require('concat-stream')`](https://npmjs.org/concat-stream)
+`miss.concat` is provided by [`require('concat-stream')`](https://www.npmjs.com/package/concat-stream)
#### example
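A minimal sketch of buffering a whole stream into one callback, with errors handled by `miss.pipe` as the note above suggests (the filename is illustrative):

```js
var fs = require('fs')
var miss = require('mississippi')

miss.pipe(
  fs.createReadStream('./readme.md'),
  miss.concat(function (data) {
    // `data` is a single Buffer of everything written to the stream
    console.log('file is', data.length, 'bytes long')
  }),
  function (err) {
    if (err) return console.error('read error', err)
  }
)
```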
@@ -335,7 +336,7 @@ This function is useful for simplifying stream handling code as it lets you hand
#### original module
-`miss.finished` is provided by [`require('end-of-stream')`](https://npmjs.org/end-of-stream)
+`miss.finished` is provided by [`require('end-of-stream')`](https://www.npmjs.com/package/end-of-stream)
#### example
@@ -350,3 +351,44 @@ miss.finished(copyDest, function(err) {
console.log('write success')
})
```
+
+### parallel
+
+##### `miss.parallel(concurrency, each)`
+
+This works like `through` except you can process items in parallel, while still preserving the original input order.
+
+This is handy if you want to take advantage of node's async I/O and process streams of items in batches. With this module you can build your very own streaming parallel job queue.
+
+Note that `miss.parallel` preserves input ordering; if you don't need that, you can use [through2-concurrent](https://github.com/almost/through2-concurrent) instead, which is otherwise very similar.
+
+#### original module
+
+`miss.parallel` is provided by [`require('parallel-transform')`](https://www.npmjs.com/package/parallel-transform)
+
+#### example
+
+This example fetches the HTTP response headers (via GET requests) for a stream of input URLs, 5 at a time in parallel.
+
+```js
+// requires for this example (assumed: the `request` HTTP client and the `split2` line-splitting stream)
+var fs = require('fs')
+var miss = require('mississippi')
+var request = require('request')
+var split = require('split2')
+
+function getResponse (item, cb) {
+ var r = request(item.url)
+ r.on('error', function (err) {
+ cb(err)
+ })
+ r.on('response', function (re) {
+ cb(null, {url: item.url, date: new Date(), status: re.statusCode, headers: re.headers})
+ r.abort()
+ })
+}
+
+miss.pipe(
+ fs.createReadStream('./urls.txt'), // one url per line
+ split(),
+ miss.parallel(5, getResponse),
+ miss.through(function (row, enc, next) {
+ console.log(JSON.stringify(row))
+ next()
+ })
+)
+```