# through2-concurrent
A simple way to create a Node.js Transform stream which processes chunks in parallel. You can limit the concurrency (the default is 16), and order is not preserved: chunks/objects can end up in a different order from the order they started in if the transform functions take different amounts of time.
Built using through2 and has the same API, with the addition of a `maxConcurrency` option.
Non-`objectMode` streams are supported for completeness, but I'm not sure they'd be useful for anything.
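As a quick illustration of that API, here is a minimal sketch of an object-mode stream with a concurrency limit (`someAsyncWork` is a hypothetical async helper, not part of the library):

```js
var through2Concurrent = require('through2-concurrent');

// Object-mode transform that runs up to 4 chunks through someAsyncWork at once.
// someAsyncWork is a placeholder for your own asynchronous function.
var concurrentTransform = through2Concurrent.obj(
  {maxConcurrency: 4},
  function (chunk, enc, callback) {
    someAsyncWork(chunk, function (err, result) {
      // Passing the result to the callback pushes it downstream, as in through2.
      callback(err, result);
    });
  });
```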
Written by Thomas Parslow (almostobsolete.net and tomparslow.co.uk) as part of Active Inbox (activeinboxhq.com).
## Install
```
npm install --save through2-concurrent
```
## Examples
Process lines from a CSV in parallel. The order the results end up in the `all` variable is not deterministic.
```js
var fs = require('fs');
var csv2 = require('csv2');
var through2Concurrent = require('through2-concurrent');

var all = [];

fs.createReadStream('data.csv')
  .pipe(csv2())
  .pipe(through2Concurrent.obj(
    {maxConcurrency: 10},
    function (chunk, enc, callback) {
      var self = this;
      // someThingAsync is a placeholder for your own asynchronous work.
      someThingAsync(chunk, function (newChunk) {
        self.push(newChunk);
        callback();
      });
    }))
  .on('data', function (data) {
    all.push(data);
  })
  .on('end', function () {
    // doSomethingSpecial is a placeholder for whatever you do with the results.
    doSomethingSpecial(all);
  });
```
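Because the API mirrors through2, you can also pass a flush function as the final argument, which should run once the concurrent transforms have completed. A minimal sketch, assuming a hypothetical `processRecord` helper:

```js
var through2Concurrent = require('through2-concurrent');

var results = [];

// processRecord is a hypothetical async helper used only for illustration.
var collector = through2Concurrent.obj(
  {maxConcurrency: 10},
  function (record, enc, callback) {
    processRecord(record, function (err, processed) {
      if (err) return callback(err);
      results.push(processed);
      callback();
    });
  },
  function (callback) {
    // Flush: called after the transforms have finished.
    console.log('Processed', results.length, 'records');
    callback();
  });
```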
## Contributing
Fixed or improved stuff? Great! Send me a pull request through GitHub, or get in touch on Twitter @almostobsolete or by email at tom@almostobsolete.net.