How can I resume after an error event in a piped stream in Node.js?

Tags:

stream

node.js

After I emit an error event in MyWritableStream, the data transmission stops. What do I need to do to resume the data transfer?

var fs = require('fs');

var readable = fs.createReadStream('test.txt');
var writable = new MyWritableStream();

writable.on('error', function(error) {
    console.log('error', error);
    // How can I resume?
});

writable.on('finish', function() {
    console.log('finished');
});

readable.pipe(writable);
asked Oct 19 '22 by avasin

1 Answer

I know this question is old, but you might wanna check out https://github.com/miraclx/xresilient

I built it for exactly this reason (it works best with seekable streams).

You define a function that returns a readable stream, and the library tracks the number of bytes that have passed through until an error occurs.

Once the readable stream emits an error event, the library calls your function again with the number of bytes read so far, so you can seek back into the stream source.

Example:

const fs = require('fs');
const xresilient = require('xresilient');

const readable = xresilient(({bytesRead}) => {
  return generateSeekableStreamSomehow({start: bytesRead});
}, {retries: 5});

const writable = fs.createWriteStream('file.test');

readable.pipe(writable);
  • File streams are indexable with the start option of the fs.createReadStream() function.
  • HTTP requests are indexable with the Range HTTP header (see the sketch after this list).
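
For instance, here is a minimal sketch of the example above with the factory function filled in for a local file. The file names are placeholders, and the HTTP variant is only hinted at in a comment, since wiring an asynchronous request into xresilient depends on its API:

const fs = require('fs');
const xresilient = require('xresilient');

// Re-create the source from the byte offset that already made it through,
// using the `start` option of fs.createReadStream() (first bullet above).
const readable = xresilient(({bytesRead}) =>
  fs.createReadStream('test.txt', {start: bytesRead}),
{retries: 5});

// For an HTTP source, the equivalent is issuing a fresh request with a
// `Range: bytes=<bytesRead>-` header (second bullet above).

const writable = fs.createWriteStream('file.test');
readable.pipe(writable);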

Check it out: https://www.npmjs.com/package/xresilient

answered Oct 24 '22 by Miraclx