
How to properly react to an error event in a Node.js stream pipeline?

I was struggling to understand how error handling works in a Node.js stream pipeline, so I experimented with a simple playground, which made things crystal clear to me.

I am posting this as a self-answered question. Perhaps someone will find it helpful :)

Asked by frenya on Jan 06 '23

1 Answer

The playground

Let's create three named PassThrough streams connected into a pipeline and observe individual events.

const stream = require('stream');

// Create a PassThrough stream that logs every interesting event with its name.
function observedStream(name) {
  const s = new stream.PassThrough({ objectMode: true });

  s.on('error', (err) => console.log(name + ': ' + err));
  s.on('data', (data) => console.log(name + ': ' + data));
  s.on('finish', () => console.log(name + ': FINISH'));
  s.on('end', () => console.log(name + ': END'));
  s.on('close', () => console.log(name + ': CLOSE'));
  s.on('unpipe', () => console.log(name + ': UNPIPE'));

  return s;
}

const s1 = observedStream('S1');
const s2 = observedStream('S2');
const s3 = observedStream('S3');

s1.pipe(s2).pipe(s3);

Standard behaviour

Writing to the pipeline is straightforward. We simply get a data event from each stream.

s1.write('Hello');
// S1: Hello
// S2: Hello
// S3: Hello

What happens when we end the stream? No surprises there either.

s1.end();
// S1: FINISH
// S1: END
// S2: FINISH
// S2: UNPIPE
// S2: END
// S3: FINISH
// S3: UNPIPE
// S3: END

Error handling

Let's try emitting an error (obviously, if you called s1.end() above, you need to recreate the pipeline first).

s1.emit('error', new Error('bazinga'));
// S1: Error: bazinga

Notice nothing else happens here. You can continue writing to S1 as if nothing happened. The pipeline is not closed.

Things get a little more interesting when there's an error "mid-stream" :)

s2.emit('error', new Error('bazinga'));
// S2: UNPIPE
// S2: Error: bazinga

Notice that Node.js automatically unpipes S1 from S2, but nothing else. That is, S1 now waits for someone to read its data, while S2 is still piped into S3 and can (theoretically) still send data.

This is something you need to handle in your code! One option is to call the end() method on both S1 and S2. Another is to reconnect S1 to S2 with the pipe() method. Both appear to work, but which is appropriate depends on your particular usage scenario.

Credits

  1. This article from Ben Nadel's blog pointed me in the right direction.
  2. This SO question addresses a similar subject from a slightly different perspective. There are some good pointers in the answers as well.
Answered by frenya on Jan 22 '23