The following data stream does not trigger the 'end' event. The 'data' event is triggered and I can see every data row logged to the console.
var AWS = require('aws-sdk');
var ogr2ogr = require('ogr2ogr');
var JSONStream = require('JSONStream');

var S3 = new AWS.S3();
// Note: the AWS SDK expects capitalized Bucket/Key parameter names.
var source = S3.getObject({Bucket: ..., Key: ...}).createReadStream();

var stream = ogr2ogr(source).format('GeoJSON').stream()
  .pipe(JSONStream.parse('features.*'));

stream.on('data', function (data) {
  console.log(data); // Correctly outputs 70 rows of data.
});
stream.on('end', function () {
  console.log('end'); // This code is never executed.
});
stream.on('error', function (err) {
  console.log(err); // No errors...
});
The process works if I insert a write -> read step (write the ogr2ogr output to a file, then read it back) after the ogr2ogr transform, roughly like the sketch below.
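A minimal sketch of that write -> read workaround, assuming a temporary file path and plain fs streams; the path and file name are illustrative, and `source` is the S3 read stream from the question above:

var fs = require('fs');
var ogr2ogr = require('ogr2ogr');
var JSONStream = require('JSONStream');

var tmpPath = '/tmp/output.geojson'; // hypothetical temp file

// First pass: write the converted GeoJSON to disk.
ogr2ogr(source).format('GeoJSON').stream()
  .pipe(fs.createWriteStream(tmpPath))
  .on('finish', function () {
    // Second pass: read the file back; here the 'end' event fires as expected.
    fs.createReadStream(tmpPath)
      .pipe(JSONStream.parse('features.*'))
      .on('data', function (feature) { console.log(feature); })
      .on('end', function () { console.log('end'); });
  });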
on("end", function () { // This may not been called since we are destroying the stream // the first time "data" event is received console. log("All the data in the file has been read"); }) . on("close", function (err) { console. log("Stream has been destroyed and file has been closed"); });
To fire an event, use the emit() method.
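For example, a minimal sketch of firing and listening for a custom event with Node's EventEmitter (the event name 'done' and the payload are just illustrative):

var EventEmitter = require('events').EventEmitter;
var emitter = new EventEmitter();

// Register a listener before emitting.
emitter.on('done', function (payload) {
  console.log('done fired with', payload);
});

// Fire the event; all 'done' listeners run synchronously.
emitter.emit('done', { rows: 70 });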
The console.time() method belongs to the console class in Node.js. It starts a timer that can be used to compute the time taken by a piece of code or function.
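For example (the label 'parse' is arbitrary):

console.time('parse');            // start a timer labelled 'parse'
for (var i = 0; i < 1e6; i++) {}  // some work to measure
console.timeEnd('parse');         // prints e.g. "parse: 3.123ms"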
Streams are objects that allow developers to read/write data to and from a source in a continuous manner. There are four main types of streams in Node.js: Readable, Writable, Duplex, and Transform. Each stream is an EventEmitter instance that emits different events at various points.
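As an illustration, a minimal sketch of a custom Transform stream (the uppercasing logic is purely an example):

var Transform = require('stream').Transform;

// A Transform stream is both readable and writable: data written in is
// transformed and then read out the other side.
var upper = new Transform({
  transform: function (chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});

process.stdin.pipe(upper).pipe(process.stdout);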
Take a look at the docs: https://nodejs.org/api/stream.html#stream_event_end
Note that the 'end' event will not fire unless the data is completely consumed. This can be done by switching into flowing mode, or by calling stream.read() repeatedly until you get to the end.
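For example, here is the paused-mode pattern the docs describe, calling stream.read() in a loop; this is a generic sketch applied to the parsed stream above, not the asker's exact setup:

stream.on('readable', function () {
  // Drain everything currently buffered; read() returns null when empty.
  var chunk;
  while ((chunk = stream.read()) !== null) {
    console.log(chunk);
  }
});

stream.on('end', function () {
  // Fires only after every chunk has been consumed.
  console.log('end');
});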