
Convert stream into buffer?

Tags:

node.js

How do I convert a stream into a buffer in Node.js? Here is my code to parse a file in a POST request in Express.

app.post('/upload', express.multipart({ defer: true }), function(req, res) {
  req.form.on('part', function(part) {
    // Here I want to convert the streaming part into a buffer
    // to do some buffer-specific task
    var out = fs.createWriteStream('image/' + part.filename);
    part.pipe(out);
  });

  req.form.on('close', function() {
    res.send('uploaded!');
  });
});
sam100rav asked Nov 11 '13


2 Answers

Instead of piping, you can attach readable and end event handlers to the part stream to read it:

var fs = require('fs');

var buffers = [];

part.on('readable', function() {
  // read() returns null once the internal buffer is drained
  for (;;) {
    var buffer = part.read();
    if (!buffer) { break; }
    buffers.push(buffer);
  }
});

part.on('end', function() {
  var buffer = Buffer.concat(buffers);
  // ...do your stuff...

  // write to file:
  fs.writeFile('image/' + part.filename, buffer, function(err) {
    // handle error, return response, etc...
  });
});

Note: the entire upload ends up in memory either way. The same is true if you collect chunks with data event handlers instead of readable, as in the sketch below.
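For reference, here is a minimal sketch of that data-based variant, assuming the same part stream; like the readable approach above, it accumulates the whole part in memory:

var chunks = [];

part.on('data', function(chunk) {
  // each chunk is a Buffer; collect them as they arrive
  chunks.push(chunk);
});

part.on('end', function() {
  // the whole part is now in memory as a single Buffer
  var buffer = Buffer.concat(chunks);
});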

You could also create a custom transform stream to transform the incoming data, but that might not be trivial.
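For instance, a minimal sketch of such a transform stream; the upper-casing here is just a hypothetical stand-in for whatever transformation you need:

var stream = require('stream');

var transformer = new stream.Transform();
transformer._transform = function(chunk, encoding, done) {
  // hypothetical transformation: upper-case each text chunk
  this.push(chunk.toString().toUpperCase());
  done();
};

// assuming the part and out streams from the question:
// part.pipe(transformer).pipe(out);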

robertklep answered Oct 08 '22

You can use the stream-to module, which can convert a readable stream's data into an array or a buffer:

var streamTo = require('stream-to');

req.form.on('part', function (part) {
    streamTo.buffer(part, function (err, buffer) {
        // Insert your business logic here
    });
});

If you want a better understanding of what's happening behind the scenes, you can implement the logic yourself, using a Writable stream. As a writable stream implementor, you only have to define one function: the _write method, which will be called every time some data is written to the stream. When the input stream has finished emitting data and the writable side is ended, the finish event is emitted: we'll then create a buffer using the Buffer.concat method.

var stream = require('stream');
var converter = new stream.Writable();

// We'll store all the data inside this array
converter.data = [];
converter._write = function (chunk, encoding, done) {
    converter.data.push(chunk);
    done(); // signal that we're ready for the next chunk
};

// Will be emitted when the input stream has ended,
// i.e. no more data will be provided
converter.on('finish', function () {
    // Create a buffer from all the received chunks
    var b = Buffer.concat(this.data);

    // Insert your business logic here
});
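A sketch of how this might be wired into the upload handler from the question (note that each uploaded part would need its own converter instance):

req.form.on('part', function (part) {
    // pipe() calls converter.end() when the part ends,
    // which in turn fires the 'finish' handler above
    part.pipe(converter);
});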
Paul Mougel answered Oct 08 '22