I'm trying to read a binary file with fs.createReadStream. Given the well-known confusion between 'binary' and 'latin1' as values for the encoding option, and the fact that by default calling toString() on a data chunk uses utf-8, I have tried to use a Buffer and buffer concatenation (the Buffer.concat function) in this way:
var readStream = fs.createReadStream( graphPath );
var data;
readStream.on('data', (chunk) => {
    if (!data) data = chunk;
    else data = Buffer.concat([data, chunk]);
})
readStream.on('end', () => {
    console.log("type is", typeof data);
    console.log("read graph %d", data.length);
});
so as to append each new chunk to data. This way, typeof data is object, but it seems the data is still not in a binary format (i.e. it is not application/octet-stream; charset=binary, to put it in content-type terms).
If I instead do data += chunk, the type of the output will be string.
If the data coming out of the file is already in 'binary' encoding, then set the readStream encoding to 'binary' using setEncoding(), concatenate all of the chunks into a single string, and return that. setEncoding() supports any of the encodings implemented in Node; see the Buffer documentation on encodings.
const fs = require('fs')

const getBinary = (graphPath, asBuffer = false, cb) => {
  let readStream = fs.createReadStream(graphPath)
let data = ''
// set stream encoding to binary so chunks are kept in binary
readStream.setEncoding('binary')
readStream.once('error', err => {
return cb(err)
})
readStream.on('data', chunk => (data += chunk))
readStream.on('end', () => {
// If you need the binary data as a Buffer
// create one from data chunks
return cb(null, asBuffer ? Buffer.from(data, 'binary') : data)
})
}