I am trying to learn the basics of event-driven programming. As an exercise, I am writing a program that reads a large binary file and does something with it, without ever making a blocking call. I have come up with the following:
var fs = require('fs');

var BUFFER_SIZE = 1024;
var path_of_file = "somefile";

fs.open(path_of_file, 'r', (error_opening_file, fd) =>
{
    if (error_opening_file)
    {
        console.log(error_opening_file.message);
        return;
    }
    // Buffer.alloc replaces the deprecated `new Buffer(size)` constructor
    var buffer = Buffer.alloc(BUFFER_SIZE);
    fs.read(fd, buffer, 0, BUFFER_SIZE, 0, (error_reading_file, bytesRead, buffer) =>
    {
        if (error_reading_file)
        {
            console.log(error_reading_file.message);
            return;
        }
        // do something, e.g. print or write to another file
    });
});
I know I need some kind of loop to read the complete file, but the code above only reads the first 1024 bytes, and I cannot work out how to keep reading without resorting to a blocking loop. How can I do it?
The Buffer class in Node.js is designed to handle raw binary data. Each buffer corresponds to some raw memory allocated outside V8. Buffers act somewhat like arrays of integers, but they aren't resizable and have a whole bunch of methods specifically for binary data.
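For example, here is a minimal sketch of allocating a buffer and reading/writing binary values through it (the values are arbitrary, chosen just to show the byte layout):

var buf = Buffer.alloc(8);                      // 8 bytes of zero-filled memory outside the V8 heap
buf.writeUInt32BE(0xdeadbeef, 0);               // write a 32-bit big-endian integer at offset 0
console.log(buf);                               // <Buffer de ad be ef 00 00 00 00>
console.log(buf.readUInt32BE(0).toString(16));  // 'deadbeef'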
Use fs.createReadStream instead. The stream will keep invoking your 'data' callback with successive chunks until the whole file has been read, so nothing ever blocks.
var fs = require('fs');

var readStream = fs.createReadStream('./test.exe');
readStream.on('data', function (chunk) {
    // fires once per chunk until the whole file has been read
    console.log(chunk.length);
});
readStream.on('end', function () {
    console.log('finished reading');
});
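If you want to stay closer to your original fs.open/fs.read approach, the non-blocking way to "loop" is to kick off the next read from inside the previous read's callback. Here is a rough sketch of that pattern (the readChunk helper is just an illustrative name, and 'somefile' stands in for your path):

var fs = require('fs');
var BUFFER_SIZE = 1024;

fs.open('somefile', 'r', function (error_opening_file, fd)
{
    if (error_opening_file)
    {
        console.log(error_opening_file.message);
        return;
    }
    var buffer = Buffer.alloc(BUFFER_SIZE);
    var position = 0;
    function readChunk()
    {
        fs.read(fd, buffer, 0, BUFFER_SIZE, position, function (error_reading_file, bytesRead)
        {
            if (error_reading_file)
            {
                console.log(error_reading_file.message);
                return;
            }
            if (bytesRead === 0)
            {
                // a read of 0 bytes means end of file
                fs.close(fd, function () {});
                return;
            }
            position += bytesRead;
            // do something with buffer.slice(0, bytesRead), then schedule the next read
            readChunk();
        });
    }
    readChunk();
});

Each fs.read returns immediately and hands control back to the event loop, so nothing blocks; the "loop" is driven entirely by callbacks. This is essentially what createReadStream does for you internally.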