 

How to write incrementally to a text file and flush output

Tags:

file

node.js

My Node.js program - an ordinary command line program that by and large doesn't do anything operationally unusual (nothing system-specific or asynchronous) - needs to write messages to a file from time to time. It will then be interrupted with ^C, and the contents of the file need to still be there.

I've tried using fs.createWriteStream, but that just ends up with a 0-byte file. (The file does contain text if the program ends by running off the end of the main file, but that's not the scenario I have.)

I've tried using winston, but that ends up not creating the file at all. (The file does contain text if the program ends by running off the end of the main file, but that's not the scenario I have.)

And fs.writeFile works perfectly when you have all the text you want to write up front, but doesn't seem to support appending a line at a time.

What is the recommended way to do this?

Edit: specific code I've tried:

var fs = require('fs')

var log = fs.createWriteStream('test.log')
for (var i = 0; i < 1000000; i++) {
    console.log(i)
    log.write(i + '\n')
}

Run for a few seconds, hit ^C, leaves a 0-byte file.

asked Apr 26 '17 by rwallace


3 Answers

It turns out Node provides a lower-level file I/O API that seems to work fine!

var fs = require('fs')

var log = fs.openSync('test.log', 'w')
for (var i = 0; i < 100000; i++) {
    console.log(i)
    fs.writeSync(log, i + '\n')
}
answered Nov 06 '22 by rwallace


NodeJS doesn't work the traditional way. It uses a single thread, so by running a large loop with I/O inside it, you aren't giving it a chance (i.e. releasing the CPU) to perform other async operations, e.g. flushing the memory buffer to the actual file.

The logic must be: do one write, then pass your function (which invokes the write) as a callback to process.nextTick, or as a callback to the write stream's drain event (if the buffer was full during the last write).

Here's a quick-and-dirty version which does what you need. Notice that there are no long-running loops or other CPU blockage; instead I schedule my subsequent writes for the future and return quickly, momentarily freeing up the CPU for other things.

var fs = require('fs')

var log = fs.createWriteStream('test.log');
var i = 0;

function my_write() {
    if (i++ < 1000000) {
        var res = log.write("" + i + "\r\n");
        if (!res) {
            // use once, not on: otherwise a fresh listener
            // accumulates every time the buffer fills up
            log.once('drain', my_write);
        } else {
            process.nextTick(my_write);
        }
        console.log("Done " + i + " " + res + "\r\n");
    }
}

my_write();
answered Nov 06 '22 by Vasan


This function might also be helpful.

/**
 * Write `data` to a `stream`. If the buffer is full, waits
 * until it has drained and the stream is ready to be written again.
 * [see](https://nodejs.org/api/stream.html#stream_writable_write_chunk_encoding_callback)
 */
export function write(data, stream) {
  return new Promise((resolve, reject) => {
    if (stream.write(data)) {
      process.nextTick(resolve);
    } else {
      stream.once("drain", () => {
        stream.off("error", reject);
        resolve();
      });
      stream.once("error", reject);
    }
  });
}
answered Nov 06 '22 by Safareli