Read a text file using Node.js?


You'll want to use the process.argv array to get the filename from the command-line arguments, and the filesystem module (fs) to read the file. For example:

// Make sure we got a filename on the command line.
if (process.argv.length < 3) {
  console.log('Usage: node ' + process.argv[1] + ' FILENAME');
  process.exit(1);
}
// Read the file and print its contents.
var fs = require('fs')
  , filename = process.argv[2];
fs.readFile(filename, 'utf8', function(err, data) {
  if (err) throw err;
  console.log('OK: ' + filename);
  console.log(data);
});

To break that down a little: process.argv will usually have length two, the zeroth item being the "node" interpreter and the first being the script that node is currently running; items after that were passed on the command line. Once you've pulled a filename from argv, you can use the filesystem functions to read the file and do whatever you want with its contents. Sample usage would look like this:

$ node ./cat.js file.txt
OK: file.txt
This is file.txt!

[Edit] As @wtfcoder mentions, using the "fs.readFile()" method might not be the best idea because it will buffer the entire contents of the file before yielding it to the callback function. This buffering could potentially use lots of memory but, more importantly, it does not take advantage of one of the core features of node.js - asynchronous, evented I/O.

The "node" way to process a large file (or any file, really) would be to use fs.read() and process each chunk as it becomes available from the operating system. However, reading the file this way means doing your own (possibly incremental) parsing/processing of the file, and some amount of buffering may be inevitable.
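For example, here is a minimal sketch of that streaming approach using fs.createReadStream (a higher-level wrapper around the fs.read() machinery), reusing the command-line filename handling from the cat.js example above:

var fs = require('fs'),
    filename = process.argv[2];

var stream = fs.createReadStream(filename, { encoding: 'utf8' });
stream.on('data', function(chunk) {
    // Each chunk is handed over as soon as the OS delivers it; chunks can
    // split lines arbitrarily, so line-oriented parsing has to buffer any
    // partial line itself.
    process.stdout.write(chunk);
});
stream.on('end', function() {
    console.log('Done reading ' + filename);
});
stream.on('error', function(err) {
    console.error('Error:', err);
});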


Using fs with node:

var fs = require('fs');

try {
    // 'utf8' makes readFileSync return a string rather than a Buffer.
    var data = fs.readFileSync('file.txt', 'utf8');
    console.log(data);
} catch (e) {
    console.log('Error:', e.stack);
}

IMHO, fs.readFile() should be avoided because it loads the whole file into memory and won't call the callback until the entire file has been read.

The easiest way to read a text file is to read it line by line. I recommend a BufferedReader (here from the third-party buffered-reader npm module):

// BufferedReader comes from the third-party buffered-reader package
// (npm install buffered-reader).
var BufferedReader = require ("buffered-reader");

new BufferedReader ("file", { encoding: "utf8" })
    .on ("error", function (error){
        console.log ("error: " + error);
    })
    .on ("line", function (line){
        console.log ("line: " + line);
    })
    .on ("end", function (){
        console.log ("EOF");
    })
    .read ();

For complex data structures like .properties or JSON files you need to use a parser (internally it should also use a buffered reader).
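For JSON, no extra module is needed, since JSON.parse is built in. A minimal sketch, assuming a hypothetical file named config.json:

var fs = require('fs');

fs.readFile('config.json', 'utf8', function(err, data) {
    if (err) throw err;
    // Parse the whole document once it has been fully read;
    // JSON cannot safely be parsed line by line.
    var config = JSON.parse(data);
    console.log(config);
});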


You can use a read stream and pipe to read the file line by line, without reading the whole file into memory at once.

var fs = require('fs'),
    es = require('event-stream');

// Hypothetical input file; substitute your own path.
var path = 'file.txt';

var s = fs.createReadStream(path)
    .pipe(es.split())
    .pipe(es.mapSync(function(line) {
        // Pause the readstream while the line is processed
        // (mainly useful when the per-line work is asynchronous).
        s.pause();
        console.log("line:", line);
        s.resume();
    })
    .on('error', function(err) {
        console.log('Error:', err);
    })
    .on('end', function() {
        console.log('Finish reading.');
    })
);
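Note that event-stream is a third-party module (npm install event-stream); es.split() splits the incoming stream on newlines, so each mapSync callback receives exactly one line.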