I'm looking for an efficient way to take a raw SQL file and have it executed synchronously against a Postgres database, akin to running it through psql.
I have an SQL file which creates all databases, imports data, etc. I need to execute this using node.js, but I cannot find any module which does this automatically. For the node.js application itself we use node-postgres ('pg'), knex.js and bookshelf.js. I assume, though, that pg is best suited for this.
One alternative I can think of is to read the full file, split it on semicolons, replace newlines with spaces, trim any duplicate whitespace, then feed the statements into pg one by one so that they're executed sequentially, not concurrently. I'd be a little surprised if this were truly the most efficient way, and also if no library existed yet to solve this. I'm also hesitant to jump into it, seeing as SQL syntax can itself be a little challenging and I might accidentally mangle it.
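For illustration, here is a minimal sketch of that split-and-run-sequentially idea using node-postgres with async/await. The runSqlFile helper is hypothetical, and it naively assumes no semicolons appear anywhere except at the end of statements (so no ';' inside string literals or dollar-quoted function bodies):

const fs = require('fs');
const { Client } = require('pg');

// Naive sketch: split the file on semicolons and run each statement
// one after another. Breaks if ';' appears inside a string literal
// or a dollar-quoted function body.
async function runSqlFile(fileName, connectionString) {
  const client = new Client({ connectionString });
  await client.connect();
  try {
    const statements = fs.readFileSync(fileName, 'utf8')
      .split(';')
      .map((s) => s.trim())
      .filter((s) => s.length > 0);
    for (const statement of statements) {
      await client.query(statement); // strictly sequential
    }
  } finally {
    await client.end();
  }
}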
Some clarifications in advance:
psql cannot be used, as it's not installed on the target machine.
You can just separate consecutive queries with a semicolon when passing them to client.query.

This works:
var pg = require('pg');

pg.connect('postgres://test:test@localhost/test', function (err, client, done) {
  client.query('CREATE TABLE test (test VARCHAR(255)); INSERT INTO test VALUES(\'test\')');
  done();
});
And consequently, that works too:
var pg = require('pg');
var fs = require('fs');

var sql = fs.readFileSync('init_database.sql').toString();

pg.connect('postgres://test:test@localhost/test', function (err, client, done) {
  if (err) {
    console.log('error: ', err);
    process.exit(1);
  }
  client.query(sql, function (err, result) {
    done();
    if (err) {
      console.log('error: ', err);
      process.exit(1);
    }
    process.exit(0);
  });
});
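Worth noting: the pg.connect API used above was deprecated and later removed from node-postgres. On current versions, the same whole-file approach looks roughly like this (a sketch, assuming the same init_database.sql and connection string):

const fs = require('fs');
const { Client } = require('pg');

async function main() {
  const sql = fs.readFileSync('init_database.sql', 'utf8');
  const client = new Client({
    connectionString: 'postgres://test:test@localhost/test',
  });
  await client.connect();
  try {
    // A single query() call may contain multiple ';'-separated
    // statements; the server executes them in order.
    await client.query(sql);
  } finally {
    await client.end();
  }
}

main().catch((err) => {
  console.error('error: ', err);
  process.exit(1);
});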
I've written the following function which works for my case. It would have been much simpler if it weren't for the COPY statements, which need special treatment, and for batch, which is used to manage concurrency.

Code snippet:
var fs = require('fs');

// `client` and `batch` are assumed to be defined in the enclosing
// scope: a connected pg client and a queue of tasks run one at a time.
function processSQLFile(fileName) {
  // Extract SQL queries from files. Assumes no ';' in the fileNames
  var queries = fs.readFileSync(fileName).toString()
    .replace(/(\r\n|\n|\r)/gm, ' ')  // remove newlines
    .replace(/\s+/g, ' ')            // excess white space
    .split(';')                      // split into all statements
    .map(Function.prototype.call, String.prototype.trim)
    .filter(function (el) { return el.length != 0; }); // remove any empty ones

  // Execute each SQL query sequentially
  queries.forEach(function (query) {
    batch.push(function (done) {
      if (query.indexOf('COPY') === 0) {
        // COPY - needs special treatment
        var regexp = /COPY\ (.*)\ FROM\ (.*)\ DELIMITERS/gmi;
        var matches = regexp.exec(query);
        var table = matches[1];
        var fileName = matches[2];
        var copyString = "COPY " + table + " FROM STDIN DELIMITERS ',' CSV HEADER";
        var stream = client.copyFrom(copyString);
        stream.on('close', function () {
          done();
        });
        var csvFile = __dirname + '/' + fileName;
        var str = fs.readFileSync(csvFile);
        stream.write(str);
        stream.end();
      } else {
        // Other queries don't need special treatment
        client.query(query, function (err, result) {
          done();
        });
      }
    });
  });
}
Beware that this would fail if you used semicolons anywhere except to terminate SQL statements.
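A side note on the COPY branch: client.copyFrom was later removed from pg core and now lives in the separate pg-copy-streams package. Under that assumption, a rough sketch of the equivalent today (copyCsvIntoTable, its table and file names are placeholders):

const fs = require('fs');
const { pipeline } = require('stream/promises');
const { from: copyFrom } = require('pg-copy-streams');

// Streams a local CSV file into a table via COPY ... FROM STDIN.
// `client` is an already-connected pg client; the table name is
// interpolated directly into the query, so the caller must be trusted.
async function copyCsvIntoTable(client, table, csvPath) {
  const ingest = client.query(
    copyFrom('COPY ' + table + ' FROM STDIN WITH (FORMAT csv, HEADER)')
  );
  // pipeline() resolves once the file has been fully sent to Postgres
  await pipeline(fs.createReadStream(csvPath), ingest);
}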