
How to do Bulk insert using Sequelize and node.js

I am using Node.js + Sequelize to insert 280K rows of data from JSON. The JSON is an array of 280K objects. Is there a way to do the bulk insert in chunks? I am seeing that it takes a lot of time to insert the data. When I cut the data down to 40K rows it works quickly. Am I taking the right approach? Please advise. I am using PostgreSQL as the backend.

PNs.bulkCreate(JSON_Small)
        .catch(function(err) {
            console.log('Error ' + err);
        })
        .finally(function() { // finally receives no arguments
            console.log('FINISHED\n+++++++\n');
        });
asked Apr 05 '15 by Uma Maheshwaraa
People also ask

How do I add bulk data to Sequelize?

When you need to insert multiple rows to your SQL database table, you can use the Sequelize bulkCreate() method. The bulkCreate() method allows you to insert multiple records to your database table with a single function call.
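For example, a minimal call with an options object (the User model here is illustrative; validate: true optionally runs model validations on each row before inserting):

// Illustrative model name; assumes a User model is already defined.
User.bulkCreate(
  [
    { username: 'alice' },
    { username: 'bob' }
  ],
  { validate: true } // optional: validate each row before inserting
).then(function(users) {
  console.log('Inserted ' + users.length + ' rows');
});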

How do I insert a Sequelize?

To insert new rows into your SQL table, you need to first create a Sequelize Model object. In Sequelize, a Model is an object that represents your SQL table, giving information to Sequelize about the name, the columns, and their data types.
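A minimal sketch of defining such a Model against PostgreSQL (the connection string and column names are placeholders, loosely matching the PNs model from the question):

var Sequelize = require('sequelize');

// Placeholder connection string; point it at your own database.
var sequelize = new Sequelize('postgres://user:pass@localhost:5432/mydb');

// A Model maps a table name and its columns/types for Sequelize.
var PNs = sequelize.define('PN', {
  name: Sequelize.STRING,
  value: Sequelize.INTEGER
});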


2 Answers

You can use Sequelize's built-in bulkCreate method to achieve this.

User.bulkCreate([
  { username: 'barfooz', isAdmin: true },
  { username: 'foo', isAdmin: true },
  { username: 'bar', isAdmin: false }
]).then(() => {
  // bulkCreate doesn't hand the inserted rows to this callback,
  // so as of right now you have to query them again...
  return User.findAll();
}).then(users => {
  console.log(users); // ...to get the array of user objects
});

Sequelize | Bulk Create and Update
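Since the question is specifically about inserting 280K rows in chunks, here is a minimal sketch (not part of the answer above) that slices the array and runs one bulkCreate per chunk, reusing the PNs model and JSON_Small array from the question; the chunk size of 5000 is an arbitrary starting point to tune:

function bulkInsertInChunks(model, rows, chunkSize) {
  // Chain one bulkCreate per slice so chunks are inserted serially.
  var chain = Promise.resolve();
  for (var i = 0; i < rows.length; i += chunkSize) {
    (function (chunk) {
      chain = chain.then(function () {
        return model.bulkCreate(chunk);
      });
    })(rows.slice(i, i + chunkSize));
  }
  return chain;
}

// e.g. insert the 280K-row array 5000 rows at a time
bulkInsertInChunks(PNs, JSON_Small, 5000)
  .then(function () {
    console.log('FINISHED');
  })
  .catch(function (err) {
    console.log('Error ' + err);
  });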

answered Sep 21 '22 by Clement

I used the cargo utility of the async library to insert up to 1000 rows at a time. The following code loads a CSV into a database:

var fs = require('fs'),
    async = require('async'),
    csv = require('csv');

var input = fs.createReadStream(filename);
var parser = csv.parse({
  columns: true, // treat the first row as column names
  relax: true
});

// cargo queues up parsed rows and hands them to the worker
// in batches of up to 1000
var inserter = async.cargo(function(tasks, inserterCallback) {
    model.bulkCreate(tasks).then(function() {
        inserterCallback();
    });
  },
  1000
);

parser.on('readable', function () {
  var line;
  while ((line = parser.read())) {
    inserter.push(line);
  }
});

parser.on('end', function () {
  // fires after the last batch has finished inserting
  inserter.drain = function() {
    doneLoadingCallback();
  };
});

input.pipe(parser);
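Note that filename, model, and doneLoadingCallback are free variables in the snippet above, defined elsewhere; hypothetical wiring might look like:

// Hypothetical values for the free variables used above.
var filename = 'data.csv'; // path to the CSV file to load
var model = PNs;           // the Sequelize model to insert into

function doneLoadingCallback() {
  console.log('All rows inserted');
}

Also, assigning a function to inserter.drain matches the async versions current when this was written; in async v3 the drain handler is registered by calling inserter.drain(fn) instead.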
answered Sep 18 '22 by Evan Siroky