How to import a large JSON file into MongoDB using mongoimport?

I am trying to import a large JSON data set file into MongoDB using mongoimport:

mongoimport --db test --collection sam1 --file 1234.json --jsonArray

error:
2014-07-02T15:57:16.406+0530 error: object to insert too large
2014-07-02T15:57:16.406+0530 tried to import 1 objects
asked Jul 02 '14 by naveen_sfx

People also ask

How do I import data using mongoimport?

If you have CSV files (or TSV files - they're conceptually the same) to import, use the --type=csv or --type=tsv option to tell mongoimport what format to expect. It is also important to know whether your CSV file has a header row, where the first line contains the name of each column rather than data.

Can we import JSON in MongoDB?

The process of importing JSON into MongoDB depends on the operating system and the programming language you are using. However, the key steps are connecting to the MongoDB database and parsing the file that you want to import. You can then go through each document sequentially and insert it into MongoDB.
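A minimal sketch of that programmatic approach in Python. The helper name, the pymongo usage, and the database/collection/file names are assumptions for illustration, not part of the original question:

```python
import json

def iter_documents(path):
    """Parse a JSON file and yield its documents one at a time.

    Like mongoimport's --jsonArray mode, this expects the file to
    contain a top-level JSON array; a single object is wrapped in a
    one-element list so the caller always sees individual documents.
    """
    with open(path) as f:
        data = json.load(f)
    if not isinstance(data, list):
        data = [data]
    for doc in data:
        yield doc

# Hypothetical usage with pymongo, assuming a local mongod is running:
# from pymongo import MongoClient
# collection = MongoClient()["test"]["sam1"]
# for doc in iter_documents("1234.json"):
#     collection.insert_one(doc)  # insert each document sequentially
```

Inserting one document at a time trades throughput for predictability, which mirrors what --batchSize 1 does on the command line.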

Which parameter do you have to use while importing a CSV file into MongoDB using Mongoimport command?

CSV files without column headers: in the previous example, we used the --headerline parameter to specify that the first line should be used for the field names. If your CSV file doesn't contain a header line, you'll need to use either the --fields parameter or the --fieldFile parameter to specify the field names.


1 Answer

Try adding the --batchSize 1 option.

Like:

mongoimport --db test --collection sam1 --file 1234.json --batchSize 1

The data will then be parsed and inserted into the database in batches of a single document, which avoids building an insert batch that exceeds the maximum allowed size.

answered Sep 24 '22 by Juliano Galgaro