I'm trying to use mongoimport to upsert data with string values in _id. Since the ids look like integers (even though they're in quotes), mongoimport treats them as integers and creates new records instead of upserting the existing records.
Command I'm running:
mongoimport --host localhost --db database --collection my_collection --type csv --file mydata.csv --headerline --upsert
Example data in mydata.csv (shown here as the document it should map to):
{ "_id" : "0364", someField: "value" }
The result is that mongoimport inserts a record like this:
{ "_id" : 364, someField: "value" }
instead of updating the record with _id "0364".
Does anyone know how to make it treat the _id values as strings?
Things that don't work:
{ "_id" : "0364" + "", someField: "value" }
If you have CSV files (or TSV files, which are conceptually the same) to import, use the --type=csv or --type=tsv option to tell mongoimport which format to expect. It's also important to know whether your CSV file has a header row, where the first line contains the column names rather than data.
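For example, a TSV import with a header row might look like this (the file name here is illustrative):

mongoimport --host localhost --db database --collection my_collection --type tsv --file mydata.tsv --headerline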
CSV Files Without Column Headers
In the previous example, we used the --headerline parameter to specify that the first line should be used for the field names. If your CSV file doesn't contain a header line, you'll need to use either the --fields parameter or the --fieldFile parameter to specify the field names.
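For instance, an import without a header row might name the fields inline (the field names here are illustrative):

mongoimport --host localhost --db database --collection my_collection --type csv --file mydata.csv --fields "_id,someField" --upsert

Alternatively, --fieldFile names.txt points at a file listing one field name per line.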
Unfortunately, there is currently no way to force number-like strings to be interpreted as strings; the relevant feature request is here:
https://jira.mongodb.org/browse/SERVER-3731
You could write a script in Python or some other language with which you're comfortable, along the lines of:
import csv
import pymongo

# MongoClient replaces the removed pymongo.Connection class
client = pymongo.MongoClient()
collection = client.mydatabase.mycollection

with open('myfile.csv', newline='') as f:
    reader = csv.DictReader(f)
    for line in reader:
        print('_id', line['_id'])
        # csv leaves every value as a string, so "0364" is preserved exactly
        upsert_fields = {
            '_id': line['_id'],
            'my_other_upsert_field': line['my_other_upsert_field']}
        # replace_one(..., upsert=True) supersedes update(..., upsert=True, safe=True);
        # writes are acknowledged by default in modern pymongo
        collection.replace_one(upsert_fields, line, upsert=True)
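To sanity-check the result, you can look the document up by its string _id afterwards (same collection as above):

# The string form should match the upserted document...
print(collection.find_one({'_id': '0364'}))
# ...and the integer form should return None
print(collection.find_one({'_id': 364}))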