Basically I have a file of strings and I need to query against mongo to see if they exist in our database.
So I need to loop through the file, run a find query against mongo with each string, and then look at the results and increment some counters I have.
I am trying to do this using a shell script that calls mongo with the --eval option, but it's running slow. It's been over an hour and it hasn't finished 120,000 queries. I am thinking it would be faster if I could do it in a JavaScript file so it doesn't have to make a new connection for each query.
Thanks for any suggestions!
Karen
You can use the mongoimport command to import CSV files into a collection with the --headerline option. The --headerline option tells mongoimport to treat the first line of the file as field names rather than importing it as a document.
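For example (the database, collection, and file names here are just placeholders), an import would look something like this:

mongoimport --db mydb --collection records --type csv --headerline --file /home/desktop/myfile.csv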
Note that you can use the mongo shell as a full-featured JavaScript interpreter and as a MongoDB client.
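For instance, you can run a script file against your database in a single shell invocation; the sketch below (myscript.js and the database name mydb are placeholder names) just prints a count to show ordinary JavaScript and client calls working side by side:

// myscript.js -- run with: mongo mydb myscript.js
var total = db.records.count();                 // a normal MongoDB client call
print('Total documents in records: ' + total);  // plain JavaScript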
For your use case, a better option is the native shell method cat().
Let's say your file is located at /home/desktop/myfile.txt and has the following content:
row1
row2
row3
row4
The following code reads the file content into a string and splits it into an array of lines:
> var string = cat('/home/desktop/myfile.txt');   // read the whole file into a single string
> string = string.split('\n');                    // split it into an array of lines
> db.records.find({field: {$in: string}});        // match documents whose field equals any line
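Since you also want to increment counters, the variation below checks each string individually and counts how many are present and how many are missing. It is only a sketch: it assumes the same records collection and field name used above, and it filters out the empty entry that a trailing newline would leave behind.

var lines = cat('/home/desktop/myfile.txt').split('\n').filter(function (line) {
    return line.length > 0;                       // drop the empty entry from a trailing newline
});
var found = 0, missing = 0;
lines.forEach(function (value) {
    // findOne() returns null when no document matches
    if (db.records.findOne({field: value}) !== null) {
        found++;
    } else {
        missing++;
    }
});
print('found: ' + found + ', missing: ' + missing);

You can paste these lines into an interactive shell session, or save them in a script file and run them the same way as the earlier sketch, so all 120,000 lookups share one connection instead of starting mongo with --eval for each line.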