I'm using Python to save the data row by row... but this is extremely slow!
The CSV contains 70 million lines, and my script can only store about 1,000 rows per second.
This is what my script looks like:

import csv

with open('test_results.csv') as f:
    reader = csv.reader(f)
    for row in reader:
        TestResult(type=row[0], name=row[1], result=row[2]).save()  # one INSERT per row
I reckon that for testing I might have to consider MySQL or PostgreSQL.
Any ideas or tips? This is the first time I've dealt with such a massive volume of data. :)
Instead of inserting row by row from Python, let the database import the CSV directly. Most MySQL GUI clients have an import wizard: first, open the import tool from the menu; second, choose the database and table you want to import the data into and click Next; third, choose CSV as the data source type, select the CSV file in the Input file field, and choose ,(comma) as the Field separator.
In the Format list, select CSV. If the CSV file is delimited by a character other than a comma, or has other peculiarities, you can adjust that under the format-specific options. Click Go to start the import, and the data will be loaded into MySQL.
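Under the hood, a GUI CSV import like this typically ends up running MySQL's LOAD DATA statement, which you can also issue yourself from the mysql client and which is far faster than row-by-row INSERTs. A minimal sketch, assuming a table named test_results with columns type, name, result (taken from the fields in your model) and that LOCAL INFILE is enabled on your server:

LOAD DATA LOCAL INFILE '/path/to/test_results.csv'
INTO TABLE test_results
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(type, name, result);  -- column list must match the order of the columns in the CSV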
For MySQL imports:
mysqlimport [options] db_name textfile1 [textfile2 ...]
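For example, a rough invocation could look like the following; some_user, db_name, and the file path are placeholders, and mysqlimport loads each file into the table whose name matches the file name (so test_results.csv goes into the test_results table):

# placeholders: some_user, db_name, /path/to/test_results.csv
# add --ignore-lines=1 if the CSV has a header row
mysqlimport --local -u some_user -p \
  --fields-terminated-by=',' --fields-optionally-enclosed-by='"' \
  --lines-terminated-by='\n' \
  db_name /path/to/test_results.csv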
For SQLite3 imports:
See the question "How to import load a .sql or .csv file into SQLite?" for the details.
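In short, the sqlite3 shell's .import command does the same job; a quick sketch, where results.db and the table name test_results are assumptions:

sqlite3 results.db
sqlite> .mode csv
sqlite> .import test_results.csv test_results

The .mode csv line matters: it tells the shell to parse the file as CSV before .import copies it into the table.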