I have a list of about 1 million addresses and a function that looks up their latitudes and longitudes. Since some of the records are improperly formatted (or fail for other reasons), the function sometimes cannot return coordinates, which would break the for loop. So, for each address whose latitude and longitude are successfully retrieved, I want to write a row to the output CSV file. Alternatively, instead of writing line by line, writing in small chunks would also work. For this, I am using df.to_csv in append mode (mode='a') as shown below:
for i in range(len(df)):
    place = df['ADDRESS'][i]
    try:
        lat, lon, res = gmaps_geoencoder(place)
    except:
        pass
    df['Lat'][i] = lat
    df['Lon'][i] = lon
    df['Result'][i] = res
    df.to_csv(output_csv_file,
              index=False,
              header=False,
              mode='a',  # append data to CSV file
              chunksize=chunksize)  # size of data to append for each loop
But the problem with this is that it writes the whole dataframe on every append. So for n rows, it writes the dataframe n times, producing n² rows in total (e.g. a 3-row frame written on each of its 3 iterations yields 9 rows). How can I fix this?
pandas.DataFrame.to_csv() writes a DataFrame to CSV: pass it a file path (or file object) to write to disk, or omit the path to get the CSV data back as a string. By default it uses a comma delimiter and writes the row index as the first column; index=False suppresses the index. (You could also open the file yourself and call write(str), separating lines with \n, but to_csv handles quoting and formatting for you.)
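As a quick illustration of those defaults (demo here is a made-up two-row frame):

import pandas as pd

demo = pd.DataFrame({"ADDRESS": ["1 Main St", "2 Oak Ave"]})

# With no path argument, to_csv() returns the CSV data as a string;
# by default the row index appears as the first column.
print(demo.to_csv())
# ,ADDRESS
# 0,1 Main St
# 1,2 Oak Ave

# index=False drops the index column.
print(demo.to_csv(index=False))
# ADDRESS
# 1 Main St
# 2 Oak Ave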
If you really want to write line by line (you shouldn't), select only the current row before calling to_csv:

for i in range(len(df)):
    df.loc[[i]].to_csv(output_csv_file,
                       index=False,
                       header=False,
                       mode='a')
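A more direct fix, though, is to stop appending the whole frame at all: handle each geocoding failure explicitly and append only the rows just processed, buffered into small chunks as the question suggests. A minimal sketch along those lines, assuming your gmaps_geoencoder returns a (lat, lon, res) tuple and raises an exception on failure (CHUNK is a hypothetical buffer size):

import pandas as pd

CHUNK = 1000  # hypothetical buffer size; tune to taste
buffer = []

for i in range(len(df)):
    place = df['ADDRESS'][i]
    try:
        lat, lon, res = gmaps_geoencoder(place)
    except Exception:
        continue  # skip rows that fail, instead of silently reusing stale lat/lon

    buffer.append({'ADDRESS': place, 'Lat': lat, 'Lon': lon, 'Result': res})

    # Flush the buffer to disk every CHUNK successful rows.
    if len(buffer) >= CHUNK:
        pd.DataFrame(buffer).to_csv(output_csv_file, index=False,
                                    header=False, mode='a')
        buffer.clear()

# Write whatever is left after the loop ends.
if buffer:
    pd.DataFrame(buffer).to_csv(output_csv_file, index=False,
                                header=False, mode='a')

Each successful row is written exactly once, failed lookups are skipped rather than carrying over the previous iteration's values, and the chunked flush keeps the number of file writes down.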