The most common way to write data from a list to a CSV file is the writerow() method of the writer and DictWriter classes. Example 1: creating a CSV file and writing data into it row-wise using the writer class.
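A minimal sketch of that row-wise pattern (the file name, header, and rows here are illustrative, not from the original):

import csv

# illustrative header and data rows
fields = ['Name', 'Age']
rows = [['Alice', 30], ['Bob', 25]]

with open('people.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerow(fields)       # write the header row
    for row in rows:
        writer.writerow(row)      # write each data row individually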
writerows(): this method takes a list of iterables as a parameter and writes each item as a comma-separated line of items in the file.
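For instance, the explicit loop above can be replaced with a single writerows() call (again with illustrative data):

import csv

rows = [['Alice', 30], ['Bob', 25]]

with open('people.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerows(rows)    # one comma-separated line per inner iterable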
By using the pandas.DataFrame.to_csv() method you can write/save/export a pandas DataFrame to a CSV file. By default, to_csv() exports the DataFrame to a CSV file with a comma delimiter and the row index as the first column.
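A short sketch of that default behaviour (column names and file names are illustrative); pass index=False if you do not want the row index in the output:

import pandas as pd

df = pd.DataFrame({"a": [1, 2], "b": [3, 4]})
df.to_csv("out.csv")                     # default: comma delimiter, row index as first column
df.to_csv("no_index.csv", index=False)   # drop the row index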
import csv

with open(..., 'wb') as myfile:
    wr = csv.writer(myfile, quoting=csv.QUOTE_ALL)
    wr.writerow(mylist)
Edit: this only works with Python 2.x. To make it work with Python 3.x, replace 'wb' with 'w' and pass newline='' (see this SO answer):
with open(..., 'w', newline='') as myfile:
    wr = csv.writer(myfile, quoting=csv.QUOTE_ALL)
    wr.writerow(mylist)
Here is a secure version of Alex Martelli's answer, using a with statement so the file is closed even if an exception occurs:
import csv

# 'w' with newline='' is the Python 3 form; use 'wb' on Python 2
with open('filename', 'w', newline='') as myfile:
    wr = csv.writer(myfile, quoting=csv.QUOTE_ALL)
    wr.writerow(mylist)
For another approach, you can use a DataFrame in pandas, which can easily dump the data to CSV, as in the code below:
import pandas

# list_1 and list_2 are the lists you want to save as columns
df = pandas.DataFrame(data={"col1": list_1, "col2": list_2})
df.to_csv("./file.csv", sep=',', index=False)
The best option I've found was using savetxt from the numpy module:
import numpy as np

# data1 is the list to save; header is a string written at the top of
# the file (prefixed with '# ' by default)
np.savetxt("file_name.csv", data1, delimiter=",", fmt='%s', header=header)
In case you have multiple lists that need to be stacked as columns:
np.savetxt("file_name.csv", np.column_stack((data1, data2)), delimiter=",", fmt='%s', header=header)
Use Python's csv module for reading and writing comma- or tab-delimited files. The csv module is preferred because it gives you good control over quoting.
Here is a worked example:
import csv

data = ["value %d" % i for i in range(1, 4)]
with open("myfile.csv", "w", newline="") as f:
    out = csv.writer(f, delimiter=',', quoting=csv.QUOTE_ALL)
    out.writerow(data)
Produces:
"value 1","value 2","value 3"