I am trying to scrape data from a website and have collected three different types of information from it. I have thousands of records in the three lists, but for simplicity I am adding only a few records.
List1 = ['DealerName']
List2 = ['Person1','Person2']
List3 = ['crisp nori, hot rice, and cold fish','takeout,fancy presentation, piled']
I have to write an output CSV file line by line with three columns (List1, List2, List3) filled from the list information. The 'DealerName' is constant for all records. I am facing trouble because the commas in List3 are splitting the information into separate columns (different cells). The desired output file should look like this:
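For reference, Python's csv module already handles this case: csv.writer automatically quotes any field that contains the delimiter, so embedded commas stay in one cell. A quick sketch using one of the sample values (writing to an in-memory buffer rather than a file):

```python
import csv
import io

# Write a single row where the third field contains commas.
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(['DealerName', 'Person1', 'crisp nori, hot rice, and cold fish'])

# The writer wraps the comma-containing field in quotes,
# so a CSV reader will see it as one cell, not four.
print(buf.getvalue())
```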
Thanks for the comments. Based on one of them, I made some modifications and tried the following code, but it's not giving me the desired output.
List1 = ['DealerName']
List2 = ['Person1','Person2']
List3 = ['crisp nori, hot rice, and cold fish','takeout,fancy presentation, piled']
Output_File = open("Output.csv", "w")
Output_File.write("List1,List2,List3")
import csv, itertools
rows = itertools.zip_longest([List1, List2, List3])
c = csv.writer(Output_File)
c.writerows(rows)
Output_File.close()
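For what it's worth, the main bug in the code above is that the three lists are passed to itertools.zip_longest() wrapped in a single list, so it iterates over that one list and yields 1-tuples, each holding an entire list, rather than 3-column rows. A quick check with the sample data:

```python
import itertools

List1 = ['DealerName']
List2 = ['Person1', 'Person2']
List3 = ['crisp nori, hot rice, and cold fish', 'takeout,fancy presentation, piled']

# Wrapping the lists in [...] passes ONE iterable, so each "row"
# is a 1-tuple containing a whole list -- not a 3-column row.
rows = list(itertools.zip_longest([List1, List2, List3]))
print(rows[0])  # (['DealerName'],)
```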
In this particular case (in other words, not in the most general sense), specifying the first element of List1 as the fillvalue argument when calling itertools.zip_longest() looks like it would make it work:
import csv, itertools
List1 = ['DealerName']
List2 = ['Person1','Person2']
List3 = ['crisp nori, hot rice, and cold fish', 'takeout,fancy presentation, piled']
with open("Output.csv", "w", newline="") as Output_File:
    Output_File.write("List1,List2,List3\n")
    writer = csv.writer(Output_File)
    rows = itertools.zip_longest(List1, List2, List3, fillvalue=List1[0])
    writer.writerows(rows)
Contents of the Output.csv file afterward:
List1,List2,List3
DealerName,Person1,"crisp nori, hot rice, and cold fish"
DealerName,Person2,"takeout,fancy presentation, piled"
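As a sanity check, you can write the same rows to an in-memory buffer and read them back with csv.reader to confirm that the quoted commas survive as single cells (a sketch using the sample lists from the question):

```python
import csv
import io
import itertools

List1 = ['DealerName']
List2 = ['Person1', 'Person2']
List3 = ['crisp nori, hot rice, and cold fish', 'takeout,fancy presentation, piled']

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(['List1', 'List2', 'List3'])
writer.writerows(itertools.zip_longest(List1, List2, List3, fillvalue=List1[0]))

# Reading it back: every line parses into exactly three cells,
# despite the commas inside the List3 values.
for row in csv.reader(io.StringIO(buf.getvalue())):
    assert len(row) == 3
```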