I have a set of CSV headers that I am trying to match against uploaded files, but it's not working. Not all headers are required; I just have to match what's in the file.
import csv

reader = csv.DictReader(open(PathFile))
headers = reader.fieldnames
for header in sorted(set(headers)):
    if header in ('ip', 'IP'):
        print("IP found in Header")
In this case, IP is not found.
for row in reader:
    if row.get('IP'):
        print("IP found in Row")
It's not found there either. I did search on this site and found:
IP = row.get('IP', None)
That did not work either.
This is the file I'm using for testing:
Email, IP, Name, City, State, zip, country, garbage
[email protected], 34.4.34.34,Mr GH, chicago, il ,60601, us,erw ewr
[email protected], 34.45.23.34, Mr 5t,NY,NY,10101, us, er
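For reference, a minimal reproduction using the sample rows above as an in-memory file suggests why the lookup fails: the space after each comma becomes part of the field name, so the header is `' IP'`, not `'IP'`.

```python
import csv
import io

# The sample data from the question, as an in-memory file.
data = """Email, IP, Name, City, State, zip, country, garbage
[email protected], 34.4.34.34,Mr GH, chicago, il ,60601, us,erw ewr
[email protected], 34.45.23.34, Mr 5t,NY,NY,10101, us, er"""

reader = csv.DictReader(io.StringIO(data))
print(reader.fieldnames)
# The leading space after each comma is kept in the field name,
# so 'IP' != ' IP' and both the header check and row.get('IP') fail.
```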
Use pandas. Read the file with pandas.read_csv(file, header=None), then call DataFrame.to_csv(file, header=str_list, index=False), where str_list is a list of column labels, to write a header to the CSV file.
In the new DataFrame, use the rename function to change any of the column headers you require, e.g. Address1, Address2, Address3, Address4. Once the updates are complete, re-export the file with the corrected headers to whatever folder you wish.
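A sketch of that approach (pandas is assumed to be installed; the column labels and sample values here are made up for illustration):

```python
import io
import pandas as pd

# Read a headerless CSV; the columns come back labeled 0, 1, ...
df = pd.read_csv(io.StringIO("a,1\nb,2\n"), header=None)

# Write it back out with an explicit header row.
str_list = ['name', 'value']
buf = io.StringIO()
df.to_csv(buf, header=str_list, index=False)
print(buf.getvalue())

# Or rename the columns on the DataFrame itself.
df = df.rename(columns={0: 'name', 1: 'value'})
print(df.columns.tolist())
```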
Based on your edit, you need to skip the initial space after the comma.
This should do it:
>>> reader = csv.DictReader(open(PathFile), skipinitialspace=True)
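With that option, the space after each delimiter is dropped before the header row is parsed, so an exact `'IP'` lookup works. A sketch against an in-memory copy of a few of the sample fields:

```python
import csv
import io

# Header and row with a space after each comma, as in the question.
data = """Email, IP, Name
[email protected], 34.4.34.34, Mr GH"""

# skipinitialspace=True strips the space that follows each delimiter.
reader = csv.DictReader(io.StringIO(data), skipinitialspace=True)
print(reader.fieldnames)
for row in reader:
    if row.get('IP'):
        print("IP found in Row")
```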
I am not exactly sure what you want to achieve, but if you simply want to know whether certain columns are present in the CSV, you are sure that all rows have the same columns, and you want to use DictReader, use this:
import csv

s = """col1,col2,col3
ok,ok,ok
hmm,hmm,hmm
cool,cool,cool"""

reader = csv.DictReader(s.split("\n"))
print(reader.fieldnames)
for row in reader:
    for colName in ['col3', 'col4']:
        print("found %s %s" % (colName, colName in row))
    break
It outputs
found col3 True
found col4 False
Or something like this will work too:
reader = csv.reader(s.split("\n"))
columns = next(reader)
for colName in ['col3', 'col4']:
    print("found %s %s" % (colName, colName in columns))
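The same membership check can also be done in one pass with sets (a sketch, reusing a small sample string like the one above):

```python
import csv

s = """col1,col2,col3
ok,ok,ok"""

reader = csv.reader(s.split("\n"))
columns = next(reader)  # first row is the header

# Which of the columns we care about are absent from the file?
wanted = {'col3', 'col4'}
missing = wanted - set(columns)
print("missing:", sorted(missing))
```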