I have a folder with a bunch of dbf files I would like to convert to csv. I have tried using a script that just changes the extension from .dbf to .csv, and those files open fine in Excel, but when I open them in pandas they look like this:
s\t�
0 NaN
1 1 176 1.58400000000e+005-3.385...
This is not what I want, and those characters don't appear in the real file.
How should I read in the dbf file correctly?
Here is the solution I've been using for years: one version for Python 2.7 and one for Python 3.5 (it probably works on 3.6 as well).
Python 2.7:
import csv
from dbfpy import dbf

def dbf_to_csv(out_table):  # Input a dbf, output a csv
    csv_fn = out_table[:-4] + ".csv"  # Name the output csv after the input table
    with open(csv_fn, 'wb') as csvfile:  # Create a csv file and write contents from dbf
        in_db = dbf.Dbf(out_table)
        out_csv = csv.writer(csvfile)
        names = []
        for field in in_db.header.fields:  # Write headers
            names.append(field.name)
        out_csv.writerow(names)
        for rec in in_db:  # Write records
            out_csv.writerow(rec.fieldData)
        in_db.close()
    return csv_fn
Python 3.5:
import csv
from dbfread import DBF

def dbf_to_csv(dbf_table_pth):  # Input a dbf, output a csv, same name, same path, except extension
    csv_fn = dbf_table_pth[:-4] + ".csv"  # Set the csv file name
    table = DBF(dbf_table_pth)  # table variable is a DBF object
    with open(csv_fn, 'w', newline='') as f:  # create a csv file, fill it with dbf content
        writer = csv.writer(f)
        writer.writerow(table.field_names)  # write the column names
        for record in table:  # write the rows
            writer.writerow(list(record.values()))
    return csv_fn  # return the csv file name
You can install dbfpy and dbfread with pip.
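Since the question is about a whole folder of dbf files, here is a minimal sketch for batch conversion, assuming the Python 3 dbf_to_csv above and the standard glob/os modules; the folder path is just a placeholder, swap in your own:

import glob
import os

folder = r"C:\data\dbf_files"  # hypothetical folder path, replace with your own
for dbf_path in glob.glob(os.path.join(folder, "*.dbf")):
    csv_path = dbf_to_csv(dbf_path)  # dbf_to_csv as defined above (Python 3 version)
    print("wrote", csv_path)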
Using my dbf library you could do something like:
import sys
import dbf
for arg in sys.argv[1:]:
    dbf.export(arg)
which will create a .csv file of the same name as each dbf file. If you put that code into a script named dbf2csv.py, you could then call it as
python dbf2csv.py dbfname dbf2name dbf3name ...
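Once the CSVs exist, pandas should read them without the garbled characters shown in the question. A quick sanity check, assuming pandas is installed (the file name below is just an example matching the command above):

import pandas as pd

df = pd.read_csv("dbfname.csv")  # example output file from the conversion above
print(df.head())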