I am getting this error when trying to create a file. The script is designed to take an existing .csv file and write its contents into a plain text file.
I would like it to create a new file stamped with the date and time each time it runs, but I get Errno 22 when trying to generate the file.
Any ideas?
import csv
import time

f = open(raw_input('Enter file name: '), "r")
saveFile = open('Bursarcodes_' + time.strftime("%x") + '_' + time.strftime("%X") +
                '.txt', 'w+')

csv_f = csv.reader(f)

for row in csv_f:
    saveFile.write('insert into bursarcode_lookup(bursarcode, note_id)' +
                   ' values (\'' + row[0] + '\', ' + row[1] + ')\n')

f.close()
saveFile.close()
You cannot have slashes (/) or colons (:, though colons are allowed in Unix) in your file name, but they are exactly what strftime generates in its output.
Python tries to help you; it says:
No such file or directory: 'Bursarcodes_01/09/15_19:59:24.txt'
Replace time.strftime("%x")
with this:
time.strftime("%x").replace('/', '.')
...and time.strftime("%X")
with this:
time.strftime("%X").replace(':', '_')
A cleaned-up and extended version:
import csv
import sys
import time

def make_output_fname():
    # Thanks to @Andrew:
    return time.strftime("Bursarcodes_%x_%X.txt").replace("/", "-").replace(":", "-")

def main(csv_fname=None, outfname=None, *args):
    if not csv_fname:
        # first arg not given - prompt for filename
        csv_fname = raw_input("Enter .csv file name: ")
    if not outfname:
        # second arg not given - use timestamped filename
        outfname = make_output_fname()

    with open(csv_fname) as inf, open(outfname, "w") as outf:
        incsv = csv.reader(inf)
        for row in incsv:
            outf.write(
                "insert into bursarcode_lookup(bursarcode, note_id) values ('{0}', '{1}')\n"
                .format(*row)
            )

if __name__ == "__main__":
    # pass any command-line arguments to main()
    main(*sys.argv[1:])
You can now run it from the command-line as well.
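For example (assuming the script is saved as bursarcodes.py; both arguments are optional):

python bursarcodes.py input.csv output.txt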
Note that if any data items in your csv file contain unescaped single quotes (') you will get invalid SQL.
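A minimal sketch of one way to guard against that, assuming doubling the quote (standard SQL escaping) is acceptable for your database; sql_quote is a hypothetical helper, not part of the code above:

def sql_quote(value):
    # escape embedded single quotes by doubling them (standard SQL)
    return "'" + value.replace("'", "''") + "'"

# inside the loop, quote each value before writing:
# outf.write("insert into bursarcode_lookup(bursarcode, note_id) values ({0}, {1})\n"
#            .format(sql_quote(row[0]), sql_quote(row[1])))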