I have to read in a file, change sections of the text here and there, and then write the result back out to the same file.
Currently I do:
f = open(file)
file_str = f.read()  # read it in as a string, not line by line
f.close()
#
# do_actions_on_file_str
#
f = open(file, 'w') # to clear the file
f.write(file_str)
f.close()
But I would imagine that there is a more pythonic approach that yields the same result.
Suggestions?
Reading a file while someone else is writing to it is perfectly OK. The only issue is if you try to read a block that the writer is modifying at that very moment: in that case the data you get is unpredictable, but the read itself will still succeed.
'r+' opens the file for both reading and writing. On Windows, appending 'b' to the mode opens the file in binary mode, so there are also modes like 'rb', 'wb', and 'r+b'. Reading and then writing works equally well in 'r+b' mode, but you have to call f.seek() to move the file pointer back before writing (and f.truncate() if the new content is shorter).
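A minimal sketch of that read-then-write pattern on a single 'r+' handle (the filename and the replacement are just placeholders):

# read-modify-write on the same handle, using 'r+'
f = open('example.txt', 'r+')   # 'example.txt' is a placeholder name
file_str = f.read()             # pointer is now at the end of the file
file_str = file_str.replace('old', 'new')
f.seek(0)                       # move back to the start before writing
f.write(file_str)
f.truncate()                    # drop any leftover bytes if the text shrank
f.close()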
That already looks straightforward and clear. Any suggestion depends on how big the files are: if they are not really huge, your approach is fine; if they are really large, you could process them in chunks (see the sketch after the code below). Either way, you could use a context manager to avoid the explicit close() calls:
with open(filename) as f:
    file_str = f.read()

# do stuff with file_str

with open(filename, "w") as f:
    f.write(file_str)
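For the "really large" case, one way to process in chunks is to stream the file through a temporary file and then swap it into place. A rough sketch; the chunk size, the transform() function, and the filenames are assumptions, and note that a pattern straddling a chunk boundary will not be caught by this simple approach:

import os
import tempfile

def transform(chunk):
    # placeholder for whatever per-chunk edit you need
    return chunk.replace('old', 'new')

def rewrite_in_chunks(filename, chunk_size=64 * 1024):
    # write the transformed output to a temp file in the same directory,
    # then replace the original once everything has been written
    dir_name = os.path.dirname(os.path.abspath(filename))
    with open(filename) as src, tempfile.NamedTemporaryFile(
            'w', dir=dir_name, delete=False) as dst:
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            dst.write(transform(chunk))
        temp_name = dst.name
    os.replace(temp_name, filename)

rewrite_in_chunks('big.log')  # hypothetical filename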
If you work line by line, you can use fileinput in inplace mode:
import fileinput

for line in fileinput.input(myfile, inplace=True):
    # with inplace=True, stdout is redirected into the file, so whatever
    # is printed here replaces the original line
    print(process(line), end='')
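Here process() is left undefined; it stands for your own transformation and should return each line with its trailing newline intact. A hypothetical implementation might look like:

def process(line):
    # hypothetical per-line edit; the replaced pattern is just an example
    return line.replace('old', 'new')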
If you need to process all the text at once, your code can be tidied up a bit using with, which takes care of closing the file:
with open(myfile) as f:
    file_str = f.read()

#
# do_actions_on_file_str
#

with open(myfile, 'w') as f:
    f.write(file_str)