I have one process that's reading from a file (using file.read()) and one process that's writing to the same file (using file.write()). The problem is that it doesn't work: I get no errors, but the two processes can't operate on the file at the same time. I've tried making the read and write operations non-blocking and then flushing the stream, as follows:
fcntl.fcntl(file, fcntl.F_SETFL, os.O_NONBLOCK)
file.write(msg)
file.flush()
Am I completely misunderstanding it? How should one accomplish writing and reading to one file from different processes?
Reading while someone else is writing is perfectly OK. The only issue is if you try to read a block that the writer is writing at the same moment: in that case the data you get back is unpredictable, but the read itself will still succeed.
During the actual read or write, yes. But multiple processes can open the same file at the same time and then write to it; it's up to the processes themselves to make sure they don't trample each other's data. If you're writing the processes yourself, look into flock (file locking).
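As a minimal sketch of that idea, here is flock-based cooperation between a writer and a reader (Unix-only, since it uses the fcntl module; the file name and helper functions are illustrative, not from the original post):

```python
import fcntl

path = 'shared.txt'  # hypothetical shared file used for illustration

def append_record(path, msg):
    """Append msg while holding an exclusive lock on the file."""
    with open(path, 'a') as f:
        fcntl.flock(f.fileno(), fcntl.LOCK_EX)  # blocks until we own the lock
        try:
            f.write(msg)
            f.flush()
        finally:
            fcntl.flock(f.fileno(), fcntl.LOCK_UN)

def read_all(path):
    """Read the whole file under a shared lock."""
    with open(path, 'r') as f:
        fcntl.flock(f.fileno(), fcntl.LOCK_SH)  # readers may share the lock
        try:
            return f.read()
        finally:
            fcntl.flock(f.fileno(), fcntl.LOCK_UN)

append_record(path, 'hello\n')
print(read_all(path))
```

Note that flock is advisory: it only works if every process that touches the file takes the lock before reading or writing.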
Yes, the two processes will have their own file table entries.
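You can see the independent offsets directly: opening the same file twice gives two handles whose positions advance separately (the file name below is just for illustration):

```python
# Write six bytes, then open the file twice; each handle has its own offset.
with open('demo.txt', 'wb') as f:
    f.write(b'abcdef')

r1 = open('demo.txt', 'rb')
r2 = open('demo.txt', 'rb')
r1.read(3)                       # advances only r1's offset
pos1, pos2 = r1.tell(), r2.tell()
print(pos1, pos2)                # 3 0
r1.close()
r2.close()
```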
test1.py
f = open('txt.txt', 'a')  # note: open()'s third argument is buffering, not an O_* flag
while True:
    f.write('asd')
    f.flush()
test2.py
f = open('txt.txt', 'r')  # as above, passing os.O_NONBLOCK here would just set a buffer size
while True:
    print(f.read(3))
This works fine for me.
Is there a reason to use a common file? Inter-process communication is probably much easier using sockets.
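For instance, a connected socket pair plus a forked child replaces the shared file entirely (a Unix-only sketch; the message format is made up for the example):

```python
import os
import socket

# A connected pair of sockets: one end per process.
reader_sock, writer_sock = socket.socketpair()

pid = os.fork()
if pid == 0:
    # Child process: plays the writer role.
    reader_sock.close()
    for i in range(3):
        writer_sock.sendall(f'msg {i};'.encode())
    writer_sock.close()
    os._exit(0)

# Parent process: plays the reader role.
writer_sock.close()
chunks = []
while True:
    data = reader_sock.recv(1024)
    if not data:          # EOF: the writer closed its end
        break
    chunks.append(data)
os.waitpid(pid, 0)
messages = b''.join(chunks).decode().rstrip(';').split(';')
print(messages)
```

Unlike a shared file, the socket gives you a defined message order and an explicit end-of-stream, so no locking is needed.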