I have a logger that writes to a log file. When multiple processes access the same log file, the logger fails to write to it with an error along the lines of "Another process is using the file...", and critical information is lost. I have tried using locks, but they do not seem to work, presumably because the writers are separate processes rather than threads. Is there a way to let multiple processes write to the file without losing information? If not, are there any other alternatives? Thanks.
No, generally it is not safe to do this. You need to obtain an exclusive write lock for each process, which means all the other processes have to wait while one process is writing to the file. The more I/O-intensive processes you have, the longer the wait time.
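As a minimal sketch of that pattern in C#: each process opens the file with an exclusive handle and retries on a sharing violation. The path, attempt count, and delay below are illustrative, not prescriptive:

```csharp
using System;
using System.IO;
using System.Text;
using System.Threading;

static class ExclusiveLogger
{
    const int MaxAttempts = 10;   // illustrative; tune for your workload
    const int RetryDelayMs = 50;

    public static void Append(string path, string line)
    {
        for (int attempt = 1; attempt <= MaxAttempts; attempt++)
        {
            try
            {
                // FileShare.None requests an exclusive handle; if any other
                // process has the file open, this throws an IOException.
                using (var stream = new FileStream(
                    path, FileMode.Append, FileAccess.Write, FileShare.None))
                {
                    var bytes = Encoding.UTF8.GetBytes(line + Environment.NewLine);
                    stream.Write(bytes, 0, bytes.Length);
                    return;
                }
            }
            catch (IOException) when (attempt < MaxAttempts)
            {
                Thread.Sleep(RetryDelayMs);   // back off, then try again
            }
        }
        // On the final attempt the IOException propagates to the caller,
        // so a persistent failure is at least visible rather than silent.
    }
}
```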
On a POSIX-compliant filesystem, two processes successfully appending to the same file will result in all of their bytes ending up in the file in order, but not necessarily contiguously. The caveat is that not all filesystems are POSIX-compliant; two famous examples are NFS and the Hadoop Distributed File System (HDFS).
Reading at the same time someone else is writing is perfectly OK. The only issue is reading a block that the writer is writing at that same moment: in that case the data you get back is unpredictable, but the read itself will still succeed.
Yes: each of the two processes gets its own file table entry, so both can hold the file open at once.
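A rough C# illustration of two processes each holding their own handle to the same file for appending. Whether concurrent appends interleave cleanly depends on the OS and filesystem, as noted above; the path is illustrative:

```csharp
using System;
using System.IO;
using System.Text;

static class SharedAppender
{
    // Opens the file so that other processes may also open it for writing.
    // Each short-lived open seeks to the current end of the file, so bytes
    // generally land appended, but contiguity is not guaranteed.
    public static void Append(string path, string line)
    {
        using (var stream = new FileStream(
            path, FileMode.Append, FileAccess.Write, FileShare.ReadWrite))
        {
            var bytes = Encoding.UTF8.GetBytes(line + Environment.NewLine);
            stream.Write(bytes, 0, bytes.Length);
        }
    }
}
```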
Your options in no particular order:
Log4Net is an option, as mentioned by Florian; a sample configuration is sketched after this list.
Another option is to use a separate file as a lock. While the lock file exists, other processes perform an active wait; otherwise a process creates the lock file, writes to the log, and then deletes the lock file. If you fail to create the lock file, it means another process got there first, and you keep waiting. See the sketch after this list.
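If you go the Log4Net route, its FileAppender supports a MinimalLock locking model, which acquires the file lock only for the duration of each write and so allows several processes to log to the same file, at some cost in throughput. A sketch of the relevant configuration; the file name and layout pattern are illustrative:

```xml
<appender name="LogFileAppender" type="log4net.Appender.FileAppender">
  <file value="app.log" />
  <appendToFile value="true" />
  <!-- Hold the file lock only while writing, so other processes can write too -->
  <lockingModel type="log4net.Appender.FileAppender+MinimalLock" />
  <layout type="log4net.Layout.PatternLayout">
    <conversionPattern value="%date [%thread] %-5level %logger - %message%newline" />
  </layout>
</appender>
```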
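And here is a minimal sketch of the lock-file pattern in C#. FileMode.CreateNew makes the create-if-absent check atomic: it throws if the file already exists, so only one process can hold the lock at a time. The ".lock" naming convention and the delay are illustrative:

```csharp
using System;
using System.IO;
using System.Threading;

static class LockFileLogger
{
    public static void Append(string logPath, string line)
    {
        string lockPath = logPath + ".lock";   // illustrative convention

        // Spin until we win the race to create the lock file.
        while (true)
        {
            try
            {
                // CreateNew fails atomically if the lock file already exists.
                using (new FileStream(lockPath, FileMode.CreateNew,
                                      FileAccess.Write, FileShare.None)) { }
                break;
            }
            catch (IOException)
            {
                Thread.Sleep(25);   // another process holds the lock; wait
            }
        }

        try
        {
            File.AppendAllText(logPath, line + Environment.NewLine);
        }
        finally
        {
            File.Delete(lockPath);  // release the lock even if the write fails
        }
    }
}
```

One known weakness of this scheme: if a process dies between creating and deleting the lock file, the lock is never released, so production code usually adds a staleness timeout that deletes lock files older than some threshold.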