How to write to a txt file when multiple processes are using it? C#

Tags: c#, .net, locking

I have a logger that attempts to write log files. The problem is that when multiple processes access the same log file, the logger fails to write to it with the error "Another process is using the file...", causing critical information to be lost. I have tried using locks, but they do not seem to work, presumably because the writers are separate processes. Is there a way to have multiple processes access the file without losing information? If not, are there any alternatives? Thanks.

asked Mar 04 '14 by TestNInja


People also ask

Can 2 processes write to the same file?

No, generally it is not safe to do this. Each process needs to obtain an exclusive write lock, which implies that all the other processes must wait while one process is writing to the file. The more I/O-intensive processes you have, the longer the wait time.

Can multiple processes append to the same file?

If two processes successfully append to the same file, each process's bytes will appear in the file in order, but not necessarily contiguously. The caveat is that not all filesystems are POSIX-compatible; two famous examples are NFS and the Hadoop Distributed File System (HDFS).

What will happen if 2 processes read/write to the same file?

Reading while someone else is writing is perfectly OK. The only issue arises if you read a block that the writer is writing at the same time: in that case the data you get is unpredictable, but the read itself will succeed.
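In .NET, these read/write sharing semantics are negotiated through FileShare flags. Below is a small single-process sketch (using a fresh temp file) that mimics the contract two separate processes would negotiate:

    using System;
    using System.IO;
    using System.Text;

    class SharedReadDemo
    {
        static void Main()
        {
            string path = Path.GetTempFileName(); // fresh, empty file for the demo

            // The writer opens the file but explicitly allows concurrent readers.
            using (var writer = new FileStream(path, FileMode.Append,
                                               FileAccess.Write, FileShare.Read))
            // The reader must in turn tolerate the writer's open write handle,
            // hence FileShare.ReadWrite on the reading side.
            using (var reader = new FileStream(path, FileMode.Open,
                                               FileAccess.Read, FileShare.ReadWrite))
            {
                byte[] payload = Encoding.UTF8.GetBytes("hello" + Environment.NewLine);
                writer.Write(payload, 0, payload.Length);
                writer.Flush(); // make the bytes visible to the other handle

                byte[] buffer = new byte[payload.Length];
                int read = reader.Read(buffer, 0, buffer.Length);
                Console.WriteLine(Encoding.UTF8.GetString(buffer, 0, read));
            }
        }
    }

The same flags apply when the reader and writer are separate processes, since they describe the sharing contract on the underlying file handle.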

Can two processes simultaneously write to different positions in a single file?

Yes, the two processes will have their own file table entries.


2 Answers

Your options in no particular order:

  1. Have different processes write to different log files. If needed, have these processes roll files at some point (like at midnight), and have another process that takes yesterday's logs and merges them together.
  2. If you own/control your logger infrastructure and must write to the same log file, then look into using a named mutex. Named mutexes are system-wide and can be shared across process boundaries. The code example at the link shows exactly how to share the same mutex among processes (a minimal sketch also follows this list).
    • As Florian F suggested in the comments, use log4net, which can do this (in more than one way, including via a named mutex). Its FAQ rightly states that there is a performance cost if you do it that way.
  3. There are ways to open the file with shared read and shared write, and then lock specific regions of it. Multiple processes could coordinate who writes into which areas of the file, lock accordingly, and so on. But this is very painful to get right, and I would strongly recommend against it.
  4. Use a syslog server.
  5. Use log4net's remote logging capabilities.
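For option 2, here is a minimal sketch of cross-process locking with a named mutex. The class name CrossProcessLogger, the mutex name Global\MyAppLogMutex, and the five-second timeout are illustrative assumptions, not part of the original answer:

    using System;
    using System.IO;
    using System.Threading;

    class CrossProcessLogger
    {
        // A named mutex is visible to every process on the machine, so all
        // writers contend on the same lock. The "Global\" prefix makes it
        // span terminal-services sessions.
        private static readonly Mutex LogMutex =
            new Mutex(initiallyOwned: false, name: @"Global\MyAppLogMutex");

        public static void Write(string path, string message)
        {
            bool acquired = false;
            try
            {
                try
                {
                    // Wait up to five seconds for the lock.
                    acquired = LogMutex.WaitOne(TimeSpan.FromSeconds(5));
                }
                catch (AbandonedMutexException)
                {
                    // A previous owner exited without releasing the mutex;
                    // we now own it and can proceed.
                    acquired = true;
                }

                if (!acquired)
                    throw new TimeoutException("Timed out waiting for the log mutex.");

                File.AppendAllText(path,
                    DateTime.UtcNow.ToString("o") + " " + message + Environment.NewLine);
            }
            finally
            {
                if (acquired)
                    LogMutex.ReleaseMutex();
            }
        }
    }

Every process that constructs a Mutex with the same name gets a handle to the same kernel object, which is why this works across process boundaries where an in-process lock statement does not.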
answered Nov 14 '22 by LB2


Log4Net is an option, as mentioned by Florian.

Another option is to use a separate file as a lock. If the lock file is present, other processes actively wait; otherwise, a process creates the lock file, writes to the log, then deletes the lock file.

If you fail to create the lock file, it means another process got there before you, and you keep waiting. A sketch of this approach follows.
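A minimal sketch of this lock-file scheme, assuming the lock file sits next to the log with a .lock suffix and a 50 ms retry delay (both arbitrary choices, as is the LockFileLogger name):

    using System;
    using System.IO;
    using System.Threading;

    class LockFileLogger
    {
        public static void Write(string logPath, string message)
        {
            string lockPath = logPath + ".lock";

            while (true)
            {
                FileStream lockStream;
                try
                {
                    // FileMode.CreateNew is atomic: it succeeds for exactly one
                    // process when the lock file does not exist, and throws
                    // IOException for everyone else.
                    lockStream = File.Open(lockPath, FileMode.CreateNew,
                                           FileAccess.Write, FileShare.None);
                }
                catch (IOException)
                {
                    // Another process holds the lock; actively wait, then retry.
                    Thread.Sleep(50);
                    continue;
                }

                try
                {
                    File.AppendAllText(logPath, message + Environment.NewLine);
                    return;
                }
                finally
                {
                    lockStream.Dispose();
                    File.Delete(lockPath); // release the lock for other processes
                }
            }
        }
    }

One caveat of this design: a process that crashes while holding the lock leaves a stale .lock file behind, so production code would also want a staleness timeout.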

answered Nov 14 '22 by Mau