Avoid the error "too many changes at once in directory"

How can I avoid this FileSystemWatcher error in C#?

"too many changes at once in directory"

I have to detect all changes on a network share. I have already increased InternalBufferSize to 8192 * 128.

asked Mar 20 '13 by DerAbt

1 Answer

There are two things you should do:

  1. Set InternalBufferSize to the maximum supported value (65536 bytes). The value you tried, 8192 * 128 (1,048,576), is above the documented maximum, so you may not have increased the buffer size at all.
  2. Queue events from the FileSystemWatcher onto a background thread for processing.

It's the second point here that isn't well understood, and it really should be documented on MSDN. Internally, FileSystemWatcher queues change events into the internal buffer whose size you set above. Critically, however, an item is only removed from that buffer after your event handler returns. This means every bit of overhead your event handlers introduce increases the chance of the buffer filling up. What you should do is drain the watcher's limited queue as quickly as possible and move the events into your own unbounded queue, where you can process them at whatever rate you can handle, or discard them with some intelligence if you choose to.
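
A minimal sketch of that idea, using a BlockingCollection as the unbounded queue and a single consumer thread instead of the Dispatcher I use below. The watched path and the handler body here are placeholders, not taken from my actual code:

using System;
using System.Collections.Concurrent;
using System.IO;
using System.Threading;

class WatcherQueueSketch
{
    static void Main()
    {
        // Unbounded producer/consumer queue; the watcher's handlers only enqueue.
        var queue = new BlockingCollection<FileSystemEventArgs>();

        var watcher = new FileSystemWatcher(@"C:\watched");   // placeholder path
        watcher.InternalBufferSize = 64 * 1024;
        watcher.IncludeSubdirectories = false;

        // Handlers return almost immediately, so the internal buffer keeps draining.
        watcher.Changed += (s, e) => queue.Add(e);
        watcher.Created += (s, e) => queue.Add(e);
        watcher.Deleted += (s, e) => queue.Add(e);
        watcher.Renamed += (s, e) => queue.Add(e);

        // Single consumer thread processes events in the order they were queued.
        new Thread(() =>
        {
            foreach (FileSystemEventArgs e in queue.GetConsumingEnumerable())
            {
                Console.WriteLine(e.ChangeType + ": " + e.FullPath);   // real work goes here
            }
        }) { IsBackground = true }.Start();

        watcher.EnableRaisingEvents = true;

        Console.ReadLine();        // keep the sample alive
        watcher.Dispose();
        queue.CompleteAdding();    // lets the consumer loop finish
    }
}

The important part is that the handlers do nothing but Add, so they hand the event off and return before the watcher has a chance to fall behind.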

Here's basically what I do in my code. First, I start my own dispatcher thread:

// Dispatcher comes from System.Windows.Threading (the WindowsBase assembly).
Dispatcher changeDispatcher = null;
ManualResetEvent changeDispatcherStarted = new ManualResetEvent(false);
Action changeThreadHandler = () =>
{
    // Capture this thread's dispatcher, signal that it's ready, then pump events.
    changeDispatcher = Dispatcher.CurrentDispatcher;
    changeDispatcherStarted.Set();
    Dispatcher.Run();
};
new Thread(() => changeThreadHandler()) { IsBackground = true }.Start();
changeDispatcherStarted.WaitOne();   // block until the dispatcher is available

Then I create the watcher. Note the buffer size being set. In my case, I only watch changes in the target directory, not subdirectories:

FileSystemWatcher watcher = new FileSystemWatcher();
watcher.Path = path;
watcher.InternalBufferSize = 64 * 1024;   // 64 KB, the maximum supported size
watcher.IncludeSubdirectories = false;

Now I attach my event handlers, but instead of doing the work synchronously on the thread the watcher raises events on, each handler just posts the event to my dispatcher. The dispatcher still processes the events in order:

watcher.Changed += (sender, e) => changeDispatcher.BeginInvoke(new Action(() => OnChanged(sender, e)));
watcher.Created += (sender, e) => changeDispatcher.BeginInvoke(new Action(() => OnCreated(sender, e)));
watcher.Deleted += (sender, e) => changeDispatcher.BeginInvoke(new Action(() => OnDeleted(sender, e)));
watcher.Renamed += (sender, e) => changeDispatcher.BeginInvoke(new Action(() => OnRenamed(sender, e)));
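
If you also want to know when the buffer does overflow anyway, the watcher reports it through its Error event as an InternalBufferOverflowException. A short sketch; the logging and the rescan comment are placeholders:

// Raised when the internal buffer overflows ("too many changes at once in directory").
watcher.Error += (sender, e) =>
{
    Exception ex = e.GetException();
    if (ex is InternalBufferOverflowException)
    {
        Console.WriteLine("FileSystemWatcher buffer overflowed: " + ex.Message);
        // Some changes were lost at this point; fall back to a full directory rescan if you need completeness.
    }
};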

And finally, after disposing of the FileSystemWatcher (you were doing that, right?), you need to shut down your dispatcher:

watcher.Dispose();
changeDispatcher.BeginInvokeShutdown(DispatcherPriority.Normal);

And that's it. I was hitting this problem myself, in both network and local scenarios. After switching to this approach I was never able to reproduce the error, even when creating empty files in watched directories as fast as I could. If you somehow still managed to exhaust the buffer (which I'm not sure is possible; the API upstream is probably the slower side), there is further room for optimization. As long as your dispatcher stays past the "tipping point", where the sender can't post events faster than you can dispatch them, you'll never build up a backlog and hence never blow the buffer. I believe this approach puts you well inside that safe area.

answered Oct 21 '22 by Roger Sanders