
What factors can lead to Win32 error 665 (file system limitation)?

Tags: c++, winapi, ntfs

I maintain an application that collects data from a datalogger and appends that data to the end of a binary file. The nature of this system is that the file can grow large (> 4 gigabytes) in small increments. One of the users of my application has seen cases on his NTFS partition where the attempts to append data fail. The error is reported as the result of a call to fflush(). When this happens, the return value of GetLastError() is 665 (ERROR_FILE_SYSTEM_LIMITATION). MSDN gives the following description for this error:

The requested operation could not be completed due to a file system limitation

A search for this error code on Google gives results related to SQL Server with VERY large files (tens of gigabytes), but, at present, our file is much smaller. This user has not been able to get the file to grow beyond 10 gigabytes. We can temporarily correct the situation when we perform some operation (like copying the file) that forces some sort of rewrite in the file system. Unfortunately, I am not sure what is going on to put us in this condition in the first place. What specific conditions in an NTFS file system can lead to this particular error being reported on a call to fflush()?
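For reference, a minimal portable sketch of the append-and-flush pattern described above, checking the result of fflush() where the failure surfaced. (On Windows one would additionally call GetLastError() after the failure; the function name and file name here are illustrative, not from the original code.)

```cpp
#include <cstdio>
#include <cerrno>
#include <cstring>

// Append a small record to the log file and flush it immediately.
// Returns true on success. On failure, errno (or, on Windows,
// GetLastError()) identifies the cause -- e.g. 665,
// ERROR_FILE_SYSTEM_LIMITATION, in the case described above.
bool append_record(const char* path, const void* data, size_t len) {
    FILE* f = std::fopen(path, "ab");   // append, binary
    if (!f) return false;
    bool ok = std::fwrite(data, 1, len, f) == len;
    if (ok && std::fflush(f) != 0) {    // the flush is where the error appeared
        std::fprintf(stderr, "fflush failed: %s\n", std::strerror(errno));
        ok = false;
    }
    std::fclose(f);
    return ok;
}
```

Each append here opens, writes, and flushes in one step; a long-running logger would typically keep the FILE* open and only repeat the fwrite/fflush pair.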

Jon Trauntvein asked Sep 06 '11 17:09

1 Answer

This sounds like you've reached a limit in the fragmentation of the file. In other words, each flush is creating a new extent (fragment) of the file and the filesystem is having a hard time finding a place to keep track of the list of fragments. That would explain why copying the file helps -- it creates a new file with fewer fragments.

Another thing that would probably work is defragmenting the file (using Sysinternals' contig utility you may be able to do so while it's in use). You can also use contig to tell you how many fragments the file has. I'm guessing it's on the order of one million.

If you have to flush the file frequently and can't defrag it, something you can do is simply create the file fairly large in the first place (to allocate the space all at once) and then write to successive bytes of the file rather than append.
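The preallocate-then-overwrite idea above can be sketched as follows. This is a portable C stdio version with illustrative names and sizes; note that seeking past the end and writing one byte may produce a sparse file on some filesystems, so whether space is physically reserved up front depends on the filesystem (on Windows, SetFilePointerEx followed by SetEndOfFile is the native way to reserve the space).

```cpp
#include <cstdio>

// Size the file once up front, so later writes land inside already
// allocated space instead of creating a new extent on every append.
bool preallocate(const char* path, long size) {
    FILE* f = std::fopen(path, "wb");
    if (!f) return false;
    // Seek to the last byte and write it, extending the file to `size`.
    bool ok = std::fseek(f, size - 1, SEEK_SET) == 0
           && std::fputc(0, f) != EOF;
    std::fclose(f);
    return ok;
}

// Overwrite data at a fixed offset within the preallocated region.
// "rb+" opens for update without truncating the existing file.
bool write_at(const char* path, long offset, const void* data, size_t len) {
    FILE* f = std::fopen(path, "rb+");
    if (!f) return false;
    bool ok = std::fseek(f, offset, SEEK_SET) == 0
           && std::fwrite(data, 1, len, f) == len
           && std::fflush(f) == 0;
    std::fclose(f);
    return ok;
}
```

The application would track its own "logical end of data" offset and advance it after each write_at() call, growing the file in large preallocated chunks only when that offset approaches the allocated size.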

If you're brave (and your process has admin access), you can defragment the file yourself with a few API calls (the defragmentation control codes documented here): http://msdn.microsoft.com/en-us/library/aa363911(v=VS.85).aspx

Gabe answered Nov 04 '22 17:11