I've had three reports now of users' machines crashing while using my software. The crashes are not related to my program, but when the machines restart, the config files my program writes are all corrupt.
There is nothing special about how the files are being written: I simply create a JSON representation and dump it to disk using File.WriteAllText():
// save our contents to the disk
string json = JsonConvert.SerializeObject(objectInfo, Formatting.Indented);
// write the contents
File.WriteAllText(path, json);
I've had a user send me one of the files; the length looks about right (~3 KB) but the contents are all 0x00.
According to the post below File.WriteAllText should close the file handle, flushing any unwritten contents to the disk:
In my C# code does the computer wait until output is complete before moving on?
BUT, as pointed out by Alberto in the comments:
System.IO.File.WriteAllText, when it completes, will flush all the text to the filesystem cache; then, it will be lazily written to the drive.
So I presume what is happening here is that the file is being cleared and initialized with 0x00, but the data has not yet been written when the system crashes.
I was thinking of maybe using some sort of temp file, so the process would be like this:
1. Write the new contents to a temp file
2. Move the temp file over the original, replacing it
I don't think that will solve the problem, as I presume Windows will just move the file even though the IO is still pending.
Is there any way I can force the machine to dump that data to disk instead of letting it decide when to do it, or is there a better way to update a file?
UPDATE:
Based on suggestions by @usr, @mikez and @Ilya Luzyanin I've created a new WriteAllTextWithBackup function that performs the write using the following logic:
1. Generate a temp file name and a backup name (the target path plus ".backup")
2. Delete any existing backup file
3. Write the data to the temp file using FileOptions.WriteThrough, so it is not left sitting in the OS cache
4. Call File.Replace to swap the temp file into place, keeping the previous file as the backup
With that logic, if the final file fails to load, my code can check for the backup file and load that instead.
Here is the code:
public static void WriteAllTextWithBackup(string path, string contents)
{
    // generate a temp filename
    var tempPath = Path.GetTempFileName();
    // create the backup name
    var backup = path + ".backup";
    // delete any existing backups
    if (File.Exists(backup))
        File.Delete(backup);
    // get the bytes
    var data = Encoding.UTF8.GetBytes(contents);
    // write the data to a temp file
    using (var tempFile = File.Create(tempPath, 4096, FileOptions.WriteThrough))
        tempFile.Write(data, 0, data.Length);
    // replace the contents
    File.Replace(tempPath, path, backup);
}
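For completeness, the read side could fall back to the backup like this. This is just a sketch: ReadAllTextWithBackup is a hypothetical helper, not part of the code above, and the zero-byte check simply mirrors the 0x00 corruption described earlier.
public static string ReadAllTextWithBackup(string path)
{
    // try the primary file first; treat an empty or zero-filled file as corrupt
    if (File.Exists(path))
    {
        var text = File.ReadAllText(path);
        if (!string.IsNullOrEmpty(text) && text.IndexOf('\0') < 0)
            return text;
    }
    // fall back to the backup left behind by File.Replace
    var backup = path + ".backup";
    return File.Exists(backup) ? File.ReadAllText(backup) : null;
}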
WriteAllText(String, String) Creates a new file, writes the specified string to the file, and then closes the file. If the target file already exists, it is overwritten.
You can use FileStream.Flush to force the data to disk. Write to a temp file and use File.Replace to atomically replace the target file.
I believe this is guaranteed to work. File systems give weak guarantees. These guarantees are hardly ever documented and they are complex.
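A minimal sketch of that approach, assuming a hypothetical WriteAllTextDurable helper and an illustrative ".tmp"/".backup" naming scheme (neither is from the answer): write to a temp file on the same volume, flush it through the OS cache with FileStream.Flush(true), then swap it in with File.Replace.
public static void WriteAllTextDurable(string path, string contents)
{
    var tempPath = path + ".tmp";       // temp file on the same volume, so File.Replace can swap it
    var backupPath = path + ".backup";  // File.Replace keeps the previous contents here
    using (var stream = new FileStream(tempPath, FileMode.Create, FileAccess.Write, FileShare.None))
    {
        var data = Encoding.UTF8.GetBytes(contents);
        stream.Write(data, 0, data.Length);
        stream.Flush(true); // flush through the OS cache to the physical disk
    }
    if (File.Exists(path))
        File.Replace(tempPath, path, backupPath); // atomic swap; the old file becomes the backup
    else
        File.Move(tempPath, path); // File.Replace requires an existing destination
}
Note that Flush(true) is what pushes the data past the OS cache; disposing the stream on its own only guarantees the managed buffer has been handed to the OS.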
Alternatively, you can use Transactional NTFS if available. It is available for .NET.
FileOptions.WriteThrough can replace Flush, but you still need the temp file if your data can exceed a single cluster in size.