
What are buffer sizes of Process.Start for stdout and stderr?

I am trying to find this information out. Apparently Microsoft has not provided a way to read a process's stdout and stderr reliably without deadlocks, exceptions and other issues.

Deadlock problem

http://www.codeducky.org/process-handling-net/

A less-obvious problem is that of deadlocking. All three process streams (in, out, and error) are finite in how much content they can buffer. If the internal buffer fills up, then whoever is writing to the stream will block. In this code, for example, we don't read from the out and error streams until after the process has exited. That means that we could find ourselves in a case where the external process exhausts its error buffer. In that case, the external process would block on writing to standard error, while our .NET app is blocked reading to the end of standard out. Thus, we've found ourselves in a deadlock!
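To make the scenario concrete, here is a minimal sketch of the deadlock-prone pattern described above (the child executable name is a placeholder; any program that writes more than the buffers hold will do): both streams are redirected, but nothing reads them until after WaitForExit, so a chatty child blocks on its write and never exits.

    using System.Diagnostics;

    class DeadlockProneExample
    {
        static void Main()
        {
            var psi = new ProcessStartInfo
            {
                FileName = "some-chatty-tool.exe",  // placeholder: writes a lot to stdout/stderr
                RedirectStandardOutput = true,
                RedirectStandardError = true,
                UseShellExecute = false
            };

            using (var process = Process.Start(psi))
            {
                // Nothing is draining the pipes yet. If the child fills the stdout
                // or stderr buffer, it blocks on its write, never exits, and the
                // WaitForExit() below never returns -- a deadlock.
                process.WaitForExit();

                string output = process.StandardOutput.ReadToEnd();
                string error = process.StandardError.ReadToEnd();
            }
        }
    }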


There is no reliable code on the internet

There is an answer with 265 upvotes, but I am not going to use it because it suffers from ObjectDisposedException and requires timeouts:
ProcessStartInfo hanging on "WaitForExit"? Why?

It was probably upvoted before it was discovered that it can lead to ObjectDisposedException.
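For reference, the pattern in that answer is built around asynchronous DataReceived events plus WaitForExit with a timeout, roughly along these lines (a simplified sketch, not the answer verbatim; the executable name and timeout are placeholders). As I understand the comments there, the ObjectDisposedException shows up when the process times out and a late DataReceived callback calls Set() on a wait handle that has already been disposed.

    using System;
    using System.Diagnostics;
    using System.Text;
    using System.Threading;

    class EventBasedRead
    {
        static void Main()
        {
            var psi = new ProcessStartInfo
            {
                FileName = "some-chatty-tool.exe",  // placeholder
                RedirectStandardOutput = true,
                RedirectStandardError = true,
                UseShellExecute = false
            };

            var output = new StringBuilder();
            var error = new StringBuilder();

            using (var process = new Process { StartInfo = psi })
            using (var outputDone = new AutoResetEvent(false))
            using (var errorDone = new AutoResetEvent(false))
            {
                process.OutputDataReceived += (s, e) =>
                {
                    if (e.Data == null) outputDone.Set();  // null signals end of stream
                    else output.AppendLine(e.Data);
                };
                process.ErrorDataReceived += (s, e) =>
                {
                    if (e.Data == null) errorDone.Set();
                    else error.AppendLine(e.Data);
                };

                process.Start();
                process.BeginOutputReadLine();
                process.BeginErrorReadLine();

                const int timeoutMs = 10000;  // arbitrary
                bool finished = process.WaitForExit(timeoutMs)
                             && outputDone.WaitOne(timeoutMs)
                             && errorDone.WaitOne(timeoutMs);
                // On timeout the using blocks dispose the wait handles while the
                // child may still be producing output; a late callback then hits
                // a disposed handle and throws ObjectDisposedException.
                Console.WriteLine(finished ? output.ToString() : "timed out");
            }
        }
    }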


What are the chances of experiencing a deadlock?

I want to know what the chances are of getting into a deadlock situation. For that I need to know the buffer sizes for stderr and stdout under Windows 7.

How can I find them out? I thought I could cat several files to see what approximate file size causes problems, but there is no cat under Windows 7. I even tried using git cat-file, but its documentation is poor, with no usage examples, and no one has answered a question regarding that: https://stackoverflow.com/questions/43203902/how-to-git-cat-file-a-file-in-current-dir

asked Apr 04 '17 by Marko Avlijaš


1 Answer

John is correct in his comment: it's hardcoded to 4096 bytes.

https://github.com/Microsoft/referencesource/blob/master/System/services/monitoring/system/diagnosticts/Process.cs

    if (startInfo.RedirectStandardInput) {
        standardInput = new StreamWriter(new FileStream(standardInputWritePipeHandle, FileAccess.Write, 4096, false), Console.InputEncoding, 4096);
        standardInput.AutoFlush = true;
    }
    if (startInfo.RedirectStandardOutput) {
        Encoding enc = (startInfo.StandardOutputEncoding != null) ? startInfo.StandardOutputEncoding : Console.OutputEncoding;
        standardOutput = new StreamReader(new FileStream(standardOutputReadPipeHandle, FileAccess.Read, 4096, false), enc, true, 4096);
    }
    if (startInfo.RedirectStandardError) {
        Encoding enc = (startInfo.StandardErrorEncoding != null) ? startInfo.StandardErrorEncoding : Console.OutputEncoding;
        standardError = new StreamReader(new FileStream(standardErrorReadPipeHandle, FileAccess.Read, 4096, false), enc, true, 4096);
    }

I tested this by running a cmd.exe type path\to\file.html command, but it printed truncated output, so I wrote a one-line Ruby script to open a file and print it to stdout. It failed with a 4373-byte file and worked with a 3007-byte file. When it fails, it just hangs in a deadlock.
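A sketch of that kind of probe, with the parent written in C# (assuming Ruby is on PATH; the Ruby one-liner here just stands in for the script mentioned above, and the exact threshold may differ on other systems):

    using System;
    using System.Diagnostics;

    class BufferProbe
    {
        // Starts a child that writes `byteCount` bytes to stdout while the parent
        // deliberately does not read -- the same shape as the test described above.
        static void Probe(int byteCount)
        {
            var psi = new ProcessStartInfo
            {
                FileName = "ruby",
                Arguments = $"-e \"print 'x' * {byteCount}\"",
                RedirectStandardOutput = true,
                UseShellExecute = false
            };

            using (var process = Process.Start(psi))
            {
                // Bounded wait so the probe itself cannot hang forever.
                bool exited = process.WaitForExit(5000);
                Console.WriteLine($"{byteCount} bytes: exited = {exited}");
                if (!exited)
                    process.Kill();  // clean up the child stuck writing to a full pipe
            }
        }

        static void Main()
        {
            Probe(3007);  // completed in the test above
            Probe(4373);  // blocked in the test above
        }
    }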

Really pathetic process handling from Microsoft IMO.

answered Sep 21 '22 by Marko Avlijaš