I have a web app which connects to a server using a TCP connection and reads a binary document which it then writes to its response object. In other words it's transferring a file from a backend server using a custom protocol and returning that file to its client through HTTP.
The server sends a status code and a MIME type, which I read successfully; it then writes the contents of the file and closes the socket. This part seems to work fine.
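Roughly, I read the header like this (a simplified sketch: the newline-delimited framing and the ReadAsciiLine helper are just illustrative, not the actual protocol):

private static string ReadAsciiLine(Stream stream)
{
    // Assumes header fields arrive as newline-terminated ASCII lines;
    // needs using System.IO and System.Text.
    StringBuilder sb = new StringBuilder();
    int b;
    while ((b = stream.ReadByte()) != -1 && b != '\n')
        sb.Append((char)b);
    return sb.ToString().TrimEnd('\r');
}

// During connection setup:
string status = ReadAsciiLine(stream_);
string mimeType = ReadAsciiLine(stream_);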
The client (a C# web app) reads the data like this:
private NetworkStream stream_;

public void WriteDocument(HttpResponse response)
{
    while (stream_.DataAvailable)
    {
        const int bufsize = 4 * 1024;
        byte[] buffer = new byte[bufsize];
        int nbytes = stream_.Read(buffer, 0, bufsize);
        if (nbytes > 0)
        {
            if (nbytes < bufsize)
                Array.Resize<byte>(ref buffer, nbytes);
            response.BinaryWrite(buffer);
        }
    }
    response.End();
}
This seems to always exit the read loop before all the data has arrived. What am I doing wrong?
The problem is the loop condition: DataAvailable only reports whether bytes are buffered right now, so it returns false whenever you outpace the network, even though more data is still on the way. The reliable end-of-stream signal is Read returning 0, which happens once the server has sent everything and closed the socket. I would copy to the response's OutputStream directly with a general-purpose function; working with the Stream also lets you control when to Flush.
public void WriteDocument(HttpResponse response)
{
    StreamCopy(response.OutputStream, stream_);
    response.End();
}

public static void StreamCopy(Stream dest, Stream src)
{
    byte[] buffer = new byte[4 * 1024];
    int n;
    // Read returns 0 only at end of stream, i.e. once the
    // server has finished sending and closed the socket.
    while ((n = src.Read(buffer, 0, buffer.Length)) > 0)
        dest.Write(buffer, 0, n);
    dest.Flush();
}
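If you are on .NET 4 or later, you can drop the helper entirely: Stream has a built-in CopyTo that performs the same buffered read loop.

public void WriteDocument(HttpResponse response)
{
    // CopyTo keeps reading until Read returns 0, just like StreamCopy above.
    stream_.CopyTo(response.OutputStream);
    response.OutputStream.Flush();
    response.End();
}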