Why socket reads 0 bytes when more was available

Tags: c#, .net, sockets

I discovered that the following code loops with 100% CPU usage:

byte[] buffer = new byte[0x10000];
while (true) {
    // Wait up to 5 seconds (5,000,000 µs) for the socket to become readable.
    if (socket.Poll (5000000, SelectMode.SelectRead) == false)
        continue;
    int available = socket.Available;
    // Assume that "readable but no buffered data" means the connection closed.
    if (available == 0)
        return;
    int read = socket.Receive (buffer);
    Console.WriteLine ("Read: " + read + " Available: " + available);
    /* ... */
}

The output is:

Read: 0 Available: 1
Read: 0 Available: 1
Read: 0 Available: 1
Read: 0 Available: 1
Read: 0 Available: 1
...

I was expecting the socket.Receive method to read that remaining byte, but it apparently doesn't, resulting in my code looping at 100% CPU.

As suggested by jgauffin, the documentation reads:

If the remote host shuts down the Socket connection with the Shutdown method, and all available data has been received, the Receive method will complete immediately and return zero bytes.

So reading 0 bytes is expected, but only after all data has been received, which socket.Available claims is not the case here.

The documentation for Socket.Available only mentions that a closed connection throws an exception.

How could I make sure that last byte is read?

Related: this is an answer describing how to detect a closed connection, which depends on socket.Available being 0 when there is no more data and the connection is closed.

asked May 03 '11 by hultqvist

1 Answer

Have you read the documentation?

A read of 0 bytes means that the remote endpoint has disconnected.

Either use blocking sockets or use the asynchronous methods like BeginReceive(); there is no need for Poll in .NET. A sketch follows below.
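For illustration, here is a minimal sketch of the asynchronous approach, assuming the socket is a connected System.Net.Sockets.Socket; the Receiver class and its method names are illustrative, not part of the original post:

using System;
using System.Net.Sockets;

class Receiver {
    byte[] buffer = new byte[0x10000];

    public void StartReceive (Socket socket) {
        // Queue an asynchronous read; HandleReceive runs when data
        // arrives or the remote host shuts down the connection.
        socket.BeginReceive (buffer, 0, buffer.Length, SocketFlags.None,
            HandleReceive, socket);
    }

    void HandleReceive (IAsyncResult ar) {
        Socket socket = (Socket) ar.AsyncState;
        int read = socket.EndReceive (ar);
        if (read == 0) {
            // 0 bytes from EndReceive means the remote host has shut
            // down its side of the connection; no Available check needed.
            socket.Close ();
            return;
        }
        Console.WriteLine ("Read: " + read);
        /* process buffer[0..read) here ... */
        StartReceive (socket); // queue the next read
    }
}

With a blocking socket the same rule applies: call socket.Receive in a loop and treat a return value of 0 as the end of the connection, dropping Poll and Available entirely.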

answered Oct 22 '22 by jgauffin