I'm using the .NET SerialPort class in C# to read bytes from a port. On receipt of a DataReceived event I check the serial port to see whether bytes are available to be read. However, even when bytes are available, the port can take over half a second to read a single byte. The code is roughly as follows:
...
while (Port.BytesToRead > 0)
{
    StopWatch.Restart();
    Int32 BytesRead = Port.Read(Read, 0, 1);   // Read is a pre-allocated byte[] buffer
    StopWatch.Stop();
    if (StopWatch.ElapsedMilliseconds > 100)
    {
        // Record the time. The stopwatch code
        // was only added after performance issues were observed.
    }
}
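For context, the loop above lives inside my DataReceived handler. Here's a minimal sketch of roughly how things are wired up (the port name, baud rate and the logging line are placeholders, not my real settings):

using System;
using System.Diagnostics;
using System.IO.Ports;

class Receiver
{
    static readonly SerialPort Port = new SerialPort("COM1", 9600);  // placeholder settings
    static readonly Stopwatch StopWatch = new Stopwatch();
    static readonly byte[] Read = new byte[1];                       // single-byte buffer

    static void Main()
    {
        Port.DataReceived += OnDataReceived;
        Port.Open();
        Console.ReadLine();                                          // keep the process alive
    }

    static void OnDataReceived(object sender, SerialDataReceivedEventArgs e)
    {
        while (Port.BytesToRead > 0)
        {
            StopWatch.Restart();
            Int32 BytesRead = Port.Read(Read, 0, 1);
            StopWatch.Stop();
            if (StopWatch.ElapsedMilliseconds > 100)
            {
                // Placeholder for the real logging
                Console.WriteLine($"Slow read: {StopWatch.ElapsedMilliseconds} ms");
            }
        }
    }
}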
Note that the time I've measured is not the time to read all of the bytes, but rather the time to read a single byte. Frequently I'll receive a DataReceived event and then have to wait 0.5 seconds for the first byte to be read.
I've actually tried setting the Port's ReadTimeout property to something smaller to prevent it from sitting there indefinitely, but this property seems to be ignored.
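For what it's worth, this is how I'd expect ReadTimeout to behave in isolation (the 50 ms value below is only illustrative); in my handler the configured timeout never seemed to take effect:

using System;
using System.IO.Ports;

class TimeoutExample
{
    static void Main()
    {
        using (var port = new SerialPort("COM1", 9600))   // placeholder port settings
        {
            port.ReadTimeout = 50;                        // Read should throw after ~50 ms with no data
            port.Open();

            var buffer = new byte[1];
            try
            {
                port.Read(buffer, 0, 1);
            }
            catch (TimeoutException)
            {
                Console.WriteLine("Read timed out as configured.");
            }
        }
    }
}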
Any help greatly appreciated.
Turns out that running connected to the debugger was causing the problem. Running outside of the debugger, the maximum time recorded to read a byte was around 20 ms, as opposed to up to 700 ms when running within it (with no breakpoints enabled, conditional or otherwise).
Bit of a red herring, as the real cause of the comms problem when running a release build probably lay elsewhere.