I understand from the MSDN docs that the event DataReceived will not necessarily fire once per byte.
But does anyone know what exactly is the mechanism that causes the event to fire?
Does the receipt of each byte restart a timer that has to reach, say, 10 ms without further bytes before the event fires?
I ask because I'm trying to write an app that reads XML data coming in from a serial port.
Because my laptop has no serial ports, I use a virtual serial port emulator. (I know, I know--I can't do anything about it ATM).
When I pass data through the emulated port to my app, the event fires once for each XML record (about 1500 bytes). Perfect. But when a colleague at another office tries it with two computers connected by an actual cable, the DataReceived event fires repeatedly, after every 10 or so bytes of XML, which totally throws off the app.
DataReceived can fire whenever one or more bytes are ready to read. Exactly when it fires depends on the OS and the serial driver, and there is also a small delay between the data arriving and the event being raised in .NET.
You shouldn't rely on the timing of DataReceived events for control flow.
Instead, parse the underlying protocol: if you haven't received a complete message yet, buffer what you have and wait for more. If you receive more than one message (or a trailing partial message), keep the leftover bytes after parsing the first message, because they are the start of the next one.
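Here is a minimal sketch of that approach, assuming the incoming data is text and each XML record ends with a known closing tag; the port name, baud rate, and the `</record>` delimiter are placeholders you would replace with your actual values:

```csharp
using System;
using System.IO.Ports;
using System.Text;

class XmlSerialReader
{
    private readonly SerialPort _port;
    private readonly StringBuilder _buffer = new StringBuilder();
    private const string EndTag = "</record>"; // hypothetical closing tag of each record

    public XmlSerialReader(string portName)
    {
        _port = new SerialPort(portName, 9600);
        _port.DataReceived += OnDataReceived;
        _port.Open();
    }

    private void OnDataReceived(object sender, SerialDataReceivedEventArgs e)
    {
        // Read whatever is currently available; this may be 1 byte or 1500.
        _buffer.Append(_port.ReadExisting());

        // Extract every complete record; any partial record stays in the buffer
        // and is completed by a later DataReceived event.
        int endIndex;
        while ((endIndex = _buffer.ToString().IndexOf(EndTag, StringComparison.Ordinal)) >= 0)
        {
            int recordLength = endIndex + EndTag.Length;
            string record = _buffer.ToString(0, recordLength);
            _buffer.Remove(0, recordLength);
            ProcessRecord(record);
        }
    }

    private void ProcessRecord(string xml)
    {
        Console.WriteLine("Got record: " + xml);
    }
}
```

With this structure it no longer matters whether the event fires once per 1500-byte record (as with your emulator) or every 10 bytes (as over the real cable); the parser only acts when a complete record is in the buffer.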