C# SerialPort - Problems mixing ports with different baud rates

I have two devices that I would like to connect over a serial interface, but they have incompatible connections. To get around this problem, I connected them both to my PC and I'm working on a C# program that will route traffic on COM port X to COM port Y and vice versa.

The program connects to two COM ports. In the data received event handler, I read in incoming data and write it to the other COM port. To do this, I have the following code:

    private void HandleDataReceived(SerialPort inPort, SerialPort outPort)
    {
        byte[] data = new byte[1];

        while (inPort.BytesToRead > 0)
        {
            // Read the data
            data[0] = (byte)inPort.ReadByte();

            // Write the data
            if (outPort.IsOpen)
            {
                outPort.Write(data, 0, 1);
            }
        }
    }
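
For context, the handler above is shared by both ports. A minimal sketch of how the two ports might be opened and wired up looks like the following; the port names and baud rates are hypothetical placeholders, not values from the original post:

    using System.IO.Ports;

    // Hypothetical port names and baud rates, for illustration only.
    SerialPort portX = new SerialPort("COM1", 115200);
    SerialPort portY = new SerialPort("COM2", 9600);

    // Route each port's incoming data to the other port.
    portX.DataReceived += (sender, e) => HandleDataReceived(portX, portY);
    portY.DataReceived += (sender, e) => HandleDataReceived(portY, portX);

    portX.Open();
    portY.Open();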

That code worked fine as long as the outgoing COM port operated at a higher baud rate than the incoming COM port. If the incoming COM port was faster than the outgoing COM port, I started missing data. I had to correct the code like this:

    private void HandleDataReceived(SerialPort inPort, SerialPort outPort)
    {
        byte[] data = new byte[1];

        while (inPort.BytesToRead > 0)
        {
            // Read the data
            data[0] = (byte)inPort.ReadByte();

            // Write the data
            if (outPort.IsOpen)
            {
                outPort.Write(data, 0, 1);
                while (outPort.BytesToWrite > 0);  //<-- Change to fix problem
            }
        }
    }

I don't understand why I need that fix. I'm new to C# (this is my first program), so I'm wondering if there is something I'm missing. SerialPort defaults to a 2048-byte write buffer, and my commands are fewer than ten bytes. The write buffer should be able to hold the data until it can be written out to the slower COM port.

In summary, I'm receiving data on COM X and writing the data to COM Y. COM X is connected at a faster baud rate than COM Y. Why doesn't the buffering in the write buffer handle this difference? Why does it seem that I need to wait for the write buffer to drain to avoid losing data?

Thanks!

* Update *

As noted, this code can very easily run into an overflow condition with large and/or fast incoming data transfers. I should have written more about my data stream. I'm expecting < 10 byte commands (with < 10 byte responses) at 10 Hz. In addition, I'm seeing failures on the first command.

So while I know this code does not scale and is less than optimal, I'm wondering why the 4K read and 2K write buffers couldn't even handle the first command. Is there a bug with writing a single byte of data, or something about the event handler that I don't understand? Thanks.

* Update *

Here's an example of the failure:

Let's say my command is four bytes: 0x01 0x02 0x03 0x04. The device on COM X sends the command. I can see the C# program receiving four bytes and sending them on to the device on COM Y. The device on COM Y receives only two bytes: 0x01 0x03. I know the device on COM Y is reliable, so I'm wondering how the two bytes were dropped.

By the way, can someone let me know if it's better to just reply to answers with comments or if I should keep editing the original question? Which is more helpful?

Asked by GrandAdmiral
1 Answer

What you are trying to do is equivalent to drinking from a fire hose. You are relying on the receive buffer to store the water; that isn't going to last long when nobody turns the tap off. With your workaround, you are making sure that the receive buffer will overflow, and it will do so silently since you probably didn't implement the ErrorReceived event.
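
If you want to see the overflow instead of guessing, you could hook that event. A short sketch, assuming the inPort variable from the question:

    // Sketch: surface receive errors instead of losing bytes silently.
    inPort.ErrorReceived += (sender, e) =>
    {
        // SerialError.Overrun - the UART's on-chip buffer was overrun.
        // SerialError.RXOver  - the driver's receive buffer overflowed.
        Console.WriteLine("Serial error: " + e.EventType);
    };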

To make this work, you'll have to tell the input device to stop sending when the buffer is full. Do that by setting the Handshake property. Try Handshake.RequestToSend first; if that doesn't help, try Handshake.XOnXOff. Whether the device honors the handshake signals properly depends on the device.
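
A sketch of what that configuration might look like, set before the ports are opened (reusing the hypothetical portX/portY names from earlier):

    // Sketch: enable flow control so the sender pauses when buffers fill up.
    // Try the hardware handshake lines (RTS/CTS) first...
    portX.Handshake = Handshake.RequestToSend;
    portY.Handshake = Handshake.RequestToSend;

    // ...and if a device ignores those lines, fall back to software flow control:
    // portX.Handshake = Handshake.XOnXOff;
    // portY.Handshake = Handshake.XOnXOff;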

Use the Read() method instead of ReadByte() to make this a bit more efficient.
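
For example, the loop in the question could drain whatever is available in one call rather than one byte at a time. A sketch:

    private void HandleDataReceived(SerialPort inPort, SerialPort outPort)
    {
        // Read the receive buffer in chunks instead of byte by byte.
        byte[] buffer = new byte[inPort.ReadBufferSize];

        while (inPort.BytesToRead > 0)
        {
            int count = inPort.Read(buffer, 0, buffer.Length);

            if (outPort.IsOpen)
            {
                outPort.Write(buffer, 0, count);
            }
        }
    }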


Okay, not a fire hose then. I can think of only one other possibility. A common problem with early UART chip designs was an on-chip receive buffer that could store only one byte, which required the interrupt service routine to read that byte before the next one arrived. If the ISR isn't quick enough, the chip turns on the SerialError.Overrun state and the byte is irretrievably lost.

A workaround for this issue was to artificially insert a delay between each transmitted byte, giving the ISR in the device more time to read the byte, which is what your workaround code does as a side effect.
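
Spelled out explicitly, that side effect amounts to something like the following; bytesToForward is a hypothetical buffer of bytes to send, and the one-millisecond pause is an illustrative guess, not a measured value:

    // Sketch: pace the output one byte at a time so the receiving device's
    // ISR has time to drain its one-byte buffer.
    foreach (byte b in bytesToForward)
    {
        outPort.Write(new byte[] { b }, 0, 1);
        System.Threading.Thread.Sleep(1);  // illustrative inter-character delay
    }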

It is not a great explanation; modern chip designs have a FIFO buffer that's at least 8 bytes deep. If there is any truth to this at all, you should see the problem disappear when you lower the baud rate. Also, using Read() instead of ReadByte() should make the problem worse, since your Write() call can now transmit more than one byte at a time, eliminating the inter-character delay. To be clear, I'm talking about the output device.

Answered by Hans Passant